Strides in Development of Medical Education

Document Type: Review

Authors

1 Evidence Based Medicine Center, Hormozgan University of Medical Sciences, Bandar Abbas, Iran

2 Clinical Research Development Center of Shahid Mohammadi Hospital, Hormozgan University of Medical Sciences, Bandar Abbas, Iran

3 Faculty of Medicine, Hormozgan University of Medical Sciences, Bandar Abbas, Iran

4 Student Research Committee, Faculty of Nursing and Midwifery, Hormozgan University of Medical Sciences, Bandar Abbas, Iran

5 Faculty of Dentistry, Hormozgan University of Medical Sciences, Bandar Abbas, Iran

6 Department of Medical Education, School of Medicine, Hormozgan University of Medical Sciences, Bandar Abbas, Iran

Abstract

Background: With the onset of the COVID-19 pandemic, universities were compelled to transition from in-person to online instruction. This transition, however, presented a plethora of obstacles, necessitating innovative solutions to guarantee that all students have access to effective and inclusive learning experiences.
Objectives: We conducted a review investigating the innovative E-learning methods in health sciences education.
Methods: This study was designed as a systematic review. We searched PubMed, Scopus, ERIC, Google Scholar, and the Iranian Scientific Information Database from the start of the pandemic to September 2021. Papers meeting our inclusion criteria were reviewed for data extraction. Screening of articles, data extraction, and risk of bias assessment were performed in duplicate by independent paired reviewers. Our primary objective was to provide a comprehensive overview of emerging E-learning methods and to assess their effectiveness against eleven commonly used success factors of online learning. As a secondary outcome, we employed Kirkpatrick’s four levels of learning to assess the quality of online learning.
Results: A total of 6,492 papers were identified, of which thirty were included. The majority of studies (n = 25) reported a transition of previous methods to online formats, while five described innovative methods. The included studies were classified as having moderate to high risk of bias. Among the success factors, 'use of suitable assignments' was the least addressed (reported in only 37.5% of studies), while 'optimal quality of media' emerged as the most consistently used component (67%). All studies assessed Kirkpatrick’s level 1 and 76% assessed level 2; none described levels 3 or 4.
Conclusion: E-learning can serve as an alternative approach for health education. Future E-learning research should incorporate randomized designs and adhere to the principles of the Kirkpatrick model, with a particular focus on levels 3 and 4, to develop more effective, evidence-based systems.

Keywords

Background

The COVID-19 pandemic has emerged as a substantial global hazard, causing numerous disruptions, including the closure of educational institutions. Despite the many obstacles associated with this precipitous transition, educational institutions were compelled to shift rapidly from traditional learning methods to e-learning (electronic learning) platforms in order to mitigate the virus's spread (1, 2). Although the pandemic's effects were felt in all nations, underdeveloped nations were expected to be more severely affected because of their lack of proper infrastructure, technology, and medical facilities (3). Notably, China, Europe, Iran, South Korea, and the United States were among the places first severely affected by significant epidemics (4, 5). The provision and use of online learning materials during the COVID-19 pandemic became a central challenge for many universities (1). E-learning profoundly affected the education sector, transforming the entire system and emerging as a significant topic of interest among academics (2). As universities shut down, the role of information and communication technologies (ICTs) became more prominent. ICTs provide unique opportunities for educational advancement, offering potential improvements in teaching and learning and fostering innovation and creativity in E-learning (3). Another definition describes E-learning as the systematic delivery, organization, and administration of online learning activities such as student enrollment, tests, assignments, course descriptions, and lesson plans (6). E-learning systems became a crucial source of information owing to their accessibility (available anytime and anywhere), affordability, user-friendliness, and interactive nature (1). For individuals living far from universities, E-learning offers the advantage of saving time and effort, making it a preferred option for many learners (4). Moreover, many users of E-learning platforms believe that online learning simplifies course management, making it easier for learners to access both teachers and teaching materials (5).

Nonetheless, a notable disadvantage of E-learning is the lack of interaction, both between students and instructors and among peers (7). Furthermore, issues such as inadequate internet access, low ICT proficiency, and insufficient content creation have posed significant hurdles for universities, particularly in developing countries. This study aimed to provide a summary of education via E-learning after the onset of COVID-19, focusing on the recently adopted methods and comparing and assessing their quality using two measures: first, eleven commonly used success factors of E-learning, and second, Kirkpatrick’s four levels of learning.

Objectives

We conducted a review investigating the innovative E-learning methods in health sciences education.

Methods

Study design: This study was designed as a systematic review to summarize the innovations and quality of post-COVID-19 E-learning in health education. The ethics committee of Hormozgan University of Medical Sciences registered and endorsed our proposal (IR.HUMS.REC.1401.070). The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) reporting guideline was followed (8). We proceeded through distinct stages: designing the PICO, identifying inclusion and exclusion criteria, developing search strategies, screening, developing a standardized pilot data extraction form, extracting article data, assessing quality, coding data, and summarizing the results (9).

Searching databases: A comprehensive search was conducted in PubMed, Scopus, ERIC, Google Scholar, and the Iranian Scientific Information Database. We searched the databases using a combination of keywords organized into three concept groups: E-learning ("distance learning" OR "E-learning"), health sciences education ("medical education" OR "nursing education"), and COVID-19 ("SARS-CoV-2" OR "coronavirus"). Table A1 in Appendix 1 shows the full PubMed search query. We limited the search period to after the beginning of the pandemic because of the heterogeneity of pre-pandemic studies and because we aimed to assess and compare recent methods in the new era of health education. The search was performed in April 2020 and updated in September 2021. Search results were recorded and deduplicated using EndNote X7.
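For illustration only, the three concept groups can be combined with Boolean operators along the following lines (a simplified sketch; the exact PubMed strategy, including any controlled vocabulary and field tags, is the one reported in Table A1 in Appendix 1):

("E-learning" OR "distance learning") AND ("medical education" OR "nursing education") AND ("COVID-19" OR "SARS-CoV-2" OR "coronavirus")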

Eligibility criteria and study selection: Inclusion criteria: 1) controlled and uncontrolled interventional studies, including randomized controlled trials (RCTs), non-randomized controlled trials (NRCTs), and before-after designs; 2) studies with an educational intervention for E-learning or virtual learning during physical distancing; 3) sampling from health sciences students.

Exclusion criteria: Studies were excluded if the sample was drawn from medical residency programs or if the study type was observational, case report, case series, review, editorial, or commentary. However, editorials or commentaries containing original data were included.

The titles, abstracts, and full texts of potentially eligible studies were evaluated by pairs of independent reviewers in accordance with the inclusion and exclusion criteria (Figure 1). As a selection criterion for potentially eligible studies, we implemented the PICOTS framework (10) (Figure A1 in Appendix 1).

Data extraction: A standardized pilot data extraction form was designed by a medical education specialist. Full-text data extraction was performed by teams of paired reviewers.

All conflicts were resolved by consensus or, when necessary, by the verdict of a third reviewer.

Risk of bias assessment: To evaluate the risk of bias in uncontrolled interventional studies, we employed a questionnaire comprising five items: adequate sample size; clearly defined study population (inclusion and exclusion criteria); methods for controlling confounding factors; appropriate statistical analyses, including parametric and non-parametric tests (e.g., t-test and chi-squared test); and explicit methods for measuring exposure and outcomes. Most of the interventional studies were conducted as pre-post designs or interventions without a control group, leading to a potentially high risk of bias. For randomized controlled trials, we used the Cochrane RoB 2.0 assessment tool (11) (see Figure A2 in Appendix 1 for the complete description).

Data synthesis: The primary authors conducted the data synthesis and engaged in discussions with the research team and a medical education specialist (F.K.). A meta-analysis was not possible due to the significant variation in instructional design, clinical, population, comparator, and methodological aspects among the identified studies. As a result, the review's conclusions are summarized in a narrative report.

Outcomes: Our primary outcome was to summarize new E-learning methods and assess their success factors. To compare and assess the methods and models of E-learning and to unify the diversity reported in the included studies, we used eleven items based on homologous qualities described in articles using different frameworks (7-15). These items were as follows: 1) optimal quality of media factors; 2) appropriate evaluation strategies; 3) provision of appropriate assignments; 4) useful, relevant, and up-to-date content; 5) optimal interaction; 6) student satisfaction; 7) educator competence; 8) system training and clear instructions for using E-learning components; 9) constructive answers and feedback to students; 10) availability; and 11) ease of use (technical difficulties). A detailed description of each item is given in Table A3 in Appendix 1.

Our secondary outcome was to examine the levels of learning; for this purpose, we used the Kirkpatrick evaluation model. The model is composed of four levels, with the upper levels demonstrating a more profound level of learning. The immediate reaction and satisfaction of students are measured at level 1, while learning and skill development are assessed through exams at level 2. Behavior changes in real-world tasks are typically assessed through observation at level 3, and productivity and business metrics are employed to evaluate results at level 4.

Results

Study selection: The initial search identified 6,492 papers in the electronic databases. After eliminating duplicates, 3,506 titles and abstracts were screened. Of the 81 papers selected for full-text screening, 30 met the inclusion criteria. The PRISMA flowchart is shown in Figure 1. Of the thirty included studies, one was an RCT, thirteen were NRCTs, and sixteen were uncontrolled before-after interventional studies. Table A2 in Appendix 1 lists the main results of the thirty articles examined.

Risk of bias in studies: Among the 30 articles, 6 had a high risk of bias. To reduce potential bias, we excluded this high-risk group from the evaluation of E-learning success factors. The quality of the remaining 24 articles was mostly rated as moderate, with a mean score of 10.92; scores ranged from 7 to 15 points out of a possible 16 (the visualized risk of bias assessment is provided in Figure A2 in Appendix 1). Only one of the examined studies was designed as an RCT, Suppan et al. (12), which was rated as raising some concerns for risk of bias.

Geographic origin of studies: The geographic spread of the included studies was as follows: eleven studies (37%) were conducted in North America, four (13%) in Eastern Asia, three (10%) in Western Asia, and seven (23%) in Europe. Most of the articles originated from affluent regions (Figure A3 in Appendix 1).

Characteristics of participants: A total of 22,373 participants were included. Individual studies enrolled between 16 and 19,050 students. Five studies (17%) did not report the participant count. All studies involved undergraduate students. Twenty-one studies enrolled medical students, accounting for 21,801 participants (97% of those counted), and two studies enrolled 252 nursing students. Table A2 in Appendix 1 contains the baseline information of the included studies, and Figure A4 in Appendix 1 shows the prevalence of each educational field.

The studies covered a variety of content areas and medical specialties: five studies (17%) focused on basic sciences (e.g., anatomy and pathology), one (3%) on medical ethics, and two (7%) on clinical skills (e.g., communication skills and physical examination skills). In two studies (3%), virtual training was conducted for all medical lectures during the pandemic (Table A2 in Appendix 1). Eleven studies (57%) used online platforms to deliver content, three studies (9%) employed offline methods, and eleven studies (34%) used a combination of online and offline content (Figure A5 in Appendix 1).

Educational methods and innovations: Among the online education methods described in the studies, most adapted traditional face-to-face approaches to online or offline formats using various platforms or applications. However, some studies introduced innovative educational strategies developed in response to the pandemic. For instance, Suppan et al. evaluated the impact of a gamified E-learning module on knowledge of personal protective equipment (PPE) among paramedic students (12), and Kang et al. investigated the learning effects of virtual reality simulation (V-Sim) on nursing students caring for children with asthma (13). Table 1 explains the different methods applied in each study, and Figure A6 in Appendix 1 shows the prevalence of each method. The most widely used platform was Zoom (34%). We outlined the features of each platform to assist in selecting appropriate educational methods (Table 1 and Table A4 in Appendix 1).

Evaluation of the success factors of E-learning: We evaluated the presence of eleven success factors frequently employed in online education across the included studies (Figure 2). 'Optimal quality of the E-learning system and utilization of desirable media factors' was reported in nearly 80% of the studies, making it the most prevalent factor. 'System training and clear instructions for using E-learning' was incorporated in 42% of the studies. 'Appropriate assignments' were provided in only 37.5% of the studies, while an equal proportion did not address this factor at all, making it the least reported component.

Sixty-seven percent of studies incorporated 'Optimal evaluation strategies and their relevance to the content'. Various examination methods used to assess students are detailed in Table 1.

We compared the included articles based on the eleven online education success factors identified in this study. Overall, our results indicate that the study by Michener et al. (14) achieved the highest educational quality.

Kirkpatrick's levels of evaluation in studies: The evaluations reported in the included studies (N = 24) corresponded to level 1 (reaction/satisfaction) and level 2 (learning, based on self-reports and quizzes). No study provided information on level 3 (behavior) or level 4 (results) of Kirkpatrick’s pyramid (Table 1 and Figure 3).

To assess participant reactions, various methods were employed across the studies. Most studies used surveys with a 4- or 5-point Likert scale, along with pre- and post-class multiple-choice questions (MCQs) and post-class quizzes. Other methods were as follows (13): the confidence-in-practice tool developed by Kim (15); the Student Evaluation of Educational Quality (SEEQ) survey, which uses a five-point Likert scale (16); interactive patient case management using EvaSys software; open-book examinations; a Canvas discussion board to gather real-time feedback throughout the course; and objective structured clinical examination (OSCE) stations (Table 1).

Discussion

We identified 30 studies on electronic educational interventions in this systematic review. In the aftermath of the COVID-19 pandemic, online education has emerged as a prospective new approach that is highly regarded by medical education experts. Our objective was to offer a summary and comparison of evidence-based innovative E-learning methods in order to fill the gaps and provide guidance for the future.

Educational methods and innovations: "Live didactic" sessions and "case-based activities" were the most commonly used methods, along with group activities and problem-based learning sessions. Lectures were shared both online and offline, with synchronous E-learning preferred over asynchronous methods to provide better control and motivation for students. The transition to the 'new normal' brings challenges such as video-conferencing fatigue, suboptimal social interaction, and a high educational load for faculty (17). Virtual simulation has been used as an innovative teaching method, and AMEE 2024 also highlighted its impact as an emerging teaching approach (18). Artificial intelligence is increasingly transforming education by enabling personalized, adaptive learning experiences and streamlining content creation. These advancements offer real-time feedback and efficient, self-directed learning. However, challenges persist, including data privacy concerns, algorithmic bias, unequal access to technology, and the absence of comprehensive legal frameworks. Addressing these issues is crucial to ensuring that AI contributes positively to the future of education (19).

Evaluation of the success factors of E-learning: The primary purpose of this research was to delineate effective approaches to E-learning. To do so, we created a quality assessment sheet grounded in several success factors of E-learning. Ehlers et al. classified the success variables into seven frameworks: institutional, technical, instructional design, pedagogical, student support, faculty support, and evaluation (20). The provision of educational assistance is crucial for the success of E-learning (21). The dominant trend appears to be a move from text to graphics or video and the use of 'richer' media in the design (22). Assignments received the least attention among the reviewed papers. According to Alqahtani et al., students should understand their responsibilities during social distancing, develop distinct attitudes, and find self-motivation for academic achievement (21). Although many factors for effective E-learning were identified, there is still a lack of standardized, high-quality questionnaires for assessing E-learning quality. Having a reference tool for E-learning quality assessment would help universities and lecturers identify and address deficiencies. Sinclair et al. found that differences in intervention design and evaluation methods prevented generalizable conclusions about the effectiveness of E-learning on health worker behavior (23). High-income countries produced the highest number of articles, which may bias the rated quality of E-learning outcomes. Inadequate infrastructure, such as bandwidth limitations, makes it hard to implement E-learning properly in low- and middle-income countries (LMICs) (24). Despite the many advantages of E-learning, ongoing limitations prevent it from reaching its full potential in LMICs (9).

Assessment of learning: The assessment of participants in E-learning environments underwent significant transformation during the pandemic. Assessment methods should not only test students' knowledge but also evaluate their engagement, satisfaction, and overall learning outcomes in this new educational paradigm. One of the most widely used assessment methods in post-pandemic E-learning is the survey, particularly those using 4- or 5-point Likert scales. Likert scales help instructors statistically evaluate students' feedback on course components such as organization, clarity, and delivery (25). Open-book examinations (OBEs) were used in some studies, since the OBE format improves the learning environment and often replaces surface learning with deep learning (26). However, recent studies conducted during the pandemic indicated that the average scores achieved by students in open-book examinations are considerably higher than those obtained in closed-book assessments (26, 27). Throughout the pandemic, online platforms emerged as viable substitutes for content delivery and learning evaluation. Learning management systems (LMSs), such as Moodle and Canvas, enhance course delivery and evaluation, allowing instructors to design quizzes and monitor student progress (28).

The shift to online assessment ensured continuity in education and offered flexibility. However, challenges remained: on the faculty side, the primary issue was insufficient training in online assessment methods (29), while on the students' side there were concerns about dishonesty and misconduct. On another note, the compulsory establishment of online learning infrastructure improved the quality of university education in the post-COVID-19 era (30).

Kirkpatrick model: In most articles, interventions were evaluated at the first and second levels of Kirkpatrick’s pyramid (reaction and/or learning), whereas the third and fourth levels were not measured in any of the articles. The rapid transition to E-learning, along with its inherent limitations, posed significant obstacles (31). Levels 3 and 4 are crucial because they help organizations target job-critical skills, although their value is sometimes questioned. Kirkpatrick et al. noted that no final results from a training program can be expected unless positive behavior change occurs (32). A qualitative study by Badri et al. revealed that the crisis caused by the COVID-19 pandemic significantly impaired the ability of new general physicians to achieve the desired competence, as many completed parts of their clinical training during that period (29). Therefore, more emphasis needs to be placed on the quality of operations, teaching content, and pedagogic and technical support services for E-learning.

Risk of bias: We utilized a checklist of seven elements to evaluate the quality of all non-RCT studies. Six studies were rated as having a high risk of bias, and the majority of the remaining articles exhibited a moderate risk of bias. Fatani et al. (16) demonstrated the lowest risk of bias (12). Time and resource restrictions caused by the pandemic compounded the situation (17). The single included RCT was evaluated using the Cochrane RoB 2.0 tool and had moderate quality of evidence. There is a scarcity of high-quality RCTs in education research, and this lack of evidence makes it difficult for decision-makers to reach sound judgments.

Limitations: Our review is primarily descriptive and focused on English- and Persian-language studies, so it may have missed important insights from other languages. The review was also limited to a specific time period and may not fully reflect the changing landscape of E-learning in health sciences education. Gray literature can provide useful information, but our review did not include these sources. The included studies showed considerable heterogeneity in methodologies and contexts; consequently, conducting a meta-analysis was not feasible. We strongly recommend designing consistent, quantitative, and standardized assessment models for online education to support meta-analysis and precise decision-making.

Conclusion

Our findings indicate that E-learning can be a viable alternative to face-to-face learning, and several factors can be modified to enhance learning. We suggest designing a comprehensive framework to address the unique aspects of E-learning. Future E-learning research should use randomized trial designs and place greater emphasis on levels 3 and 4 of the Kirkpatrick model to develop more effective, evidence-based systems.

  1. Maatuk AM, Elberkawi EK, Aljawarneh S, Rashaideh H, Alharbi H. The COVID-19 pandemic and E-learning: challenges and opportunities from the perspective of students and instructors. J Comput High Educ. 2022;34(1):21-38. doi: 10.1007/s12528-021-09274-2. [PMID: 33967563] [PMCID: PMC8091987]
  2. Bacher-Hicks A, Goodman J, Mulhern C. Inequality in household adaptation to schooling shocks: Covid-induced online learning engagement in real time. J Public Econ. 2021 Jan;193:104345. doi: 10.1016/j.jpubeco.2020.104345. [PMID: 34629567] [PMCID: PMC8486492]
  3. Blundell R, Costa Dias M, Joyce R, Xu X. COVID‐19 and Inequalities. Fisc Stud. 2020 Jun;41(2):291-319. doi: 10.1111/1475-5890.12232. [PMID: 32836542] [PMCID: PMC7362053]
  4. Sahu P. Closure of universities due to coronavirus disease 2019 (COVID-19): impact on education and mental health of students and academic staff. Cureus. 2020 Apr 4;12(4):e7541. doi: 10.7759/cureus.7541. [PMID: 32377489] [PMCID: PMC7198094]
  5. Mseleku Z. A literature review of E-learning and E-teaching in the era of Covid-19 pandemic. California, USA: Sage; 2020.
  6. Haghshenas M. A Model for Utilizing Social Softwares in Learning Management System of E-Learning. Quarterly of Iranian Distance Education Journal. 2019;1(4):25-38. doi: 10.30473/idej.2019.6124.
  7. Somayeh M, Dehghani M, Mozaffari F, Ghasemnegad SM, Hakimi H, Samaneh B. The effectiveness of E-learning in learning: A review of the literature. International Journal of Medical Research & Health Sciences. 2016;5(2):86-91.
  8. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Rev Esp Cardiol (Engl Ed). 2021 Sep;74(9):790-799. doi: 10.1016/j.rec.2021.07.010. [PMID: 34446261]
  9. Barteit S, Guzek D, Jahn A, Bärnighausen T, Jorge MM, Neuhann F. Evaluation of e-learning for medical education in low- and middle-income countries: A systematic review. Comput Educ. 2020 Feb;145:103726. doi: 10.1016/j.compedu.2019.103726. [PMID: 32565611] [PMCID: PMC7291921]
  10. Clephas PRD, Hoeks SE, Trivella M, Guay CS, Singh PM, Klimek M, et al. Prognostic factors for chronic post-surgical pain after lung or pleural surgery: a protocol for a systematic review and meta-analysis. BMJ Open. 2021 Jun 15;11(6):e051554. doi: 10.1136/bmjopen-2021-051554. [PMID: 34130966] [PMCID: PMC8207993]
  11. Eldridge S, Campbell M, Campbell M, Dahota A, Giraudeau B, Higgins J, et al. Revised Cochrane risk of bias tool for randomized trials (RoB 2.0): additional considerations for cluster-randomized trials. Cochrane Methods Cochrane Database Syst Rev. 2016;10(suppl 1): 1-17.
  12. Suppan L, Stuby L, Gartner B, Larribau R, Iten A, Abbas M, et al. Impact of an e-learning module on personal protective equipment knowledge in student paramedics: a randomized controlled trial. Antimicrob Resist Infect Control. 2020 Nov 10;9(1):185. doi: 10.1186/s13756-020-00849-9.
  13. Kang K-A, Kim S-J, Lee M-N, Kim M, Kim S. Comparison of learning effects of virtual reality simulation on nursing students caring for children with asthma. Int J Environ Res Public Health. 2020 Nov 13;17(22):8417. doi: 10.3390/ijerph17228417. [PMID: 33202996] [PMCID: PMC7696217]
  14. Michener A, Fessler E, Gonzalez M, Miller RK. The 5 Mʼs and More: A New Geriatric Medical Student Virtual Curriculum During the COVID‐19 Pandemic. J Am Geriatr Soc. 2020 Nov;68(11):E61-E63. doi: 10.1111/jgs.16855. [PMID: 32955724] [PMCID: PMC7537043]
  15. Kim YH, Jang KS. Effect of a simulation-based education on cardio-pulmonary emergency care knowledge, clinical performance ability and problem solving process in new nurses. J Korean Acad Nurs. 2011 Apr;41(2):245-55. doi: 10.4040/jkan.2011.41.2.245. [PMID: 21551996]
  16. Fatani TH. Student satisfaction with videoconferencing teaching quality during the COVID-19 pandemic. BMC Med Educ. 2020 Oct 31;20(1):396. doi: 10.1186/s12909-020-02310-2. [PMID: 33129295] [PMCID: PMC7602774]
  17. Stojan J, Haas M, Thammasitboon S, Lander L, Evans S, Pawlik C, et al. Online learning developments in undergraduate medical education in response to the COVID-19 pandemic: A BEME systematic review: BEME Guide No. 69. Med Teach. 2022 Feb;44(2):109-29. doi: 10.1080/0142159X.2021.1992373. [PMID: 34709949]
  18. Keshavarzi MH, Hashemi A, Norouzi A, Ramezani G. Twelve Tips to Improve the Quality of Medical Education, the Experience of AMEE Live 2024 Conference. Strides Dev Med Educ. 2025;22(1):e1471. doi: 10.22062/sdme.2025.200537.1471.
  19. Keshavarzi MH, Ramezani G, Shafian S. Challenges and Opportunities of Artificial Intelligence (AI) in Teaching-Learning Process. Strides Dev Med Educ. 2025;22(1):e1457. doi: 10.22062/sdme.2024.200449.1457.
  20. Ehlers U-D. Quality in e-learning from a learner's perspective. European Journal of Open, Distance and E-learning. 2004;7(1): 1-7.
  21. Alqahtani AY, Rajkhan AA. E-learning critical success factors during the COVID-19 pandemic: A comprehensive analysis of e-learning managerial perspectives. Education Sciences. 2020;10(9):216. doi: 10.3390/educsci10090216.
  22. Masoumi D, Lindström B. Quality in e‐learning: a framework for promoting and assuring quality in virtual institutions. Journal of Computer Assisted Learning. 2012;28(1):27-41. doi: 10.1111/j.1365-2729.2011.00440.x.
  23. Sinclair PM, Kable A, Levett-Jones T, Booth D. The effectiveness of Internet-based e-learning on clinician behaviour and patient outcomes: a systematic review. Int J Nurs Stud. 2016 May;57:70-81. doi: 10.1016/j.ijnurstu.2016.01.011. [PMID: 27045566]
  24. Frehywot S, Vovides Y, Talib Z, Mikhail N, Ross H, Wohltjen H, et al. E-learning in medical education in resource constrained low- and middle-income countries. Hum Resour Health. 2013 Feb 4;11:4. doi: 10.1186/1478-4491-11-4. [PMID: 23379467] [PMCID: PMC3584907]
  25. Dawson K. Teacher inquiry: A vehicle to merge prospective teachers’ experience and reflection during curriculum-based, technology-enhanced field experiences. Journal of Research on Technology in Education. 2006;38(3):265-92. doi: 10.1080/15391523.2006.10782460.
  26. Ashri D, Sahoo BP. Open book examination and higher education during COVID-19: Case of University of Delhi. Journal of Educational Technology Systems. 2021;50(1):73-86. doi: 10.1177/0047239521013783.
  27. Eurboonyanun C, Wittayapairoch J, Aphinives P, Petrusa E, Gee DW, Phitayakorn R. Adaptation to open-book online examination during the COVID-19 pandemic. J Surg Educ. 2021 May-Jun;78(3):737-739. doi: 10.1016/j.jsurg.2020.08.046. [PMID: 33011103] [PMCID: PMC7467022]
  28. Pardim VI, Contreras Pinochet LH, Viana ABN, Souza CAd. Where is the student who was here? Gamification as a strategy to engage students. The International Journal of Information and Learning Technology. 2023;40(2):177-92. doi: 10.1108/IJILT-05-2022-0122.
  29. Badri B, Azemian A, Mokhtari Zanjani M, Shafian S, Yazdankhafard M. How Did the COVID-19 Pandemic Affect the Clinical Skills and Competencies Among New General Physicians: A Qualitative Study. Strides Dev Med Educ. 2025;22(1):e1462. doi: 10.22062/sdme.2025.200457.1462.
  30. Mohammadi A, Norouzadeh R, Hosseini SE, Mohammadi M, Tahrekhani M. Investigating the Effectiveness of Online Learning in the Post-Coronavirus (COVID-19) Era from the Perspectives of Students and Faculty Members of Selected Universities of Medical Sciences in Iran. Strides Dev Med Educ. 2025;22(1):e1447. doi: 10.22062/sdme.2025.200378.1447.
  31. Krawiec C, Myers A. Remote assessment of video-recorded oral presentations centered on a virtual case-based module: a COVID-19 feasibility study. Cureus. 2020 Jun 20;12(6):e8726. doi: 10.7759/cureus.8726. [PMID: 32699721] [PMCID: PMC7372193]
  32. Kirkpatrick DL. Another look at evaluating training programs: Fifty articles from training & development and technical training: Magazines cover the essentials of evaluation and return-on-investment. Alexandria, VA: American Society for Training & Development; 1998.