Strides in Development of Medical Education

Document Type : Original Article

Authors

1 Master of Educational Technology in Medical Sciences, Faculty of Medicine, Tehran University of Medical Sciences, Tehran, Iran

2 Professor of Department Health Information Technology, School of Allied Medical Sciences, Nursing and Midwifery Care Research Center, Tehran University of Medical Sciences, Tehran, Iran

3 Assistant Professor, Faculty of Educational Sciences and Psychology, Shahid Beheshti University, Tehran, Iran

10.22062/sdme.2025.199547.1347

Abstract

Background: With its rapid technological growth, e-learning has become one of the most common educational approaches in educational institutions.
Objectives: This study was conducted with the aim of psychometrically evaluating the Online Student Engagement Scale (OSE) in Persian to assess the level of engagement in online courses.
Methods: In this descriptive cross-sectional study, 125 online students completed the Persian version of the OSE. The construct validity of the OSE was examined using exploratory factor analysis (EFA) and Pearson correlations. Cronbach’s alpha was calculated to determine internal consistency. Confirmatory factor analysis (CFA) was performed using Smart PLS 3 software to analyze the confirmatory factor structure of the instrument. The Fornell–Larcker criterion was used to check divergent validity.
Results: Factor analysis identified four factors of learning engagement (social, behavioral, cognitive, and affective). Reliability was confirmed with an overall Cronbach's alpha of 0.923. The factor structure was supported by the fit indices (Bartlett’s χ² = 2316.824, p < 0.0001; SRMR = 0.166; dULS = 29.682; RMS Theta = 0.231). Construct validity and reliability were confirmed, with composite reliability values above 0.7.
Conclusion: The Persian version of the OSE appears to be a reliable and potentially valid instrument for use among Iranian online students and may be suitable for evaluating e-learning engagement.

Keywords

Background

Effective e-learning can lead to deeper learning, higher achievement, and higher-order thinking abilities because it allows students to be actively engaged in learning anytime and anywhere (1, 2). Student engagement and participation are defined as the level of effort and interaction with learning resources, sustained over time, that develops learning outcomes and the learning experience. The National Survey of Student Engagement (NSSE) measures four factors: academic challenges, learning with peers, interaction with the institution, and supportive learning environments (3). The Student Engagement Instrument (SEI) measures affective and cognitive factors, and the Motivated Strategies for Learning Questionnaire (MSLQ) measures engagement based on cognitive strategies (4).

Early studies defined student participation and engagement as a single behavioral dimension. Mosher emphasized the behavioral characteristics of engagement and defined it as "an attitude toward a learning program or cooperative behavior" (5). However, such definitions omit other dimensions, such as the cognitive side of learning and the psychological state of the learner (6). Currently, there are different definitions of student engagement. Lewis et al. defined engagement as "the extent to which students' thoughts, feelings, and activities are actively engaged in learning" (7). These shifts mean that the definition of student engagement has expanded from behavioral aspects to psychological and cognitive aspects, while its scope ranges from in-curriculum learning activities (e.g., learning time, effort, and strategy) to extra-curricular activities (e.g., club and volunteer activities).

As the definitions above show, student engagement comprises both behavioral and emotional dimensions. Burch (8) used exploratory factor analysis to design a questionnaire measuring students' engagement with factors of physical engagement, motivational engagement, cognitive engagement in class, cognitive engagement outside of class, resistance, and learning; deep engagement was the outcome of these activities. Another tool for studying student engagement is the Student Engagement Questionnaire, which is part of the Australian student engagement survey. In a study titled "A survey to measure the perspective of students' engagement in an active learning classroom," Choi and Lee (9) prepared a questionnaire with three components: activity value, personal effort, and teacher's help.

Existing instruments are limited in that many of their concepts are specific to face-to-face environments and do not transfer directly to e-learning. To reflect the characteristics of the e-learning environment, Lee, Song, and Hong developed the Online Student Engagement Scale (OSE) in 2019 in Seoul, South Korea (10). In the present study, a transcultural adaptation and psychometric evaluation of this scale was carried out with 125 students engaged in online studies at Tehran University of Medical Sciences.

It should be noted that the validity and reliability of an instrument in one language do not guarantee that these characteristics will be preserved when translated into other languages due to cultural, linguistic, and geographical differences. When using a tool in a context different from where it was developed, these factors must be considered. This study aimed to adapt the Online Student Engagement Scale (OSE) into Persian and examine its psychometric properties.

Objectives

This study was conducted with the aim of psychometrically evaluating the Online Student Engagement Scale (OSE) in Persian to assess the level of engagement in online courses.

Methods

Study Design and Setting: This psychometric study was conducted at Tehran University of Medical Sciences from February 2023 to January 2024. The inclusion criteria were having studied online for at least two semesters at Tehran University of Medical Sciences and willingness to participate in the study. Participants who left more than 7.4% of the questionnaire unanswered were excluded.

Participants and Sampling: All participants were informed about the study objective, and it was clarified that their responses would remain anonymous. The participants included two bilingual translators proficient in Persian, one native English speaker with a strong grasp of Persian (during the translation phase), six experts in educational sciences and psychology familiar with psychometric procedures (for assessing content validity), and 125 students selected through convenience sampling, who completed the OSE online. The questionnaire was distributed to several instructors teaching online courses, who passed it on to students meeting the inclusion criteria (at least two semesters of online study and willingness to participate). First-semester students were excluded because of their limited exposure to the university environment. Of the 135 questionnaires received, 125 were eligible for use.

Tools/Instruments

Online Student Engagement: To assess learners' engagement in online courses, the OSE questionnaire was utilized, measuring four types of engagement: Social, behavioral, cognitive, and affective. Each item was rated on a 5-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = moderately agree, 4 = agree, 5 = strongly agree), with higher scores indicating increased engagement. The translation and transcultural adaptation were carried out, followed by determining the questionnaire’s validity and reliability.

Translation Technique and Transcultural Adaptation: The Korean OSE questionnaire, originally developed by Lee, Song, and Hong (10), was translated into Persian using the standard forward–backward technique. To ensure a common psychological meaning between the original version and its translation, all translated versions were discussed for clarification, with minor changes made to the provisional Persian OSE.

Validity and Reliability

The validity and reliability of the Persian version of the Online Student Engagement Scale (OSE) in online courses were determined:

Validity: To determine the validity of the OSE, face validity, content validity, and construct validity were assessed.

- Content validity: The content validity ratio (CVR) was used. Experts were asked to evaluate each item on a three-point scale: “the item is necessary,” “the item is useful but not necessary,” and “the item is not necessary.” Given the six-expert panel, the minimum acceptable content validity ratio was set at CVR = 0.91, and items with a CVR below this value were removed. After this process, one item was removed, 20 items were reviewed and edited, and 24 items were approved. The validity and reliability of the final questionnaire were then determined by the experts.

- Construct validity: Exploratory factor analysis (EFA) was initially utilized in SPSS version 18 to determine the factor structure and assess the questionnaire’s construct validity.

- Reliability: Cronbach’s alpha was calculated to assess the questionnaire's internal consistency.
The overall reliability was confirmed with a Cronbach’s alpha score of 0.923. The Cronbach's alpha values for the individual factors were as follows: Social = 0.917, behavioral = 0.883, cognitive = 0.843, and affective = 0.872. (Figure 2)
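The coefficients above follow the standard Cronbach's alpha formula, α = k/(k−1)·(1 − Σs²ᵢ/s²ₜ), where k is the number of items, s²ᵢ the variance of item i, and s²ₜ the variance of the total score. A minimal sketch with hypothetical Likert data (not the study's responses):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-x-items score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (4 respondents x 3 items)
demo = np.array([[5, 4, 5],
                 [3, 3, 4],
                 [2, 2, 2],
                 [4, 5, 4]])
print(round(cronbach_alpha(demo), 3))  # → 0.929
```

Items that rise and fall together across respondents, as in this toy matrix, drive the ratio of item variance to total-score variance down and alpha toward 1.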

Data Collection: The study was conducted in two stages. The first stage involved translating the tool and adapting it culturally during an in-person session with the translators. The second stage focused on evaluating the tool’s psychometric features, including its validity (face, content, and construct validity) and reliability. The translated questionnaire was distributed to six experts, whose evaluations confirmed its content validity. Subsequently, the questionnaire, confirmed for reliability and validity, was sent electronically via email to all online students of the university who had at least two semesters of online education experience (135 students).

In the second part of the questionnaire, the level of engagement in the e-learning environment among online students was investigated with 24 items using a five-point Likert scale.

Results

Demographic Characteristics: A total of 125 participants responded to the questionnaire (92.6% response rate). Respondents ranged in age from 23 to 55 years, with a mean age of 36.99 ± 7.974. The group consisted of 38 males (30.4%) and 87 females (69.6%), with 67 e-learning students (53.6%), 13 medical education students (10.4%), and 45 educational technology students (36%). Other demographic characteristics are presented in Table 1.

Table 1. Demographic Characteristics of the Participants

Characteristic                    Number    Percentage
Male                              38        30.4
Female                            87        69.6
E-learning                        67        53.6
Medical education                 13        10.4
Educational technology            45        36.0
Single                            30        24.0
Married                           95        76.0
Working while studying            99        79.2
Not working while studying        26        20.8
Dormitory resident                13        10.4
Not a dormitory resident          112       89.6
Faculty member                    26        20.8
Not a faculty member              99        79.2
Lecturer                          5         19.2
Assistant professor               14        53.9
Associate professor               7         26.9

Note: percentages for academic rank (lecturer, assistant professor, associate professor) are calculated from the 26 faculty members.

Data Analysis: The collected data were analyzed using SPSS version 18 for exploratory factor analysis and Smart PLS version 3 for confirmatory factor analysis. The Fornell–Larcker criterion was used to assess divergent (discriminant) validity, which requires the indicators of each construct to correlate more strongly with their own construct than with the indicators of other constructs.

An EFA using the Principal Axis Factoring extraction method with Varimax rotation was performed on the 23 items of the scale in a sample (n = 125) to examine its factorial structure and construct validity. The appropriateness of performing EFA was confirmed by the Kaiser–Meyer–Olkin measure of sampling adequacy (KMO = 0.801) and Bartlett’s test of sphericity (χ² = 2316.824, p < 0.0001).
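The Bartlett statistic reported here can be sketched from the sample correlation matrix R as χ² = −(n − 1 − (2p + 5)/6)·ln|R| with p(p − 1)/2 degrees of freedom; a significant result means R differs from the identity matrix and factor analysis is justified. A minimal illustration with hypothetical data (not the study's):

```python
import numpy as np

def bartlett_sphericity(data: np.ndarray):
    """Bartlett's test statistic for a respondents-x-items matrix.

    Tests whether the correlation matrix differs from the identity;
    a significant chi-square justifies performing factor analysis.
    """
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    chi_sq = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) // 2
    return chi_sq, df

# Two hypothetical, perfectly uncorrelated items: R is the identity,
# so ln|R| = 0 and the statistic is exactly zero (no factorability).
data = np.array([[1.0, 1.0], [2.0, -1.0], [3.0, -1.0], [4.0, 1.0]])
chi_sq, df = bartlett_sphericity(data)
```

The p-value then comes from the chi-square distribution with df degrees of freedom; real item data with shared variance would yield a large statistic, as in the reported χ² = 2316.824.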

Descriptive Statistics: The extracted factors accounted for the total explained variance across four areas. Items in each area showed the highest correlation with their own area and low correlations with the other areas. Varimax rotation was applied to aid interpretation of total engagement in students’ learning; the rotated component matrix converged in four iterations. Students’ engagement levels were calculated on the five-point Likert scale, yielding the percentage of scores and the mean of each component, and the lowest and highest component means were identified.

Upon examining the Scree Plot, and considering that four factors had eigenvalues greater than 1, it was observed that these four factors could provide strong separation between the questions, forming four distinct areas of student engagement in learning.

Exploratory Factor Analysis: Initially, the Kaiser–Meyer–Olkin measure and Bartlett’s Test of Sphericity were used to assess data suitability for factor analysis. The KMO index of 0.801 indicated adequate sampling, while the Bartlett’s Sphericity Index was significant (2316.824, p < 0.001), confirming that the correlation matrix was not an identity matrix. The analysis included reviewing eigenvalues (greater than one), explained variance, and a scree plot to identify the number of factors present. According to Table 2 and Figure 1, both the principal components analysis and the scree plot supported the extraction of four factors, which explained 60.754% of the total variance, validating the structure of the scale.

Table 2. Factor Loadings of the Items of the Persian Version Online Student Engagement Scale

(All retained items had a CVR of 1; the factor loading is shown beside each item.)

Social:
- I study the contents of virtual courses with other students. (0.855)
- I am satisfied with the virtual class that I am participating in. (0.726)
- I try to solve difficult problems with the help of other classmates. (0.715)
- I answer virtual projects or assignments with the help of other classmates. (0.680)
- I manage my learning using the virtual system. (0.664)
- I tend to apply the knowledge I have learned in virtual classes to real problems or new situations. (0.617)

Behavioral:
- I frequently interact with other students in my virtual classes. (0.831)
- I often ask the professor about the lesson material. (0.741)
- I feel that I am connected with classmates in virtual classes. (0.706)
- I try to answer the questions that other students ask. (0.647)
- I am in private contact with the professor for further guidance. (0.643)
- I feel like I belong to the community of my virtual classmates. (0.577)
- After passing a virtual lesson, I am waiting for the next lesson. (0.441)

Cognitive:
- I can deeply analyze thoughts, experiences, and theories related to the knowledge I have learned in my virtual classes. (0.848)
- I can gain new interpretations from the knowledge I have learned in virtual classes. (0.753)
- When participating in online classes, I remove all environmental factors that cause distraction. (0.669)
- After my virtual class, I also study related educational materials. (0.650)
- I can judge the value of information related to the knowledge learned in my virtual classes. (0.553)

Affective:
- When I participate in the virtual class, I am motivated to study. (0.794)
- Virtual classes are useful for me. (0.790)
- Virtual classes increase my interest in learning. (0.718)
- It is interesting to participate in virtual classes. (0.662)

Following the rotation process, factor loadings for the 23 items are presented in Table 2. In the component matrix, four factors were extracted. In the rotated components matrix, each item aligned with only one factor, indicating a clear structure. Question 4 had a low correlation with the rest of the questions and was removed. In the initial factor analysis, five factors were identified; however, the fifth factor included only two items and was therefore removed. Ultimately, an exploratory factor analysis was conducted with four factors. These four factors had an eigenvalue greater than 1 and explained 60% of the variation in the data. Rotation converged in four iterations. After rotating the factors, the following factor structure was identified: Items (12, 13, 14, 15, 16, 22, 24) loaded on factor 1 (social); items (17, 18, 19, 20, 21, 23) loaded on factor 2 (behavioral); items (1, 2, 3, 5, 6) loaded on factor 3 (cognitive); and items (7, 8, 9, 10) loaded on factor 4 (affective).
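Given this factor structure, a respondent's subscale scores can be obtained by averaging each factor's items. A minimal scoring sketch using the reported item-to-factor mapping (the responses themselves are hypothetical):

```python
# Item-to-factor mapping from the rotated solution reported above
# (item numbers refer to the Persian OSE).
FACTORS = {
    "social":     [12, 13, 14, 15, 16, 22, 24],
    "behavioral": [17, 18, 19, 20, 21, 23],
    "cognitive":  [1, 2, 3, 5, 6],
    "affective":  [7, 8, 9, 10],
}

def subscale_means(responses):
    """Mean 1-5 Likert score per engagement factor.

    `responses` maps item number -> Likert rating (1-5).
    """
    return {
        factor: sum(responses[i] for i in items) / len(items)
        for factor, items in FACTORS.items()
    }

# A hypothetical completed questionnaire: every item rated 4 ("agree")
demo = {i: 4 for items in FACTORS.values() for i in items}
print(subscale_means(demo))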

The analyses yielded a four-factor solution with eigenvalues greater than 1 and factor loadings of 0.30 or higher, explaining 60.754% of the variance.

Construct Validity and Reliability: Formulating hypotheses about the concepts under study, testing these hypotheses, and calculating the correlation of the obtained results with the initial measurement are key steps in assessing construct validity. A high correlation coefficient indicates high construct validity.

SmartPLS 3 output includes Rho-A alongside Cronbach’s alpha (CA), composite reliability (CR), and average variance extracted (AVE) for assessing reliability and construct validity. Rho-A values falling between Cronbach’s alpha and composite reliability indicate strong reliability (Table 3).
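CR and AVE are both computed from the standardized item loadings: CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)) and AVE = mean(λ²). A minimal illustrative sketch with hypothetical loadings (not the study's):

```python
import numpy as np

def composite_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings)
    num = lam.sum() ** 2
    return num / (num + (1 - lam ** 2).sum())

def average_variance_extracted(loadings) -> float:
    """AVE = mean of the squared standardized loadings."""
    lam = np.asarray(loadings)
    return (lam ** 2).mean()

# Hypothetical standardized loadings for a four-item factor
lam = [0.8, 0.8, 0.8, 0.8]
cr = composite_reliability(lam)
ave = average_variance_extracted(lam)
```

With all loadings at 0.8, AVE is 0.64 and CR about 0.877, both above the usual 0.5 and 0.7 thresholds, mirroring the pattern reported in Table 3.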

Table 3. Construct Reliability and Validity and Changes in the Total Factor Loadings from Varimax Rotation

Factor       N   Cronbach's Alpha   Rho-A   Composite Reliability   AVE     Total   % of Variance   % Cumulative
Social       7   0.917              0.922   0.934                   0.669   4.287   18.638          18.638
Behavioral   7   0.883              0.894   0.910                   0.592   4.001   17.397          36.035
Cognitive    5   0.843              0.844   0.889                   0.616   2.929   12.736          48.771
Affective    4   0.872              0.888   0.912                   0.722   2.756   11.983          60.754

(The last three columns are the rotation sums of squared loadings from the Varimax rotation.)

Confirmatory Factor Analysis

All subsequent analyses were conducted to examine the confirmatory factor structure of the tool using Smart PLS 3 software. The factor structure was supported by SRMR = 0.166, dULS = 29.682, RMS Theta = 0.231, SSO = 2,875.00, SSE = 1,952.95, and Q² = 1 - SSE/SSO = 0.321, confirming the validity of the first-order factor model.

Construct Cross-validated Communality: The cross-validation results showed that the cross-validated communality index for the latent variables was positive, indicating that the measurement model had acceptable quality. As shown in Table 4, the Q² index revealed the predictive power of the model for the endogenous constructs; positive Q² values indicated good fit and appropriate predictive power.
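Each Q² value follows directly from the Stone–Geisser formula Q² = 1 − SSE/SSO; a small sketch reproducing two of the figures reported in Table 4:

```python
def q_squared(sso: float, sse: float) -> float:
    """Stone-Geisser Q² = 1 - SSE/SSO; positive values indicate
    predictive relevance of the construct."""
    return 1 - sse / sso

# Values reported in Table 4
print(round(q_squared(2875.00, 1952.95), 3))  # overall learning engagement → 0.321
print(round(q_squared(875, 481.703), 3))      # social factor → 0.449
```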

Table 4. Construct Cross-validated Communality

Construct               SSO       SSE        Q² = 1 - SSE/SSO
Social                  875       481.703    0.449
Behavioral              875       406.756    0.535
Cognitive               625       364.881    0.416
Affective               500       248.752    0.502
Learning engagement     2875.00   1952.95    0.321

Variance Inflation Factor (VIF): This factor indicates the degree of collinearity between latent variables. Values less than 5 are acceptable. All latent variables in the structure demonstrated acceptable values:

(Social → Learning engagement = 2.067, behavioral → Learning engagement = 1.834, cognitive → Learning engagement = 1.295, affective → Learning engagement = 1.436).
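The VIF of each predictor can be read off the diagonal of the inverse correlation matrix of the predictors (equivalently, VIF = 1/(1 − R²) from regressing that predictor on the others). A minimal sketch with two hypothetical, uncorrelated predictors (not the study's data):

```python
import numpy as np

def vif_from_predictors(X: np.ndarray) -> np.ndarray:
    """VIF of each predictor: the diagonal of the inverse correlation matrix."""
    corr = np.corrcoef(X, rowvar=False)
    return np.diag(np.linalg.inv(corr))

# Two uncorrelated hypothetical predictors yield a VIF of exactly 1 each;
# collinear predictors would push the VIF toward the 5-point cutoff.
X = np.array([[1.0, 1.0], [2.0, -1.0], [3.0, -1.0], [4.0, 1.0]])
vif = vif_from_predictors(X)
```

All four latent variables here fall well below 5 (maximum 2.067), so collinearity is not a concern for the path estimates that follow.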

Path Coefficients: The structural model and detailed observations showed significant effects of all latent variables:

- Social → Learning engagement (O = 0.475, M = 0.475, STDEV = 0.031, T = 15.103, P < 0.001)

- Behavioral → Learning engagement (O = 0.391, M = 0.391, STDEV = 0.023, T = 16.725, P < 0.001)

- Cognitive → Learning engagement (O = 0.186, M = 0.182, STDEV = 0.026, T = 7.139, P < 0.001)

- Affective → Learning engagement (O = 0.212, M = 0.210, STDEV = 0.021, T = 10.286, P < 0.001)

Convergent Validity, Reliability, and Composite Construct Reliability: All construct reliability and composite construct reliability values for the questionnaire dimensions were above 0.7, and convergent validity values exceeded 0.5, confirming high validity and reliability. The four-factor model was validated, and items within each factor demonstrated acceptable correlations.

Construct Divergent Validity: According to the Fornell–Larcker criterion (Table 5), the square root of the AVE values on the main diagonal exceeded the correlation values in the corresponding rows and columns.

Table 5. Discriminant Validity/Fornell-Larcker Criterion

                      Affective   Behavioral   Cognitive   Learning Engagement   Social
Affective             0.850
Behavioral            0.518       0.770
Cognitive             0.147       0.252        0.785
Learning engagement   0.661       0.843        0.539       0.622
Social                0.461       0.622        0.470       0.904                 0.818

This confirmed that constructs interacted more strongly with their own indicators than with others, establishing acceptable divergent validity.
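The Fornell–Larcker check amounts to a simple comparison: the square root of each construct's AVE must exceed its correlations with every other construct. A sketch with hypothetical AVE and correlation values (not the study's):

```python
def fornell_larcker_ok(ave: dict, corr: dict) -> bool:
    """True if sqrt(AVE) of every construct exceeds its correlations
    with all other constructs (discriminant validity holds)."""
    for construct, a in ave.items():
        root = a ** 0.5
        if any(r >= root
               for (x, y), r in corr.items()
               if construct in (x, y)):
            return False
    return True

# Hypothetical two-construct example: sqrt(0.72) ≈ 0.849 and
# sqrt(0.59) ≈ 0.768 both exceed the 0.51 inter-construct correlation.
ave = {"A": 0.72, "B": 0.59}
corr = {("A", "B"): 0.51}
```

Running the check on these values returns True, i.e., each construct shares more variance with its own indicators than with the other construct.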

In Figure 3, the t-statistics related to the factor loadings of each item are shown under each structure. The acceptable criterion is a value greater than 1.96. All items met this threshold. The standardized values indicate the factor loadings for each question relative to the various components, demonstrating each question’s contribution to explaining component variance. Figure 3 further demonstrates that the path coefficients for the four-factor model are statistically significant (T > 1.96, P < 0.001), confirming a robust fit for the OSE.

Factor Loadings of the Model: Factor loadings greater than 0.7 are considered acceptable; the items demonstrated strong loadings on their respective factors.

Discussion

This study investigated the psychometric characteristics of the Persian version of the Online Student Engagement Scale (OSE), comprising 23 self-report items across four engagement dimensions (social, behavioral, cognitive, and affective). Psychometric analysis of the localized version showed that the tool is valid and reliable; the evaluation followed standard procedures and drew on expert analysis and review. The investigations confirmed the face and content validity of the tool, and the construct validity analysis showed that the instrument can adequately examine the factors of learning engagement. Compared with the original version, one item (q4) was removed and some items (11, 22, 23, 24) moved between factors in the Persian version, while rotation yielded the same four components (social, behavioral, cognitive, and affective). In the analysis of students' engagement in the e-learning environment, four factors were found (social, behavioral, cognitive, and affective).

The first factor in the present study was social (original sample = 0.475, sample mean (M) = 0.475, STDEV = 0.031, T-statistics = 15.103, P < 0.001, AVE = 0.669, Cronbach's alpha = 0.917, SSO = 875, SSE = 481.703, Q² = 0.449). It relates to collaborative learning activities with other students and refers to activities in which students discuss knowledge and issues together. Since cooperative learning and interaction are becoming increasingly important in the e-learning environment, it is notable that cooperative learning appears as a separate factor in this study.

The second factor was behavioral (original sample = 0.391, sample mean (M) = 0.391, STDEV = 0.023, T-statistics = 16.725, P < 0.001, AVE = 0.592, Cronbach's alpha = 0.883, SSO = 875, SSE = 406.756, Q² = 0.535), which engages students in learning and refers to psychological factors, such as perceived bonds and connection with other students. Fredericks (Figure 2) (3) emphasized that the sense of belonging helps students to participate in classes. In light of this perspective, the behavioral factor obtained in this study is directly and significantly related to measures of the student's psychological status.

The third factor, cognitive (original sample = 0.186, sample mean (M) = 0.182, STDEV = 0.026, T-statistics = 7.139, P < 0.001, AVE = 0.616, Cronbach's alpha = 0.843, SSO = 625, SSE = 364.881, Q² = 0.416), was proposed as an influencing factor of students' learning engagement and relates to the internalization of tasks. It reflects the importance of acquiring, understanding, and using knowledge in learning, processes that affect learning progress. Fredericks (Figure 2) (3) emphasized that strengthening this skill is especially important for students who have chosen distance learning methods (based on hybrid methods), given the independence they have in receiving training and following the curriculum. Horton (11) explained that the measurement of engagement in face-to-face settings has mainly focused on behavioral or emotional types of engagement; researchers have only recently begun to pay attention to the cognitive process of learning.

The fourth factor was affective (original sample = 0.212, sample mean (M) = 0.210, STDEV = 0.021, T-statistics = 10.286, P < 0.001, AVE = 0.722, Cronbach's alpha = 0.872, SSO = 500, SSE = 248.752, Q² = 0.502), which relates to the psychological aspect of learning. This factor represents learners' thoughts or feelings, such as interest, expectations, and motivation associated with e-learning. The Online Student Engagement Scale developed by Dixson (12) comprises 19 items on a 5-point Likert scale assessing the extent to which individuals perceive their thoughts, behaviors, and feelings as representative of themselves or their conduct. Learning motivation and learning expectations are necessary for higher-level learning activities in the e-learning environment. Jung (13) stated that the level of interaction and engagement is higher when students feel the teaching presence of the professor in an authentic learning setting.

This study shows that students want to participate actively in learning in virtual courses. Garrison (14) stated that teaching presence is facilitated when students regularly interact with professors. Joo and Lim (15) emphasized that students learn successfully when they feel a high level of teaching presence through interaction and continuous engagement with the professor in online courses. Parkes (16) believed that the criteria for measuring engagement in learning tend to focus on the quantity and quality of students' engagement in activities and situations related to their studies. E-learning environments appear to be of special importance in creating cognitive and intellectual interaction and strong scientific content that can change students' understanding, perspectives, or cognitive structures. Parkes (16) stated that this factor relates to active and self-directed learning activities for students in an independent learning environment.

All paths have significant coefficients, indicating that the paths and relationships between the latent variables were chosen correctly. The matrix of correlation coefficients of the latent variables showed that learning engagement has the highest correlation with the social variable, followed by the behavioral variable, and that the cognitive, behavioral, and affective endogenous variables correlate most highly with the exogenous latent variables of their own domain. This confirms that the endogenous variables and their associated factors were properly selected. Based on the obtained results, a positive and significant statistical relationship between learning engagement and the mentioned components was shown. Considering that learning engagement includes behavioral, cognitive, and emotional engagement, the findings of the present research are consistent with those of Fredericks (3) regarding the main factors of student engagement. The social factor is related to cognitive engagement, and the affective and social factors are related to emotional engagement. Professors' supportive behaviors also motivate students and increase their engagement in the learning environment (16), showing that interaction with professors, through communicative actions such as asking for additional help or asking questions about course content, can be considered an important predictor of student engagement in e-learning.

The Fornell–Larcker criterion was used to check the divergent (discriminant) validity of the model. It requires that the square root of the average variance extracted (AVE) of each construct be greater than the correlations of that construct with the other constructs; that is, the values on the main diagonal of the matrix in Table 5 must exceed all values in the corresponding column, which is the case. Therefore, the divergent validity of the constructs was confirmed. AVE is the extracted variance related to the constructs, and values greater than 0.5 are acceptable, showing that the constructs have good validity.

Assessing student engagement can pinpoint both students who are thriving and those who require additional support to excel (17). Tools designed to gauge engagement in traditional university settings may not effectively reflect the unique engagement behaviors seen in e-learning environments, where learner competencies, motivations, and navigation methods often differ significantly (18). Studies have underscored the critical role of engagement in predicting academic success in online settings (19, 20).

The face validity results on relevance, simplicity, and fluency showed that this study is consistent with previous research using similar instruments to assess face validity across different contexts and demographics (21). This finding suggests that participants found the questionnaire items relevant and appropriate for measuring their online engagement, indicating a comprehensive understanding of the concept being measured.

The findings showed that the CVR value for each questionnaire item and the average CVR (0.91) exceeded the minimum acceptable level. Thus, these outcomes affirm the questionnaire’s content validity. These findings indicate that the questionnaire has satisfactory content validity for assessing online student engagement. These findings suggest that the questionnaire adequately captures the domain of online student engagement, which encompasses Social, Behavioral, Cognitive, and Affective engagement. This implies that the questionnaire is characterized by high relevance, clarity, and simplicity for assessing this construct. These results are aligned with prior studies that have utilized or adapted the questionnaire in different settings and among different populations (22).
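The CVR values discussed here follow Lawshe's formula, CVR = (nₑ − N/2)/(N/2), where nₑ is the number of experts rating an item "necessary" and N is the panel size. A minimal sketch for a six-expert panel, as in this study:

```python
def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2)."""
    half = n_experts / 2
    return (n_essential - half) / half

# With six experts:
print(content_validity_ratio(6, 6))  # unanimous "necessary" → 1.0
print(content_validity_ratio(5, 6))  # five of six → ≈ 0.667
```

A unanimous panel gives CVR = 1, matching the CVR column in Table 2; any dissent with a six-member panel drops the ratio sharply, which is why the acceptance threshold for small panels is set so high.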

In the process of performing EFA, four distinct factors were revealed: Behavioral, affective, cognitive, and social engagement. These factors collectively explained 60.754% of the total variance. Furthermore, CFA supported a first-order factor structure with robust fit indices, confirming that the questionnaire is valid for assessing various dimensions of online student engagement. This indicates that items of the questionnaire measure the four distinct dimensions of online student engagement, including social engagement, cognitive engagement, affective engagement, and behavioral engagement, as described by the theoretical framework of engagement, thereby affirming the construct validity of its Persian adaptation, as it yielded a similar factor structure to the original instrument. Such consistency aligns with previous studies that utilized the same questionnaire to gauge online student engagement across diverse settings and demographics (23). For instance, Hoi and Hang (24) used the questionnaire on 363 undergraduate students enrolled in an online English as a foreign language program at a large multidisciplinary university in Ho Chi Minh City, Vietnam. Their findings revealed that the questionnaire exhibited a four-factor structure with strong reliability and validity.

The complete scale and each of the four subscales (cognitive, behavioral, social, and affective engagement) demonstrated acceptable internal consistency, with a Cronbach's alpha of 0.923 for the full scale, further supporting the instrument's psychometric robustness. Cronbach's alpha coefficients above 0.7 indicate that the items on a scale consistently measure the intended construct, thereby strengthening the validity of research findings. This result aligns with prior studies that utilized the same questionnaire to assess online student engagement across various contexts and populations (25, 26). The current findings suggest that the questionnaire consistently produces reliable scores when measuring online student engagement, indicating that its items are clear, straightforward, and pertinent.
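The internal-consistency statistic reported here is the standard Cronbach's alpha, which can be sketched as follows; the Likert-scale response matrix in the example is illustrative, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)


# Illustrative 5-point Likert responses (4 respondents x 3 items):
sample = np.array([[4, 5, 4],
                   [3, 3, 3],
                   [5, 5, 4],
                   [2, 2, 1]])
alpha = cronbach_alpha(sample)  # bounded above by 1; higher = more consistent
```

Alpha approaches 1 as items covary more strongly relative to their individual variances, which is why a value such as 0.923 is read as high internal consistency.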

This instrument gives professors and university officials a practical means of measuring students' level of engagement, providing a basis for identifying and supporting active, capable students and serving as an efficient assessment tool in universities and other higher education institutions.

Conclusion

This study provides strong evidence for the psychometric properties of the OSE, based on data collected at Tehran University of Medical Sciences. A four-factor structure (social, behavioral, cognitive, and affective engagement) with satisfactory fit indices and reliability was identified. These findings extend existing knowledge of online engagement measures by confirming the reliability and validity of the OSE instrument for use in online learning contexts.

  1. Chen PSD, Lambert AD, Guidry KR. Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education. 2010; 54(4): 1222-32. doi:10.1016/j.compedu.2009.11.008.
  2. Lewis AD, Huebner ES, Malone PS, Valois RF. Life satisfaction and student engagement in adolescents. J Youth Adolesc. 2011 Mar;40(3):249-62. doi:10.1007/s10964-010-9517-6. [PMID: 20204687]
  3. Fredricks J, McColskey W, Meli J, Mordica J, Montrosse B, Mooney K. Measuring Student Engagement in Upper Elementary through High School: A Description of 21 Instruments [cited 2011 Jan 20]. Available from: https://eric.ed.gov/?id=ED514996.
  4. National Survey of Student Engagement. Engagement Insights: Survey Findings on the Quality of Undergraduate Education. Bloomington, Indiana: Center for Postsecondary Research, Indiana University, Annual Results; 2015.
  5. Li F, Qi J, Wang G, Wang X. Traditional Classroom VS E-learning in Higher Education: Difference between Students' Behavioral Engagement. International Journal of Emerging Technologies in Learning. 2014; 9(2): 48-52. doi:10.3991/ijet.v9i2.3268.
  6. Mosher R, MacGowan B. Assessing Student Engagement in Secondary Schools: Alternative Conceptions, Strategies of Assessing, and Instruments [cited 2019 Jan 16]. Available from: https://eric.ed.gov/?id=ED272812.
  7. Reschly AL, Christenson SL. Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In: Christenson SL, Reschly AL, Wylie C, editors. Handbook of Research on Student Engagement. New York: Springer; 2012: 97-131. doi:10.1007/978-1-4614-2018-7.
  8. Burch GF, Heller NA, Burch JJ, Freed R, Steed SA. Student engagement: Developing a conceptual framework and survey instrument. Journal of Education for Business. 2015; 9(4): 224-9. doi:10.1080/08832323.2015.1019821.
  9. Lee Y, Choi J. A review of online course dropout research: Implications for practice and future research. Educational Technology Research and Development. 2011; 59(5): 593-618. doi:10.1007/s11423-010-9177-y.
  10. Lee J, Song HD, Hong AJ. Exploring factors and indicators for measuring students' sustainable engagement in e-learning. Sustainability. 2019; 11(4): 985. doi:10.3390/su11040985.
  11. Horton W. E-learning by Design. San Francisco, USA: John Wiley & Sons; 2011.
  12. Dixson MD. Measuring Student Engagement in the Online Course: The Online Student Engagement Scale (OSE). Online Learning. 2015; 19(4): n4. doi:10.24059/olj.v19i4.561.
  13. Jung YJ, Lee JM. Learning engagement and persistence in massive open online courses (MOOCS). Computers & Education. 2018; 122: 9-22. doi:10.1016/j.compedu.2018.02.013.
  14. Garrison DR, Anderson T, Archer W. The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education. 2010; 13(1-2): 5-9. doi:10.1016/j.iheduc.2009.10.003.
  15. Joo YJ, Lim KY, Kim EK. Online university students' satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictors in a structural model. Computers & Education. 2011; 57(2): 1654-64. doi:10.1016/j.compedu.2011.02.008.
  16. Parkes M, Reading C, Stein S. The competencies required for effective performance in a university e-learning environment. Australasian Journal of Educational Technology. 2013; 29(6): 777-91. doi:10.14742/ajet.38.
  17. Henrie CR, Halverson LR, Graham CR. Measuring student engagement in technology-mediated learning: A review. Computers & Education. 2015; 90: 36-53. doi:10.1016/j.compedu.2015.09.005.
  18. Watted A, Barak M. Motivating factors of MOOC completers: Comparing between university-affiliated students and general participants. The Internet and Higher Education. 2018; 37: 11-20. doi:10.1016/j.iheduc.2017.12.001.
  19. Bond M, Buntins K, Bedenlier S, Zawacki-Richter O, Kerres M. Mapping research in student engagement and educational technology in higher education: A systematic evidence map. International Journal of Educational Technology in Higher Education. 2020; 17(1): 2. doi:10.1186/s41239-019-0176-8.
  20. Zen Z, Ariani F. Academic achievement: the effect of project-based online learning method and student engagement. Heliyon. 2022 Nov 12;8(11):e11509. doi:10.1016/j.heliyon.2022.e11509. [PMID: 36411883] [PMCID: PMC9674908]
  21. Wong L. Student engagement with online resources and its impact on learning outcomes. Journal of Information Technology Education: Innovations in Practice. 12(1): 129-46. doi:10.28945/1829.
  22. Lamborn S, Newmann F, Wehlage G. The Significance and Sources of Student Engagement. In: Newmann FM, editor. Student Engagement and Achievement in American Secondary Schools. New York: Teachers College Press; 1992: 11-39.
  23. Deng R, Benckendorff P, Gannaway D. Learner engagement in MOOCs: Scale development and validation. British Journal of Educational Technology. 2020; 51(1): 245-62. doi:10.1111/bjet.12810.
  24. Hoi VN, Le Hang H. The structure of student engagement in online learning: A bi-factor exploratory structural equation modeling approach. Journal of Computer Assisted Learning. 2021; 37(4): 1141-53. doi:10.1111/jcal.12551.
  25. Taghizade A, Musavian SS, Hosseininik SS. Psychometric Properties of the Persian Version of the Online Student Engagement Questionnaire: A Transcultural Adaptation and Psychometric Study. Interdisciplinary Journal of Virtual Learning in Medical Sciences. 2024; 15(3): 226-40. doi:10.30476/ijvlms.2024.102125.1299.
  26. Ben-Eliyahu A, Moore D, Dorph R, Schunn CD. Investigating the multidimensionality of engagement: Affective, behavioral, and cognitive engagement across science activities and contexts. Contemporary Educational Psychology. 2018; 53: 87-105. doi:10.1016/j.cedpsych.2018.01.002.