Azim Mirzazadeh; Maryam Alizadeh; Mohammad Shariati; Leyla Sadighpour
Abstract
Background: Although much has been written about the strategies, barriers, and facilitating factors of effective and interactive lecturing in medical education, little has been written about educational programs on effective and interactive lecturing skills for medical teachers based on peer observation of teaching.
Objectives: The current study aimed at designing and implementing an interactive and effective lecturing workshop using peer observation and feedback, and evaluating its results.
Methods: This descriptive study was conducted at Tehran University of Medical Sciences from 2015 to 2016. The study population consisted of faculty members participating in the effective and interactive lecturing workshop, and subjects were selected by the convenience sampling method. The Kirkpatrick model was used to assess the workshop; for this purpose, the levels of reaction, learning, and performance were evaluated using a valid and reliable questionnaire, the one-minute note technique, and a form addressing the extent of use of interactive techniques in the classroom six months after participation in the workshop. Data were analyzed using SPSS 22 and are presented as frequencies and means where appropriate. The notes were analyzed using manual content analysis.
Results: The participants believed that the workshop successfully encouraged them to use lecturing principles and interactive lecturing techniques, and provided them with the opportunity to practice and rethink the teaching process. The interactive techniques most frequently used six months after participation in the workshop were question and answer (Q&A) techniques, active evaluation, and use of scenarios.
Conclusions: The provision of training opportunities, observation of performance, and provision of feedback appear to be effective in improving the quality of empowerment programs. It is suggested that other empowerment programs also address this point.
Roghayeh Gandomkar; Azim Mirzazadeh; Leyla Sadighpour; Mohammad Jalili; Mojgan Safari; Batool Amini
Volume 12, Supplement, July 2015, Pages 111-118
Abstract
Background and Objective: One potential strategy for ensuring the quality of educational programs is adopting a systematic approach to their evaluation. Current evidence indicates a lack of high-quality program evaluation activities in the field of medical education. The aim of this study was to review the current status of program evaluation activities at Tehran University of Medical Sciences, Tehran, Iran, and to formulate guidelines to promote program evaluation activities at the university level.
Methods: A survey was conducted to investigate the current conditions of program evaluation using a questionnaire in 2012. Then, the comprehensive course evaluation guidelines, consisting of 22 items, were developed based on literature review, survey results, and experts’ opinions. Finally, each affiliated school developed its own evaluation plan. The evaluation taskforce reviewed evaluation plans using a checklist.
Results: Nine schools (90%) had conducted course evaluation at least once, using a single tool or resource. The views of students, faculty, staff, or alumni were used only occasionally. Moreover, 4 schools (40%) reported the evaluation results. After reviewing the 14 submitted course plans against the checklist, 51 feedback comments were provided. The most and fewest comments concerned evaluation design and implementation and evaluation infrastructure, respectively.
Conclusion: The process of developing guidelines and plans led stakeholders to a common understanding of course evaluation, which, in turn, built evaluation capacity and greater accountability.