JEEHP : Journal of Educational Evaluation for Health Professions

Search results for "Assessment" (30 articles)
Technical report
Feasibility of applying computerized adaptive testing to the Clinical Medical Science Comprehensive Examination in Korea: a psychometric study  
Jeongwook Choi, Sung-Soo Jung, Eun Kwang Choi, Kyung Sik Kim, Dong Gi Seo
J Educ Eval Health Prof. 2025;22:29.   Published online October 1, 2025
DOI: https://doi.org/10.3352/jeehp.2025.22.29
  • 1,149 View
  • 182 Download
Purpose
This study aimed to investigate the feasibility of transitioning the Clinical Medical Science Comprehensive Examination (CMSCE) to computerized adaptive testing (CAT) in Korea, thereby providing greater opportunities for medical students to accurately compare their clinical competencies with peers nationwide and to monitor their own progress.
Methods
A medical self-assessment using CAT was conducted from March to June 2023, involving 1,541 medical students who volunteered from 40 medical colleges in Korea. An item bank consisting of 1,145 items from previously administered CMSCE examinations (2019–2021) hosted by the Medical Education Assessment Corporation was established. Items were selected through 2-stage filtering, based on classical test theory (discrimination index above 0.15) and item response theory (discrimination parameter estimates above 0.6 and difficulty parameter estimates between –5 and +5). Maximum Fisher information was employed as the item selection method, and maximum likelihood estimation was used for ability estimation.
Results
The CAT was successfully administered without significant issues. The stopping rule was set at a standard error of measurement of 0.25, with a maximum of 50 items for ability estimation. The mean ability score was 0.55, with an average of 28 items administered per student. Students at extreme ability levels reached the maximum of 50 items due to the limited availability of items at appropriate difficulty levels.
Conclusion
The medical self-assessment CAT, the first of its kind in Korea, was successfully implemented nationwide without significant problems. These results indicate strong potential for expanding the use of CAT in medical education assessments.
Research article
Validation of the 21st Century Skills Assessment Scale for public health students in Thailand: a methodological study  
Suphawadee Panthumas, Kaung Zaw, Wirin Kittipichai
J Educ Eval Health Prof. 2024;21:37.   Published online December 10, 2024
DOI: https://doi.org/10.3352/jeehp.2024.21.37
  • 5,155 View
  • 403 Download
Purpose
This study aimed to develop and validate the 21st Century Skills Assessment Scale (21CSAS) for Thai public health (PH) undergraduate students using the Partnership for 21st Century Skills framework.
Methods
A cross-sectional survey was conducted among 727 first- to fourth-year PH undergraduate students from 4 autonomous universities in Thailand. Data were collected using self-administered questionnaires between January and March 2023. Exploratory factor analysis (EFA) was used to explore the underlying dimensions of 21CSAS, while confirmatory factor analysis (CFA) was conducted to test the hypothesized factor structure using Mplus software (Muthén & Muthén). Reliability and item discrimination were assessed using Cronbach’s α and the corrected item-total correlation, respectively.
Results
EFA performed on a dataset of 300 students revealed a 20-item scale with a 6-factor structure: (1) creativity and innovation; (2) critical thinking and problem-solving; (3) information, media, and technology; (4) communication and collaboration; (5) initiative and self-direction; and (6) social and cross-cultural skills. The rotated eigenvalues ranged from 2.12 to 1.73. CFA performed on another dataset of 427 students confirmed a good model fit (χ2/degrees of freedom=2.67, comparative fit index=0.93, Tucker-Lewis index=0.91, root mean square error of approximation=0.06, standardized root mean square residual=0.06), explaining 34%–71% of variance in the items. Item loadings ranged from 0.58 to 0.84. The 21CSAS had a Cronbach’s α of 0.92.
Conclusion
The 21CSAS proved to be a valid and reliable tool for assessing 21st-century skills among Thai PH undergraduate students. These findings provide insights for educational systems to inform policy, practice, and research regarding 21st-century skills among undergraduate students.
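The reliability statistics this abstract reports (Cronbach's α of 0.92 and corrected item-total correlations) can be computed from raw item scores as sketched below. The data layout and simulated scores are hypothetical, not the study's dataset.

```python
import math
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale; `items` is a list of item-score columns,
    one list of respondent scores per item."""
    k = len(items)
    n = len(items[0])
    totals = [sum(col[j] for col in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(variance(col) for col in items) / variance(totals))

def pearson(x, y):
    """Pearson product-moment correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def corrected_item_total(items, i):
    """Correlation of item i with the total score of the remaining items."""
    n = len(items[0])
    rest = [sum(items[k][j] for k in range(len(items)) if k != i) for j in range(n)]
    return pearson(items[i], rest)
```

For a well-functioning scale, each item's corrected item-total correlation should be clearly positive, and dropping a discriminating item should not raise α.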
Review
Attraction and achievement as 2 attributes of gamification in healthcare: an evolutionary concept analysis  
Hyun Kyoung Kim
J Educ Eval Health Prof. 2024;21:10.   Published online April 11, 2024
DOI: https://doi.org/10.3352/jeehp.2024.21.10
  • 7,136 View
  • 403 Download
  • 3 Web of Science
  • 4 Crossref
This study conducted a conceptual analysis of gamification in healthcare utilizing Rogers’ evolutionary concept analysis methodology to identify its attributes and provide a method for its applications in the healthcare field. Gamification has recently been used as a health intervention and education method, but the concept is used inconsistently and ambiguously. A literature review was conducted to derive definitions, surrogate terms, antecedents, influencing factors, attributes (characteristics with dimensions and features), related concepts, consequences, implications, and hypotheses from various academic fields. A total of 56 journal articles in English and Korean were retrieved between August 2 and August 7, 2023, from databases such as PubMed Central, the Institute of Electrical and Electronics Engineers, the Association for Computing Machinery Digital Library, the Research Information Sharing Service, and the Korean Studies Information Service System, using the keywords “gamification” and “healthcare.” These articles were then analyzed. Gamification in healthcare is defined as the application of game elements in health-related contexts to improve health outcomes. The attributes of this concept were categorized into 2 main areas: attraction and achievement. These categories encompass various strategies for synchronization, enjoyable engagement, visual rewards, and goal-reinforcing frames. Through a multidisciplinary analysis of the concept’s attributes and influencing factors, this paper provides practical strategies for implementing gamification in health interventions. When developing a gamification strategy, healthcare providers can reference this analysis to ensure the game elements are used both appropriately and effectively.

Citations

Citations to this article as recorded by  
  • The effect of DECO-MOM mobile application for a prenatal environmental health program on environmental health behaviors: a pilot test
    Hae Kyung Jo, Hyun Kyoung Kim
    BMC Pregnancy and Childbirth.2025;[Epub]     CrossRef
  • Virtual reality-enhanced rehabilitation for improving musculoskeletal function and recovery after trauma
    Phani Paladugu, Rahul Kumar, Joshua Ong, Ethan Waisberg, Kyle Sporn
    Journal of Orthopaedic Surgery and Research.2025;[Epub]     CrossRef
  • Emotional distress among medical students before and after the first exposure to a full cadaver dissection: a quantitative study using DASS-21
    Miral Nagy Fahmy Salama, Ahmed Mohamed Abdelkhalek, Mayssah Ahmed El Nayal, Ramya Rathan, Mona Ghazi Sayegh, Marwa Mady
    Medical Education Online.2025;[Epub]     CrossRef
  • Short-Term Impact of Digital Mental Health Interventions on Psychological Well-Being and Blood Sugar Control in Type 2 Diabetes Patients in Riyadh
    Abdulaziz M. Alodhialah, Ashwaq A. Almutairi, Mohammed Almutairi
    Healthcare.2024; 12(22): 2257.     CrossRef
Research articles
Use of learner-driven, formative, ad-hoc, prospective assessment of competence in physical therapist clinical education in the United States: a prospective cohort study  
Carey Holleran, Jeffrey Konrad, Barbara Norton, Tamara Burlis, Steven Ambler
J Educ Eval Health Prof. 2023;20:36.   Published online December 8, 2023
DOI: https://doi.org/10.3352/jeehp.2023.20.36
  • 5,109 View
  • 231 Download
  • 1 Web of Science
  • 3 Crossref
Purpose
The purpose of this project was to implement a process for learner-driven, formative, prospective, ad-hoc, entrustment assessment in Doctor of Physical Therapy clinical education. Our goals were to develop an innovative entrustment assessment tool, and then explore whether the tool detected (1) differences between learners at different stages of development and (2) differences within learners across the course of a clinical education experience. We also investigated whether there was a relationship between the number of assessments and change in performance.
Methods
A prospective observational cohort of clinical instructors (CIs) was recruited to perform learner-driven, formative, ad-hoc, prospective entrustment assessments. Two entrustable professional activities (EPAs) were used: (1) gather a history and perform an examination, and (2) implement and modify the plan of care as needed. CIs rated each activity on the entrustment scale and provided narrative support for their rating.
Results
Forty-nine learners participated across 4 clinical experiences (CEs), resulting in 453 EPA learner-driven assessments. For both EPAs, statistically significant changes were detected both between learners at different stages of development and within learners across the course of a CE. Improvement within each CE was significantly related to the number of feedback opportunities.
Conclusion
The results of this pilot study provide preliminary support for the use of learner-driven, formative, ad-hoc assessments of competence based on EPAs with a novel entrustment scale. The number of formative assessments requested correlated with change on the EPA scale, suggesting that formative feedback may augment performance improvement.

Citations

Citations to this article as recorded by  
  • Shifting Anatomy Away From a Time‐Based Model: Competency‐Based Education Insights for Anatomy Educators
    Jeb Helms, Skye Donovan
    Clinical Anatomy.2026; 39(1): 60.     CrossRef
  • Clinical Anatomy in a Physical Therapist Competency-Based Education Course: Challenges and Solutions
    Jeb Helms, Lindsay Conn, Austin Sheldon, Carl DeRosa, Kelly Macauley
    Journal of Physical Therapy Education.2026; 40(1): 23.     CrossRef
  • The Anatomy of Assessment: Competency-Based Educational Outcomes in a Doctor of Physical Therapy Anatomy Course
    Michael Morgan, Jeb Helms, Taylor Lane
    Journal of Rehabilitation Practices and Research.2026;[Epub]     CrossRef
Enhancement of the technical and non-technical skills of nurse anesthesia students using the Anesthetic List Management Assessment Tool in Iran: a quasi-experimental study  
Ali Khalafi, Maedeh Kordnejad, Vahid Saidkhani
J Educ Eval Health Prof. 2023;20:19.   Published online June 16, 2023
DOI: https://doi.org/10.3352/jeehp.2023.20.19
  • 4,107 View
  • 129 Download
  • 1 Web of Science
  • 1 Crossref
Purpose
This study investigated the effect of evaluations based on the Anesthetic List Management Assessment Tool (ALMAT) form on improving the technical and non-technical skills of final-year nurse anesthesia students at Ahvaz Jundishapur University of Medical Sciences (AJUMS).
Methods
This was a quasi-experimental study with a pre-test and post-test design. It included 45 final-year nurse anesthesia students of AJUMS and lasted for 3 months. The technical and non-technical skills of the intervention group were assessed at 4 university hospitals using formative-feedback evaluation based on the ALMAT form, from induction of anesthesia until reaching mastery and independence. Finally, the students’ degree of improvement in technical and non-technical skills was compared between the intervention and control groups. Statistical tests (the independent t-test, paired t-test, and Mann-Whitney test) were used to analyze the data.
Results
The rate of improvement in post-test scores of technical skills was significantly higher in the intervention group than in the control group (P<0.0001). Similarly, the students in the intervention group received significantly higher post-test scores for non-technical skills than the students in the control group (P<0.0001).
Conclusion
The findings of this study showed that the use of ALMAT as a formative-feedback evaluation method to evaluate technical and non-technical skills had a significant effect on improving these skills and was effective in helping students learn and reach mastery and independence.

Citations

Citations to this article as recorded by  
  • Anesthesia students’ behaviors towards medical errors: a phenomenological qualitative scenario study
    Havva Karadeniz, Hatun Erkuran
    BMC Anesthesiology.2025;[Epub]     CrossRef
Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia  
Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
J Educ Eval Health Prof. 2021;18:23.   Published online September 23, 2021
DOI: https://doi.org/10.3352/jeehp.2021.18.23
  • 8,220 View
  • 254 Download
  • 4 Web of Science
  • 5 Crossref
Purpose
This study aimed to compare the use of the tele-objective structured clinical examination (teleOSCE) with in-person assessment in a high-stakes clinical examination, in order to determine the impact of the teleOSCE on the assessment undertaken. We also discuss which skills and domains can be effectively assessed in a teleOSCE.
Methods
This study is a retrospective observational analysis. It compares the results achieved by final year medical students in their clinical examination, assessed using the teleOSCE in 2020 (n=285), with those who were examined using the traditional in-person format in 2019 (n=280). The study was undertaken at the University of New South Wales, Australia.
Results
In the domain of physical examination, students in 2020 scored 0.277 points lower than those in 2019 (mean difference=–0.277, P<0.001, effect size=0.332). Across all other domains, there was no significant difference in mean scores between 2019 and 2020.
Conclusion
The teleOSCE did not negatively affect assessment in any domain of the clinical examination except physical examination. If the teleOSCE is the future of clinical skills examination, assessment of physical examination will require concomitant workplace-based assessment.
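The effect size reported above (0.332 for the physical examination domain) is a standardized mean difference. A minimal sketch of how such a statistic is computed from two score samples, using the pooled-standard-deviation form (Cohen's d), is shown below with synthetic data; the study's raw scores are not public.

```python
import math

def cohens_d(x, y):
    """Standardized mean difference between samples x and y,
    using the pooled sample standard deviation (Cohen's d)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd
```

The sign of d tracks the direction of the raw mean difference, which is why a reported mean difference of –0.277 corresponds to the later cohort scoring lower.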

Citations

Citations to this article as recorded by  
  • Feasibility of an online clinical assessment of competence in physiotherapy students
    Brooke Flew, Lucy Chipchase, Darren Lee, Jodie A. McClelland
    Physiotherapy Theory and Practice.2025; 41(3): 508.     CrossRef
  • Navigating digital assessments in medical education: Findings from a scoping review
    Chin-Siang Ang, Sakura Ito, Jennifer Cleland
    Medical Teacher.2025; 47(8): 1274.     CrossRef
  • Feasibility and reliability of the pandemic-adapted online-onsite hybrid graduation OSCE in Japan
    Satoshi Hara, Kunio Ohta, Daisuke Aono, Toshikatsu Tamai, Makoto Kurachi, Kimikazu Sugimori, Hiroshi Mihara, Hiroshi Ichimura, Yasuhiko Yamamoto, Hideki Nomura
    Advances in Health Sciences Education.2024; 29(3): 949.     CrossRef
  • Radiography education in 2022 and beyond - Writing the history of the present: A narrative review
    Y.X. Tay, J.P. McNulty
    Radiography.2023; 29(2): 391.     CrossRef
  • Newly appointed medical faculty members’ self-evaluation of their educational roles at the Catholic University of Korea College of Medicine in 2020 and 2021: a cross-sectional survey-based study
    Sun Kim, A Ra Cho, Chul Woon Chung
    Journal of Educational Evaluation for Health Professions.2021; 18: 28.     CrossRef
Review
Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review  
Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
J Educ Eval Health Prof. 2021;18:11.   Published online June 1, 2021
DOI: https://doi.org/10.3352/jeehp.2021.18.11
  • 11,732 View
  • 481 Download
  • 17 Web of Science
  • 18 Crossref
The coronavirus disease 2019 (COVID-19) pandemic has required educators to adapt the in-person objective structured clinical examination (OSCE) to online settings in order for it to remain a critical component of the multifaceted assessment of a student’s competency. This systematic scoping review aimed to summarize the assessment methods and validity and reliability of the measurement tools used in current online OSCE (hereafter, referred to as teleOSCE) approaches. A comprehensive literature review was undertaken following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews guidelines. Articles were eligible if they reported any form of performance assessment, in any field of healthcare, delivered in an online format. Two reviewers independently screened the results and analyzed relevant studies. Eleven articles were included in the analysis. Pre-recorded videos were used in 3 studies, while observations by remote examiners through an online platform were used in 7 studies. Acceptability as perceived by students was reported in 2 studies. This systematic scoping review identified several insights garnered from implementing teleOSCEs, the components transferable from telemedicine, and the need for systemic research to establish the ideal teleOSCE framework. TeleOSCEs may be able to improve the accessibility and reproducibility of clinical assessments and equip students with the requisite skills to effectively practice telemedicine in the future.

Citations

Citations to this article as recorded by  
  • Intra-rater reliability of in-person versus simulated remote synchronous faculty evaluation of pharmacy student objective structured clinical examinations
    Alex N. Isaacs, Jenny L. Newlon, Kimberly S. Illingworth, Alan J. Zillich, Zachary A. Weber, Jamie L. Woodyard
    Currents in Pharmacy Teaching and Learning.2026; 18(2): 102520.     CrossRef
  • Introducing a Smartphone Tele-Objective Structured Clinical Examination to Support High-Stakes Competency Decisions: A Quasi-Experimental Study and Curricular Implications
    Xiaozhi Wang, Junjie Du, Binlin Luo, Liling Chen, Huanhuan Chen, Surong Jiang, Wei Sun, Lei Zhou, Lars Konge, Hua Huang, Qiang Ding
    Journal of Medical Education and Curricular Development.2026;[Epub]     CrossRef
  • Validity of pediatric index of mortality 2 scoring in a pediatric ICU setting in Ethiopia: A prospective observational study
    Mikiyas G. Teferi, Bethel A. Awoke, Iyassu S. Melkie, Oghenekenu Oshare, Soliana Solomon Birhanu, Gelila Gemeda Gage, Biruk T. Mengistie, Chernet T. Mengistie, Alehilign M. Abebe
    Global Pediatrics.2026; 15: 100322.     CrossRef
  • Assessing fundamental clinical skills of osteopathic medical students
    John R. Boulet, Jeanne M. Sandella, John Gimpel, Richard LaBaere
    Journal of Osteopathic Medicine.2025; 125(10): 469.     CrossRef
  • Exploring an online clinical competency assessment: an alternative to a traditional in-person assessment for internationally trained physiotherapists
    Brooke Flew, Lucy Chipchase, Darren Lee, Jodie A. McClelland
    BMC Medical Education.2025;[Epub]     CrossRef
  • Structured oral clinical assessment for pharmacotherapy competencies in medical education: a study of validity and reliability analyses of seven domains
    Abdul Khairul Rizki Purba, David Sontani Perdanakusuma, Arifa Mustika, Tanja Fens, Maarten Jacobus Postma
    Korean Journal of Medical Education.2025; 37(3): 331.     CrossRef
  • Feasibility and reliability of the pandemic-adapted online-onsite hybrid graduation OSCE in Japan
    Satoshi Hara, Kunio Ohta, Daisuke Aono, Toshikatsu Tamai, Makoto Kurachi, Kimikazu Sugimori, Hiroshi Mihara, Hiroshi Ichimura, Yasuhiko Yamamoto, Hideki Nomura
    Advances in Health Sciences Education.2024; 29(3): 949.     CrossRef
  • A level playing field? Evaluation of the virtual Objective Structured Clinical Examination in Psychiatry and Addiction Medicine: A mixed methods study
    Rebecca E Reay, Paul A Maguire, Jeffrey CL Looi
    Australasian Psychiatry.2024; 32(4): 359.     CrossRef
  • Conducting an objective structured clinical examination under COVID-restricted conditions
    Andrea Gotzmann, John Boulet, Yichi Zhang, Judy McCormick, Mathieu Wojcik, Ilona Bartman, Debra Pugh
    BMC Medical Education.2024;[Epub]     CrossRef
  • The virtual Clinical Assessment of Skills and Competence: the impact and challenges of a digitised final examination
    Kenny Chu, Shivanthi Sathanandan
    BJPsych Bulletin.2023; 47(2): 110.     CrossRef
  • Virtual Learning and Assessment in Rheumatology Fellowship Training: Objective Structured Clinical Examination Revisited
    Rachel M. Wolfe, Faye N. Hant, Rumey C. Ishizawar, Lisa G. Criscione‐Schreiber, Beth L. Jonas, Kenneth S. O'Rourke, Marcy B. Bolster
    Arthritis Care & Research.2023; 75(12): 2435.     CrossRef
  • Innovations in assessment in health professions education during the COVID‐19 pandemic: A scoping review
    Jamal Giri, Claire Stewart
    The Clinical Teacher.2023;[Epub]     CrossRef
  • Evaluation of the Utility of Online Objective Structured Clinical Examination Conducted During the COVID-19 Pandemic
    Mona Arekat, Mohamed Hany Shehata, Abdelhalim Deifalla, Ahmed Al-Ansari, Archana Kumar, Mohamed Alsenbesy, Hamdi Alshenawi, Amgad El-Agroudy, Mariwan Husni, Diaa Rizk, Abdelaziz Elamin, Afif Ben Salah, Hani Atwa
    Advances in Medical Education and Practice.2022; 13: 407.     CrossRef
  • Comparison of student pharmacists' performance on in-person vs. virtual OSCEs in a pre-APPE capstone course
    Justine S. Gortney, Joseph P. Fava, Andrew D. Berti, Brittany Stewart
    Currents in Pharmacy Teaching and Learning.2022; 14(9): 1116.     CrossRef
  • Is online objective structured clinical examination teaching an acceptable replacement in post-COVID-19 medical education in the United Kingdom?: a descriptive study
    Vashist Motkur, Aniket Bharadwaj, Nimalesh Yogarajah
    Journal of Educational Evaluation for Health Professions.2022; 19: 30.     CrossRef
  • Equal Z standard-setting method to estimate the minimum number of panelists for a medical school’s objective structured clinical examination in Taiwan: a simulation study
    Ying-Ying Yang, Pin-Hsiang Huang, Ling-Yu Yang, Chia-Chang Huang, Chih-Wei Liu, Shiau-Shian Huang, Chen-Huan Chen, Fa-Yauh Lee, Shou-Yen Kao, Boaz Shulruf
    Journal of Educational Evaluation for Health Professions.2022; 19: 27.     CrossRef
  • Applying the Student Response System in the Online Dermatologic Video Curriculum on Medical Students' Interaction and Learning Outcomes during the COVID-19 Pandemic
    Chih-Tsung Hung, Shao-An Fang, Feng-Cheng Liu, Chih-Hsiung Hsu, Ting-Yu Yu, Wei-Ming Wang
    Indian Journal of Dermatology.2022; 67(4): 477.     CrossRef
  • Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia
    Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
    Journal of Educational Evaluation for Health Professions.2021; 18: 23.     CrossRef
Research articles
Development and validation of a portfolio assessment system for medical schools in Korea  
Dong Mi Yoo, A Ra Cho, Sun Kim
J Educ Eval Health Prof. 2020;17:39.   Published online December 9, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.39
  • 8,288 View
  • 260 Download
  • 5 Web of Science
  • 6 Crossref
Purpose
Consistent evaluation procedures based on objective and rational standards are essential for the sustainability of portfolio-based education, which has been widely introduced in medical education. We aimed to develop and implement a portfolio assessment system, and to assess its validity and reliability.
Methods
We developed a portfolio assessment system from March 2019 to August 2019 and confirmed its content validity through review by an expert group comprising 2 medical education specialists, 2 professors involved in education at medical school, and a professor of basic medical science. Six trained assessors conducted 2 rounds of evaluation of 7 randomly selected portfolios for the “Self-Development and Portfolio II” course from January 2020 to July 2020. Using these data, inter-rater reliability was evaluated with intraclass correlation coefficients (ICCs) in September 2020.
Results
The portfolio assessment system is based on the following process: assessor selection, training, analytical/comprehensive evaluation, and consensus. Appropriately trained assessors evaluated portfolios based on specific assessment criteria and a rubric for assigning points. In the analysis of inter-rater reliability, after the first round of evaluation grades was submitted, all assessment areas except “goal-setting” showed a high ICC of 0.81 or higher. After the first round of assessment, we attempted to standardize objective assessment procedures. As a result, all components of the assessments showed close correlations, with ICCs of 0.81 or higher.
Conclusion
We confirmed that when appropriately trained assessors conduct portfolio assessment based on specified standards through a systematic procedure, the results are reliable.
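The inter-rater reliability statistic this abstract relies on (ICC ≥ 0.81 across assessment areas) can be illustrated with a short sketch. The one-way random-effects form ICC(1,1) is used here purely as an example; the abstract does not specify which ICC variant the study computed.

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1).
    `ratings[target][rater]` holds one score per (target, rater) pair."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    means = [sum(row) / k for row in ratings]
    # Between-target and within-target mean squares from a one-way ANOVA.
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for row, m in zip(ratings, means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Perfectly agreeing raters yield an ICC of 1.0; rater disagreement inflates the within-target mean square and pulls the coefficient toward 0.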

Citations

Citations to this article as recorded by  
  • Integrating portfolio and mentorship in competency-based medical education: a Middle East experience
    Mariam Shadan, Rania H. Shalaby, Arina Ziganshina, Samar Ahmed
    BMC Medical Education.2025;[Epub]     CrossRef
  • Development and integration of a clinical dashboard within a dental school setting
    Fatemeh S. Afshari, Judy Chia‐Chun Yuan, Cortino Sukotjo, Susan A. Rowan, Michael L. Spector
    Journal of Dental Education.2024; 88(11): 1539.     CrossRef
  • Inter-rater reliability and content validity of the measurement tool for portfolio assessments used in the Introduction to Clinical Medicine course at Ewha Womans University College of Medicine: a methodological study
    Dong-Mi Yoo, Jae Jin Han
    Journal of Educational Evaluation for Health Professions.2024; 21: 39.     CrossRef
  • Development of an electronic learning progression dashboard to monitor student clinical experiences
    Hollis Lai, Nazila Ameli, Steven Patterson, Anthea Senior, Doris Lunardon
    Journal of Dental Education.2022; 86(6): 759.     CrossRef
  • Medical Student Portfolios: A Systematic Scoping Review
    Rei Tan, Jacquelin Jia Qi Ting, Daniel Zhihao Hong, Annabelle Jia Sing Lim, Yun Ting Ong, Anushka Pisupati, Eleanor Jia Xin Chong, Min Chiam, Alexia Sze Inn Lee, Laura Hui Shuen Tan, Annelissa Mien Chew Chin, Limin Wijaya, Warren Fong, Lalit Kumar Radha K
    Journal of Medical Education and Curricular Development.2022;[Epub]     CrossRef
  • Development of Teaching and Learning Manual for Competency-Based Practice for Meridian & Acupuncture Points Class
    Eunbyul Cho, Jiseong Hong, Yeonkyeong Nam, Haegue Shin, Jae-Hyo Kim
    Korean Journal of Acupuncture.2022; 39(4): 184.     CrossRef
Estimation of item parameters and examinees’ mastery probability in each domain of the Korean Medical Licensing Examination using a deterministic inputs, noisy “and” gate (DINA) model  
Younyoung Choi, Dong Gi Seo
J Educ Eval Health Prof. 2020;17:35.   Published online November 17, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.35
  • 7,466 View
  • 109 Download
  • 1 Crossref
Purpose
The deterministic inputs, noisy “and” gate (DINA) model is a promising statistical method for providing useful diagnostic information about students’ level of achievement, as educators often want to receive diagnostic information on how examinees did on each content strand, which is referred to as a diagnostic profile. The purpose of this paper was to classify examinees of the Korean Medical Licensing Examination (KMLE) in different content domains using the DINA model.
Methods
This paper analyzed data from the KMLE, with 360 items and 3,259 examinees. An application study was conducted to estimate examinees’ parameters and item characteristics. The guessing and slipping parameters of each item were estimated, and statistical analysis was conducted using the DINA model.
Results
The output table shows examples of some items that can be used to check item quality. The probabilities of mastery of each content domain were also estimated, indicating the mastery profile of each examinee. The classification accuracy and consistency for 8 content domains ranged from 0.849 to 0.972 and from 0.839 to 0.994, respectively. As a result, the classification reliability of the cognitive diagnosis model was very high for the 8 content domains of the KMLE.
Conclusion
This mastery profile can provide useful diagnostic information for each examinee in terms of each content domain of the KMLE. Individual mastery profiles allow educators and examinees to understand which domain(s) should be improved in order to master all domains in the KMLE. In addition, all items showed reasonable results in terms of item parameters.
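The DINA model's core mechanics (each item's guessing and slipping parameters, plus a Q-matrix mapping items to required attributes) can be sketched briefly. The profile-classification step below uses a uniform prior over attribute profiles for illustration; the Q-matrix, parameter values, and response pattern are hypothetical, not KMLE data.

```python
from itertools import product

def dina_prob(alpha, q, guess, slip):
    """P(correct) under DINA: (1 - slip) if the examinee masters every
    attribute the item requires, otherwise guess."""
    eta = all(a >= qk for a, qk in zip(alpha, q))
    return 1.0 - slip if eta else guess

def mastery_posterior(responses, qmatrix, guess, slip):
    """Posterior over binary attribute profiles given a response vector,
    assuming a uniform prior over profiles."""
    n_attr = len(qmatrix[0])
    post = {}
    for alpha in product((0, 1), repeat=n_attr):
        lik = 1.0
        for x, q, g, s in zip(responses, qmatrix, guess, slip):
            p = dina_prob(alpha, q, g, s)
            lik *= p if x == 1 else 1.0 - p
        post[alpha] = lik
    total = sum(post.values())
    return {a: v / total for a, v in post.items()}
```

The profile with the highest posterior probability is the examinee's estimated mastery profile, which is exactly the kind of domain-level diagnostic feedback the abstract describes.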

Citations

Citations to this article as recorded by  
  • Large-Scale Parallel Cognitive Diagnostic Test Assembly Using A Dual-Stage Differential Evolution-Based Approach
    Xi Cao, Ying Lin, Dong Liu, Henry Been-Lirn Duh, Jun Zhang
    IEEE Transactions on Artificial Intelligence.2024; 5(6): 3120.     CrossRef
Reviews
A proposal for the future of medical education accreditation in Korea  
Ki-Young Lim
J Educ Eval Health Prof. 2020;17:32.   Published online October 21, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.32
  • 7,505 View
  • 146 Download
  • 4 Web of Science
  • 8 Crossref
For the past 20 years, the medical education accreditation program of the Korean Institute of Medical Education and Evaluation (KIMEE) has contributed significantly to the standardization and improvement of the quality of basic medical education in Korea. It should now contribute to establishing and promoting the future of medical education. The Accreditation Standards of KIMEE 2019 (ASK2019) have been adopted since 2019, with the goal of achieving world-class medical education by applying a learner-centered curriculum using a continuum framework for the 3 phases of formal medical education: basic medical education, postgraduate medical education, and continuing professional development. ASK2019 will also be able to promote medical education that meets community needs and employs systematic assessments throughout the education process. These are important changes that can be used to gauge the future of the medical education accreditation system. Furthermore, globalization, inter-professional education, health systems science, and regular self-assessment systems are emerging as essential topics for the future of medical education. It is time for the medical education accreditation system in Korea to observe and adopt new trends in global medical education.

Citations

Citations to this article as recorded by  
  • Development of Clinical Reasoning and History-Taking Remediation Training
    Giray Kolcu, Sebahat Ulusan, Mukadder İnci Başer Kolcu
    Journal of Medical Education and Family Medicine.2025; 2(1): 6.     CrossRef
  • Perceived effects of accreditation on education quality and health-related job outcomes: scales validation and correlates in Lebanon
    Deema Rahme, Daniele Saade, Hala Sacre, Chadia Haddad, Samah Tawil, Samar Merhi, Randa Aoun, Aline Hajj, Fouad Sakr, Jihan Safwan, Marwan Akel, Rony M. Zeenny, Pascale Salameh
    BMC Medical Education.2025;[Epub]     CrossRef
  • Analyzing the characteristics of mission statements in Korean medical schools based on the Korean Doctor’s Role framework
    Ye Ji Kang, Soomin Lee, Hyo Jeong Lee, Do-Hwan Kim
    Korean Journal of Medical Education.2024; 36(1): 99.     CrossRef
  • Challenges and potential improvements in the Accreditation Standards of the Korean Institute of Medical Education and Evaluation 2019 (ASK2019) derived through meta-evaluation: a cross-sectional study
    Yoonjung Lee, Min-jung Lee, Junmoo Ahn, Chungwon Ha, Ye Ji Kang, Cheol Woong Jung, Dong-Mi Yoo, Jihye Yu, Seung-Hee Lee
    Journal of Educational Evaluation for Health Professions.2024; 21: 8.     CrossRef
  • Accreditation standards items of post-2nd cycle related to the decision of accreditation of medical schools by the Korean Institute of Medical Education and Evaluation
    Kwi Hwa Park, Geon Ho Lee, Su Jin Chae, Seong Yong Kim
    Korean Journal of Medical Education.2023; 35(1): 1.     CrossRef
  • Continuing Professional Development of Pharmacists and The Roles of Pharmacy Schools
    Hyemin Park, Jeong-Hyun Yoon
    Korean Journal of Clinical Pharmacy.2022; 32(4): 281.     CrossRef
  • Definition of character for medical education based on expert opinions in Korea
    Yera Hur
    Journal of Educational Evaluation for Health Professions.2021; 18: 26.     CrossRef
  • Special reviews on the history and future of the Korean Institute of Medical Education and Evaluation to memorialize its collaboration with the Korea Health Personnel Licensing Examination Institute to designate JEEHP as a co-official journal
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2020; 17: 33.     CrossRef
Is accreditation in medical education in Korea an opportunity or a burden?  
Hanna Jung, Woo Taek Jeon, Shinki An
J Educ Eval Health Prof. 2020;17:31.   Published online October 21, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.31
  • 9,487 View
  • 176 Download
  • 13 Web of Science
  • 17 Crossref
Abstract
The accreditation process is both an opportunity and a burden for medical schools in Korea. The line that separates the two is based on how medical schools recognize and utilize the accreditation process. In other words, accreditation is a burden for medical schools if they view the accreditation process as merely a formal procedure or a means to maintain accreditation status for medical education. However, if medical schools acknowledge the positive value of the accreditation process, accreditation can be both an opportunity and a tool for developing medical education. The accreditation process has educational value by catalyzing improvements in the quality, equity, and efficiency of medical education and by increasing the available options. For the accreditation process to contribute to medical education development, accrediting agencies and medical schools must first be recognized as partners of an educational alliance working together towards common goals. Secondly, clear guidelines on accreditation standards should be periodically reviewed and shared. Finally, a formative self-evaluation process must be introduced for institutions to utilize the accreditation process as an opportunity to develop medical education. This evaluation system could be developed through collaboration among medical schools, academic societies for medical education, and the accrediting authority.

Citations to this article as recorded by
  • Curriculum development as a governance process: insights from a new PharmD program
    Geneviève Gauthier, Françoise Moreau-Johnson, Christine Landry, Tim Dubé, Claire Touchie
    Advances in Health Sciences Education.2026;[Epub]     CrossRef
  • Equity in Basic Medical Education accreditation standards: a scoping review protocol
    Neelofar Shaheen, Usman Mahboob, Ahsan Sethi, Muhammad Irfan
    BMJ Open.2025; 15(1): e086661.     CrossRef
  • Making Medical Education Socially Accountable in Australia and Southeast Asia: A Systematic Review
    Jyotsna Rimal, Ashish Shrestha, Elizabeth Cardell, Stephen Billett, Alfred King-yin Lam
    Medical Science Educator.2025; 35(3): 1767.     CrossRef
  • Impacts of the Accreditation Process for Undergraduate Medical Schools: A Scoping Review
    Leticia C. Girotto, Karynne B. Machado, Roberta F. C. Moreira, Milton A. Martins, Patrícia Z. Tempski
    The Clinical Teacher.2025;[Epub]     CrossRef
  • Benefits, Challenges, and Opportunities of LCME Accreditation: A National Survey
    Colleen Hayden, David Muller, Allan R. Tunkel, Reena Karani, Jennifer Christner, Abbas Hyderi, Robert Fallar
    Medical Science Educator.2025; 35(5): 2533.     CrossRef
  • Impact of accreditation on medical education quality improvement in 82 medical schools in Japan: a descriptive study
    Nobuo Nara
    Journal of Educational Evaluation for Health Professions.2025; 22: 22.     CrossRef
  • Two decades of accreditation in Chilean medical education: outcomes and lessons learned
    Oscar Jerez Yañez, Carlos Schade Carter, Miguel Altamirano Rivas, Bárbara Carrasco García
    BMC Medical Education.2025;[Epub]     CrossRef
  • Global research agenda for medical education regulation: findings from a nominal group consensus exercise
    Valdes Roberto Bollela, Vanessa Burch, Kadambari Dharanipragada, Janneke Frambach, Janet Grant, Lois Haruna-Cooper, Homa Kabiri, James Kelly, Maria-Athina Martimianakis, Fernando Menezes da Silva, Lamiaa Mohsen, John-george Nicholson, Mohammed Ahmed Rashi
    BMJ Global Health.2025; 10(12): e016014.     CrossRef
  • To prove or improve? Examining how paradoxical tensions shape evaluation practices in accreditation contexts
    Betty Onyura, Abigail J. Fisher, Qian Wu, Shrutikaa Rajkumar, Sarick Chapagain, Judith Nassuna, David Rojas, Latika Nirula
    Medical Education.2024; 58(3): 354.     CrossRef
  • ASPIRE for excellence in curriculum development
    John Jenkins, Sharon Peters, Peter McCrorie
    Medical Teacher.2024; 46(5): 633.     CrossRef
  • Challenges and potential improvements in the Accreditation Standards of the Korean Institute of Medical Education and Evaluation 2019 (ASK2019) derived through meta-evaluation: a cross-sectional study
    Yoonjung Lee, Min-jung Lee, Junmoo Ahn, Chungwon Ha, Ye Ji Kang, Cheol Woong Jung, Dong-Mi Yoo, Jihye Yu, Seung-Hee Lee
    Journal of Educational Evaluation for Health Professions.2024; 21: 8.     CrossRef
  • Accreditation standards items of post-2nd cycle related to the decision of accreditation of medical schools by the Korean Institute of Medical Education and Evaluation
    Kwi Hwa Park, Geon Ho Lee, Su Jin Chae, Seong Yong Kim
    Korean Journal of Medical Education.2023; 35(1): 1.     CrossRef
  • The Need for the Standards for Anatomy Labs in Medical School Evaluation and Accreditation
    Yu-Ran Heo, Jae-Ho Lee
    Anatomy & Biological Anthropology.2023; 36(3): 81.     CrossRef
  • Seal of Approval or Ticket to Triumph? The Impact of Accreditation on Medical Student Performance in Foreign Medical Council Examinations
    Saurabh RamBihariLal Shrivastava, Titi Savitri Prihatiningsih, Kresna Lintang Pratidina
    Indian Journal of Medical Specialities.2023; 14(4): 249.     CrossRef
  • Internal evaluation in the faculties affiliated to zanjan university of medical sciences: Quality assurance of medical science education based on institutional accreditation
    Alireza Abdanipour, Farhad Ramezani‐Badr, Ali Norouzi, Mehdi Ghaemi
    Journal of Medical Education Development.2022; 15(46): 61.     CrossRef
  • Development of Mission and Vision of College of Korean Medicine Using the Delphi Techniques and Big-Data Analysis
    Sanghee Yeo, Seong Hun Choi, Su Jin Chae
    Journal of Korean Medicine.2021; 42(4): 176.     CrossRef
  • Special reviews on the history and future of the Korean Institute of Medical Education and Evaluation to memorialize its collaboration with the Korea Health Personnel Licensing Examination Institute to designate JEEHP as a co-official journal
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2020; 17: 33.     CrossRef
Research articles
Development of a checklist to validate the framework of a narrative medicine program based on Gagne’s instructional design model in Iran through consensus of a multidisciplinary expert panel  
Saeideh Daryazadeh, Nikoo Yamani, Payman Adibi
J Educ Eval Health Prof. 2019;16:34.   Published online October 31, 2019
DOI: https://doi.org/10.3352/jeehp.2019.16.34
  • 11,667 View
  • 188 Download
  • 1 Web of Science
  • 2 Crossref
Abstract
Purpose
Narrative medicine is a patient-centered approach focusing on the development of narrative skills and self-awareness that incorporates “attending, representing, and affiliating” in clinical encounters. Acquiring narrative competency promotes clinical performance, and narratives can be used for teaching professionalism, empathy, multicultural education, and professional development. This study was conducted to develop a checklist to validate the framework of a narrative medicine program through consensus of a panel.
Methods
This expert panel study was conducted from 2018 to 2019 at Isfahan University of Medical Sciences, Iran. It included 2 phases: developing a framework in 2 steps and forming an expert panel to validate the framework in 3 rounds. We adapted a 3-stage narrative medicine model with 9 training activities from Gagne’s theory, developed a framework, and then produced a checklist to validate the framework in a multidisciplinary expert panel that consisted of 7 experts. The RAND/UCLA appropriateness method was used to assess the experts’ agreement. The first-round opinions were received by email. Consensus was achieved in the second and third rounds through face-to-face meetings to facilitate interactions and discussion among the experts.
Results
Sixteen valid indicators were approved, with 100% agreement among the experts (median values in the range of 7–9 out of a maximum of 9, with no disagreement), and the framework was validated by the expert panel.
Conclusion
The 16 checklist indicators can be used to evaluate narrative medicine programs as a simple and practical guide to improve teaching effectiveness and promote life-long learning.

Citations to this article as recorded by
  • Challenges of Implementing the First Narrative Medicine Course for Teaching Professionalism in Iran: A Qualitative Content Analysis
    Saeideh Daryazadeh, Payman Adibi, Nikoo Yamani
    Educational Research in Medical Sciences.2022;[Epub]     CrossRef
  • Impact of a narrative medicine program on reflective capacity and empathy of medical students in Iran
    Saeideh Daryazadeh, Payman Adibi, Nikoo Yamani, Roya Mollabashi
    Journal of Educational Evaluation for Health Professions.2020; 17: 3.     CrossRef
Development of a self-assessment tool for resident doctors’ communication skills in India  
Upendra Baitha, Piyush Ranjan, Siddharth Sarkar, Charu Arora, Archana Kumari, Sada Nand Dwivedi, Asmita Patil, Nayer Jamshed
J Educ Eval Health Prof. 2019;16:17.   Published online June 24, 2019
DOI: https://doi.org/10.3352/jeehp.2019.16.17
  • 17,229 View
  • 303 Download
  • 13 Web of Science
  • 13 Crossref
Abstract
Purpose
Effective communication skills are essential for resident doctors to provide optimum patient care. This study was conducted to develop and validate a questionnaire for the self-assessment of resident doctors’ communication skills in India.
Methods
This was a mixed-methods study conducted in 2 phases. The first phase consisted of questionnaire development, including the identification of relevant literature, focus group discussions with residents and experts from clinical specialties, and pre-testing of the questionnaire. The second phase involved administering the questionnaire survey to 95 residents from the Departments of Medicine, Emergency Medicine, Pediatrics, and Surgery at the All India Institute of Medical Sciences, New Delhi, India in April 2019. Internal consistency was tested and the factor structure was analyzed to test construct validity.
Results
The questionnaire consisted of 3 sections: (A) 4 items on doctor-patient conflicts and the role of communication skills in avoiding these conflicts, (B) 29 items on self-assessment of communication skills in different settings, and (C) 8 items on barriers to practicing good communication skills. Sections B and C had good internal consistency (Cronbach α: 0.885 and 0.771, respectively). Section C had a 2-factor solution, and the barriers were classified as ‘training’ and ‘infrastructure’ factors.
Conclusion
This appears to be a valid assessment tool of resident doctors’ communication skills, with potential utility for identifying gaps in communication skills and developing communication skills modules.
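The internal-consistency figures reported above (Cronbach α of 0.885 and 0.771) follow from the standard alpha formula: the number of items scaled against the ratio of summed item variances to total-score variance. A minimal sketch, using an invented response matrix rather than the study’s data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents answering 4 Likert-type items (hypothetical)
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(scores), 3))
```

Note the `ddof=1` arguments: alpha is conventionally computed with sample (n−1) variances, and values above roughly 0.7 are usually read as acceptable internal consistency, matching the interpretation in the abstract.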

Citations to this article as recorded by
  • Evaluating Communication Skills in Health Professions Education: A Systematic Review of Assessment Tools
    Kelly Macauley, Jacque Bradford, Cathy Stucker, Carol Likens, Lin Wu
    The Clinical Teacher.2026;[Epub]     CrossRef
  • Are we ready for the feedback? A qualitative exploratory study on residents’ awareness and readiness for multisource (360°) feedback in medical residency
    Sumera Nisar, Ayesha Jamal, Husna Irfan Thalib, Sariya Khan, Sayeeda Mehveen, Aseef Rehman, Ghala AlSughayyir, Ammar Yassir Bafarat, Hiba Hisham Alshikh Ali
    BMJ Open.2025; 15(12): e101924.     CrossRef
  • Development and Validation of a Comprehensive Nursing Competence Assessment Questionnaire (CNCAQ) to Determine the Perceived Clinical Competence of Nursing Graduates
    Sunita Srivastava, Hariprasath Pandurangan, Anil Kumar
    Nursing & Midwifery Research Journal.2024; 20(2): 96.     CrossRef
  • Socio‐behaviour change intervention in health care professionals: Impact and effectiveness
    Chinmay Shah, Fouzia Shersad
    Medical Education.2024; 58(11): 1422.     CrossRef
  • Communication skills of residents: are they as good as they think?
    Namra Qadeer Shaikh, Ali Aahil Noorali, Asma Altaf Hussain Merchant, Noreen Afzal, Maryam Pyar Ali Lakhdir, Komal Abdul Rahim, Syeda Fatima Shariq, Rida Ahmad, Saqib Kamran Bakhshi, Saad Bin Zafar Mahmood, Shayan Shah, Muhammad Rizwan Khan, Muhammad Tariq
    Medical Education Online.2024;[Epub]     CrossRef
  • Leveraging the vantage point – exploring nurses’ perception of residents’ communication skills: a mixed-methods study
    Komal Abdul Rahim, Maryam Pyar Ali Lakhdir, Noreen Afzal, Asma Altaf Hussain Merchant, Namra Qadeer Shaikh, Ali Aahil Noorali, Umar Tariq, Rida Ahmad, Saqib Kamran Bakhshi, Saad bin Zafar Mahmood, Muhammad Rizwan Khan, Muhammed Tariq, Adil H. Haider
    BMC Medical Education.2023;[Epub]     CrossRef
  • Developing a communication-skills training curriculum for resident-physicians to enhance patient outcomes at an academic medical centre: an ongoing mixed-methods study protocol
    Hamna Shahbaz, Ali Aahil Noorali, Maha Inam, Namra Qadeer, Asma Altaf Hussain Merchant, Adnan Ali Khan, Noreen Afzal, Komal Abdul Rahim, Ibrahim Munaf, Rida Ahmad, Muhammad Tariq, Adil H Haider
    BMJ Open.2022; 12(8): e056840.     CrossRef
  • A cross-sectional evaluation of communication skills and perceived barriers among the resident doctors at a tertiary care center in India
    Amandeep Singh, Piyush Ranjan, Archana Kumari, Siddharth Sarkar, Tanveer Kaur, Ramesh Aggarwal, Ashish Datt Upadhyay, Biswaroop Chakrawarty, Jamshed Nayer, Mohit Joshi, Avinash Chakrawarty
    Journal of Education and Health Promotion.2022; 11(1): 425.     CrossRef
  • What do clinical resident doctors think about workplace violence? A qualitative study comprising focus group discussions and thematic analysis from a tertiary care center of India
    Amandeep Singh, Piyush Ranjan, Siddharth Sarkar, Tarang Preet Kaur, Roshan Mathew, Dinesh Gora, Ajay Mohan, Jaswant Jangra
    Journal of Family Medicine and Primary Care.2022; 11(6): 2678.     CrossRef
  • Development and validation of a questionnaire to assess preventive practices against COVID-19 pandemic in the general population
    Ayush Agarwal, Piyush Ranjan, Priyanka Rohilla, Yellamraju Saikaustubh, Anamika Sahu, Sada Nand Dwivedi, Aakansha, Upendra Baitha, Arvind Kumar
    Preventive Medicine Reports.2021; 22: 101339.     CrossRef
  • Development and Validation of a Comprehensive Questionnaire to Assess Interpersonal Discord (Bullying, Harassment, and Discrimination) at the Workplace in a Healthcare Setting
    Amandeep Singh, Piyush Ranjan, Tanveer Kaur, Siddharth Sarkar, Ashish D Upadhyay, Upendra Baitha, Prayas Sethi, Ranveer S Jadon, Pankaj Jorwal
    Cureus.2021;[Epub]     CrossRef
  • Development and Validation of a Questionnaire to Evaluate Workplace Violence in Healthcare Settings
    Archana Kumari, Amandeep Singh, Piyush Ranjan, Siddharth Sarkar, Tanveer Kaur, Ashish D Upadhyay, Kirti Verma, Vignan Kappagantu, Ajay Mohan, Upendra Baitha
    Cureus.2021;[Epub]     CrossRef
  • The value of communicating with patients in their first language
    Piyush Ranjan, Archana Kumari, Charu Arora
    Expert Review of Pharmacoeconomics & Outcomes Research.2020; 20(6): 559.     CrossRef
Educational/faculty development material
Analysis of the Clinical Education Situation framework: a tool for identifying the root cause of student failure in the United States  
Katherine Myers, Kyle Covington
J Educ Eval Health Prof. 2019;16:11.   Published online May 10, 2019
DOI: https://doi.org/10.3352/jeehp.2019.16.11
  • 18,173 View
  • 304 Download
  • 2 Web of Science
  • 5 Crossref
Abstract
Doctor of physical therapy preparation requires extensive time in precepted clinical education, which involves multiple stakeholders. Student outcomes in clinical education are impacted by many factors, and, in the case of failure, it can be challenging to determine which factors played a primary role in the poor result. Using existing root-cause analysis processes, the authors developed and implemented a framework designed to identify the causes of student failure in clinical education. This framework, when applied to a specific student failure event, can be used to identify the factors that contributed to the situation and to reveal opportunities for improvement in both the clinical and academic environments. A root-cause analysis framework can help to drive change at the programmatic level, and future studies should focus on the framework’s application to a variety of clinical and didactic settings.

Citations to this article as recorded by
  • Risk Governance in Clinical Education for Healthcare Students: A Scoping Review
    Raelynn R. Tong, Je Min Suh, Lisa Cheshire, Lucio Naccarella
    The Clinical Teacher.2026;[Epub]     CrossRef
  • Underperformance and failure in allied health practice placements: a scoping review of student performance experiences
    Amanda Wray, Lucy K. Lewis, Alison Yaxley, Stacie Attrill
    Teaching in Higher Education.2025; 30(8): 1930.     CrossRef
  • Applying the 2022 Cardiovascular and Pulmonary Entry-Level Physical Therapist Competencies to Physical Therapist Education and Practice
    Nancy Smith, Angela Campbell, Morgan Johanson, Pamela Bartlo, Naomi Bauer, Sagan Everett
    Journal of Physical Therapy Education.2023; 37(3): 165.     CrossRef
  • Cardiovascular and Pulmonary Entry-Level Physical Therapist Competencies: Update by Academy of Cardiovascular & Pulmonary Physical Therapy Task Force
    Morgan Johanson, Pamela Bartlo, Naomi Bauer, Angela Campbell, Sagan Everett, Nancy Smith
    Cardiopulmonary Physical Therapy Journal.2023; 34(4): 183.     CrossRef
  • The situational analysis of teaching-learning in clinical education in Iran: a postmodern grounded theory study
    Soleiman Ahmady, Hamed Khani
    BMC Medical Education.2022;[Epub]     CrossRef
Research articles
Medical students’ thought process while solving problems in 3 different types of clinical assessments in Korea: clinical performance examination, multimedia case-based assessment, and modified essay question  
Sejin Kim, Ikseon Choi, Bo Young Yoon, Min Jeong Kwon, Seok-jin Choi, Sang Hyun Kim, Jong-Tae Lee, Byoung Doo Rhee
J Educ Eval Health Prof. 2019;16:10.   Published online May 9, 2019
DOI: https://doi.org/10.3352/jeehp.2019.16.10
  • 19,034 View
  • 295 Download
  • 2 Web of Science
  • 5 Crossref
Abstract
Purpose
This study aimed to explore students’ cognitive patterns while solving clinical problems in 3 different types of assessments—clinical performance examination (CPX), multimedia case-based assessment (CBA), and modified essay question (MEQ)—and thereby to understand how different types of assessments stimulate different patterns of thinking.
Methods
A total of 6 test-performance cases from 2 fourth-year medical students were used in this cross-case study. Data were collected through one-on-one interviews using a stimulated recall protocol, in which students were shown videos of themselves taking each assessment and asked to elaborate on what they were thinking. The units of analysis were the smallest phrases or sentences in the participants’ narratives that represented meaningful cognitive occurrences. The narrative data were reorganized chronologically and then analyzed according to the hypothetico-deductive reasoning framework for clinical reasoning.
Results
Both participants demonstrated similar proportional frequencies of clinical reasoning patterns on the same clinical assessments. The results also revealed that the 3 different assessment types may stimulate different patterns of clinical reasoning. For example, the CPX strongly promoted the participants’ reasoning related to inquiry strategy, while the MEQ strongly promoted hypothesis generation. Similarly, data analysis and synthesis by the participants were more strongly stimulated by the CBA than by the other assessment types.
Conclusion
This study found that different assessment designs stimulated different patterns of thinking during problem-solving. This finding can contribute to the search for ways to improve current clinical assessments. Importantly, the research method used in this study can be utilized as an alternative way to examine the validity of clinical assessments.

Citations to this article as recorded by
  • Designing a Faculty Development Program on Electronic Question Banks: Schedule, Evaluation, and Sustainability Plan
    Saurabh RamBihariLal Shrivastava, Rachmadya Nur Hidayah
    Journal of the Scientific Society.2025; 52(2): 181.     CrossRef
  • Evaluation of modified essay questions (MEQs) as an assessment tool in third-year medical students’ modular summative assessment
    Ayman Elsamanoudy, Mohamed Shehata, Amer Almarabheh, Zienab Alrefaie
    BMC Medical Education.2024;[Epub]     CrossRef
  • Future directions of online learning environment design at medical schools: a transition towards a post-pandemic context
    Sejin Kim
    Kosin Medical Journal.2023; 38(1): 12.     CrossRef
  • Clinical Reasoning Training based on the analysis of clinical case using a virtual environment
    Sandra Elena Lisperguer Soto, María Soledad Calvo, Gabriela Paz Urrejola Contreras, Miguel Ángel Pérez Lizama
    Educación Médica.2021; 22(3): 139.     CrossRef
  • Newly appointed medical faculty members’ self-evaluation of their educational roles at the Catholic University of Korea College of Medicine in 2020 and 2021: a cross-sectional survey-based study
    Sun Kim, A Ra Cho, Chul Woon Chung
    Journal of Educational Evaluation for Health Professions.2021; 18: 28.     CrossRef
Comparison of the level of cognitive processing between case-based items and non-case-based items on the Interuniversity Progress Test of Medicine in the Netherlands  
Dario Cecilio-Fernandes, Wouter Kerdijk, Andreas Johannes Bremers, Wytze Aalders, René Anton Tio
J Educ Eval Health Prof. 2018;15:28.   Published online December 12, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.28
  • 22,008 View
  • 228 Download
  • 14 Web of Science
  • 13 Crossref
Abstract
Purpose
It is assumed that case-based questions require higher-order cognitive processing, whereas questions that are not case-based require lower-order cognitive processing. In this study, we investigated to what extent case-based and non-case-based questions followed this assumption based on Bloom’s taxonomy.
Methods
In this study, 4,800 questions from the Interuniversity Progress Test of Medicine were classified based on whether they were case-based and on the level of Bloom’s taxonomy that they involved. Lower-order questions require students to remember and/or have a basic understanding of knowledge. Higher-order questions require students to apply, analyze, and/or evaluate. The phi coefficient was calculated to investigate the relationship between whether questions were case-based and the required level of cognitive processing.
Results
Our results demonstrated that 98.1% of case-based questions required higher-level cognitive processing. Of the non-case-based questions, 33.7% required higher-level cognitive processing. The phi coefficient demonstrated a significant but moderate correlation between the presence of a patient case in a question and its required level of cognitive processing (phi coefficient = 0.55, P<0.001).
Conclusion
Medical instructors should be aware of the association between item format (case-based versus non-case-based) and the cognitive processes they elicit in order to meet the desired balance in a test, taking the learning objectives and the test difficulty into account.
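The phi coefficient reported above is computed from the 2×2 table crossing item format (case-based or not) with required cognitive level (higher- or lower-order). A minimal sketch; since the abstract gives percentages but not the case-based/non-case-based split of the 4,800 items, the cell counts below assume an even 2,400/2,400 split purely for illustration, so the result will not reproduce the reported 0.55 exactly:

```python
import math

def phi_coefficient(a: int, b: int, c: int, d: int) -> float:
    """Phi coefficient for a 2x2 contingency table laid out as:
                      higher-order  lower-order
        case-based         a             b
        non-case-based     c             d
    """
    num = a * d - b * c
    den = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return num / den

# Illustrative counts only (assumed 2,400/2,400 split of the 4,800 items):
a, b = 2354, 46    # ~98.1% of case-based items classified higher-order
c, d = 809, 1591   # ~33.7% of non-case-based items classified higher-order
print(round(phi_coefficient(a, b, c, d), 2))
```

With the study’s actual per-cell counts, the same formula would yield the reported value of 0.55.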

Citations to this article as recorded by
  • To the Point: Substituting SOAP Notes for Vignettes in Preclinical Assessment Question Stems
    Kristina Lindquist, Derek Meeks, Kyle Mefferd, Cheryl Vanier, Terrence W. Miller
    Medical Science Educator.2025; 35(3): 1423.     CrossRef
  • Factors Associated with Lower Performance of Artificial Intelligence on Answering Undergraduate Medical Education Multiple-Choice Questions
    Renato Ferretti, Joyce Santana Rizzi, Lorraine Silva Requena, Angelica Maria Bicudo, Pedro Tadao Hamamoto Filho
    Medical Science Educator.2025; 35(4): 2145.     CrossRef
  • Measuring cognitive levels in high-stakes testing: A CDM analysis of a university entrance examination using Bloom’s Taxonomy
    Hamdollah Ravand, Reza Shahi, Farshad Effatpanah, Ali Moghadamzadeh
    Frontiers in Education.2025;[Epub]     CrossRef
  • Progress is impossible without change: implementing automatic item generation in medical knowledge progress testing
    Filipe Manuel Vidal Falcão, Daniela S.M. Pereira, José Miguel Pêgo, Patrício Costa
    Education and Information Technologies.2024; 29(4): 4505.     CrossRef
  • Reliability across content areas in progress tests assessing medical knowledge: a Brazilian cross-sectional study with implications for medical education assessments
    Pedro Tadao Hamamoto Filho, Miriam Hashimoto, Alba Regina de Abreu Lima, Leandro Arthur Diehl, Neide Tomimura Costa, Patrícia Moretti Rehder, Samira Yarak, Maria Cristina de Andrade, Maria de Lourdes Marmorato Botta Hafner, Zilda Maria Tosta Ribeiro, Júli
    Sao Paulo Medical Journal.2024;[Epub]     CrossRef
  • Development of qualified items for nursing education assessment: The progress testing experience
    Bruna Moreno Dias, Lúcia Marta Giunta da Silva, Pedro Tadao Hamamoto Filho, Valdes Roberto Bollela, Carmen Silvia Gabriel
    Nurse Education in Practice.2024; 81: 104199.     CrossRef
  • Identifying the response process validity of clinical vignette-type multiple choice questions: An eye-tracking study
    Francisco Carlos Specian Junior, Thiago Martins Santos, John Sandars, Eliana Martorano Amaral, Dario Cecilio-Fernandes
    Medical Teacher.2023; 45(8): 845.     CrossRef
  • Relationship between medical programme progress test performance and surgical clinical attachment timing and performance
    Andy Wearn, Vanshay Bindra, Bradley Patten, Benjamin P. T. Loveday
    Medical Teacher.2023; 45(8): 877.     CrossRef
  • Analysis of Orthopaedic In-Training Examination Trauma Questions: 2017 to 2021
    Lilah Fones, Daryl C. Osbahr, Daniel E. Davis, Andrew M. Star, Atif K. Ahmed, Arjun Saxena
    JAAOS: Global Research and Reviews.2023;[Epub]     CrossRef
  • Use of Sociodemographic Information in Clinical Vignettes of Multiple-Choice Questions for Preclinical Medical Students
    Kelly Carey-Ewend, Amir Feinberg, Alexis Flen, Clark Williamson, Carmen Gutierrez, Samuel Cykert, Gary L. Beck Dallaghan, Kurt O. Gilliland
    Medical Science Educator.2023; 33(3): 659.     CrossRef
  • What faculty write versus what students see? Perspectives on multiple-choice questions using Bloom’s taxonomy
    Seetha U. Monrad, Nikki L. Bibler Zaidi, Karri L. Grob, Joshua B. Kurtz, Andrew W. Tai, Michael Hortsch, Larry D. Gruppen, Sally A. Santen
    Medical Teacher.2021; 43(5): 575.     CrossRef
  • Aménagement du concours de première année commune aux études de santé (PACES) : entre justice sociale et éthique confraternelle en devenir ?
    R. Pougnet, L. Pougnet
    Éthique & Santé.2020; 17(4): 250.     CrossRef
  • Knowledge of dental faculty in gulf cooperation council states of multiple-choice questions’ item writing flaws
    Mawlood Kowash, Hazza Alhobeira, Iyad Hussein, Manal Al Halabi, Saif Khan
    Medical Education Online.2020;[Epub]     CrossRef
Agreement between 2 raters’ evaluations of a traditional prosthodontic practical exam integrated with directly observed procedural skills in Egypt  
Ahmed Khalifa Khalifa, Salah Hegazy
J Educ Eval Health Prof. 2018;15:23.   Published online September 27, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.23
  • 26,322 View
  • 209 Download
  • 3 Web of Science
  • 3 Crossref
Abstract
Purpose
This study aimed to assess the agreement between 2 raters in evaluations of students on a prosthodontic clinical practical exam integrated with directly observed procedural skills (DOPS).
Methods
A sample of 76 students was monitored by 2 raters to evaluate the process and the final registered maxillomandibular relation for a completely edentulous patient at Mansoura Dental School, Egypt on a practical exam of bachelor’s students from May 15 to June 28, 2017. Each registered relation was evaluated out of a total of 60 marks subdivided into 3 score categories: occlusal plane orientation (OPO), vertical dimension registration (VDR), and centric relation registration (CRR). The marks for each category included an assessment of DOPS. The OPO and VDR marks for both raters were compared graphically to measure reliability through Bland-Altman analysis. The reliability of the CRR marks was evaluated with the Krippendorff alpha coefficient.
Results
The results revealed highly similar marks between raters for OPO (mean = 18.1 for both raters), with close limits of agreement (0.73 and −0.78). For VDR, the mean marks were close (mean = 17.4 and 17.1 for examiners 1 and 2, respectively), with close limits of agreement (2.7 and −2.2). There was strong agreement between the raters in the evaluation of CRR (Krippendorff alpha, 0.92; 95% confidence interval, 0.79–0.99).
Conclusion
The 2 raters’ evaluations of a traditional clinical practical exam integrated with DOPS showed no significant differences between candidates’ marks at the end of a clinical prosthodontic course. The limits of agreement between raters could be optimized by excluding subjective evaluation parameters and complicated cases from the examination procedure.
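The Bland-Altman limits of agreement quoted above are the mean inter-rater difference (the bias) plus and minus 1.96 standard deviations of the paired differences. A minimal sketch with hypothetical marks, not the study’s data:

```python
import numpy as np

def limits_of_agreement(r1, r2):
    """Bland-Altman bias and 95% limits of agreement between 2 raters."""
    d = np.asarray(r1, dtype=float) - np.asarray(r2, dtype=float)
    bias = d.mean()
    sd = d.std(ddof=1)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical OPO-style marks (out of 20) from 2 raters for 8 students
rater1 = np.array([18, 17, 19, 16, 18, 17, 18, 19])
rater2 = np.array([18, 16, 19, 17, 18, 18, 17, 19])
bias, lower, upper = limits_of_agreement(rater1, rater2)
print(round(bias, 2), round(lower, 2), round(upper, 2))
```

Narrow limits around a bias near zero, as in the OPO and VDR results above, are what justify treating the 2 raters as interchangeable.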

Citations

Citations to this article as recorded by  
  • Application of directly observed procedural skills in hospital infection training: a randomized controlled trial
    Zhumin Hu, Weipeng Zhang, Minyan Huang, Xiaoyan Liu
    Frontiers in Medicine.2025;[Epub]     CrossRef
  • In‐person and virtual assessment of oral radiology skills and competences by the Objective Structured Clinical Examination
    Fernanda R. Porto, Mateus A. Ribeiro, Luciano A. Ferreira, Rodrigo G. Oliveira, Karina L. Devito
    Journal of Dental Education.2023; 87(4): 505.     CrossRef
  • Evaluation agreement between peer assessors, supervisors, and parents in assessing communication and interpersonal skills of students of pediatric dentistry
    Jin Asari, Maiko Fujita-Ohtani, Kuniomi Nakamura, Tomomi Nakamura, Yoshinori Inoue, Shigenari Kimoto
    Pediatric Dental Journal.2023; 33(2): 133.     CrossRef
Learning through multiple lenses: analysis of self, peer, near-peer, and faculty assessments of a clinical history-taking task in Australia  
Kylie Fitzgerald, Brett Vaughan
J Educ Eval Health Prof. 2018;15:22.   Published online September 18, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.22
  • 26,207 View
  • 306 Download
  • 5 Web of Science
  • 7 Crossref
AbstractAbstract PDFSupplementary Material
Purpose
Peer assessment provides a framework for developing expected skills and receiving feedback appropriate to the learner’s level. Near-peer (NP) assessment may elevate expectations and motivate learning. Feedback from peers and NPs may be a sustainable way to enhance student assessment feedback. This study analysed relationships among self, peer, NP, and faculty marking of an assessment and students’ attitudes towards marking by those various groups.
Methods
A cross-sectional study design was used. Year 2 osteopathy students (n=86) were invited to perform self and peer assessments of a clinical history-taking and communication skills assessment. NPs and faculty also marked the assessment. Year 2 students also completed a questionnaire on their attitudes to peer/NP marking. Descriptive statistics and the Spearman rho coefficient were used to evaluate relationships across marker groups.
Results
Year 2 students (n=9), NPs (n=3), and faculty (n=5) were recruited. Correlations between self and peer marks (r=0.38) and between self and faculty marks (r=0.43) were moderate. A weak correlation was observed between self and NP marks (r=0.25). Perceptions of peer and NP marking varied, with over half of the cohort suggesting that peer or NP assessments should not contribute to their grade.
Conclusion
Framing peer and NP assessment as another feedback source may offer a sustainable method for enhancing feedback without overloading faculty resources. Multiple sources of feedback may assist in developing assessment literacy and calibrating students’ self-assessment capability. The small number of students recruited suggests some acceptability of peer and NP assessment; however, further work is required to increase its acceptability.
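Spearman's rho, used above to compare marker groups, is simply the Pearson correlation of the rank-transformed marks. A minimal sketch with hypothetical marks (not the study's data), assuming no tied values for simplicity:

```python
import numpy as np

def ranks(x):
    # Convert values to ranks (1 = smallest); ties are not handled here
    r = np.empty(len(x))
    r[np.argsort(x)] = np.arange(1, len(x) + 1)
    return r

def spearman_rho(a, b):
    # Pearson correlation of the two rank vectors
    return np.corrcoef(ranks(a), ranks(b))[0, 1]

# Hypothetical marks for the same task from two marker groups (not study data)
self_marks = np.array([62.0, 70, 55, 81, 74, 66, 59])
faculty_marks = np.array([65.0, 68, 58, 85, 70, 72, 60])
rho = spearman_rho(self_marks, faculty_marks)
```

With tied marks, a midrank correction is needed; `scipy.stats.spearmanr` handles ties and also returns a p-value.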

Citations

Citations to this article as recorded by  
  • Learning how to suture: Should learners observe a demonstration of someone who is experienced or inexperienced to improve their own performance?
    Portia Kalun, Ranil Sonnadara
    The American Journal of Surgery.2025; 243: 116276.     CrossRef
  • Transitioning to competency-based education in nursing: a scoping review of curriculum review and revision strategies
    Zakaria Ahmed Mani
    BMC Nursing.2025;[Epub]     CrossRef
  • The extent and quality of evidence for osteopathic education: A scoping review
    Andrew MacMillan, Patrick Gauthier, Luciane Alberto, Arabella Gaunt, Rachel Ives, Chris Williams, Dr Jerry Draper-Rodi
    International Journal of Osteopathic Medicine.2023; 49: 100663.     CrossRef
  • History and physical exam: a retrospective analysis of a clinical opportunity
    David McLinden, Krista Hailstone, Sue Featherston
    BMC Medical Education.2023;[Epub]     CrossRef
  • How Accurate Are Our Students? A Meta-analytic Systematic Review on Self-assessment Scoring Accuracy
    Samuel P. León, Ernesto Panadero, Inmaculada García-Martínez
    Educational Psychology Review.2023;[Epub]     CrossRef
  • Evaluating the Academic Performance of Mustansiriyah Medical College Teaching Staff vs. Final-Year Students Failure Rates
    Wassan Nori, Wisam Akram , Saad Mubarak Rasheed, Nabeeha Najatee Akram, Taqi Mohammed Jwad Taher, Mustafa Ali Kassim Kassim, Alexandru Cosmin Pantazi
    Al-Rafidain Journal of Medical Sciences ( ISSN 2789-3219 ).2023; 5(1S): S151.     CrossRef
  • History-taking level and its influencing factors among nursing undergraduates based on the virtual standardized patient testing results: Cross sectional study
    Jingrong Du, Xiaowen Zhu, Juan Wang, Jing Zheng, Xiaomin Zhang, Ziwen Wang, Kun Li
    Nurse Education Today.2022; 111: 105312.     CrossRef
Brief report
The implementation and evaluation of an e-Learning training module for objective structured clinical examination raters in Canada  
Karima Khamisa, Samantha Halman, Isabelle Desjardins, Mireille St. Jean, Debra Pugh
J Educ Eval Health Prof. 2018;15:18.   Published online August 6, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.18
  • 25,234 View
  • 268 Download
  • 5 Web of Science
  • 6 Crossref
AbstractAbstract PDFSupplementary Material
Improving the reliability and consistency of objective structured clinical examination (OSCE) raters’ marking poses a continual challenge in medical education. The purpose of this study was to evaluate an e-Learning training module for OSCE raters who participated in the assessment of third-year medical students at the University of Ottawa, Canada. The effects of online training and those of traditional in-person (face-to-face) orientation were compared. Of the 90 physicians recruited as raters for this OSCE, 60 consented to participate (67.7%) in the study in March 2017. Of the 60 participants, 55 rated students during the OSCE, while the remaining 5 were back-up raters. The number of raters in the online training group was 41, while that in the traditional in-person training group was 19. Of those with prior OSCE experience (n=18) who participated in the online group, 13 (68%) reported that they preferred this format to the in-person orientation. The total average time needed to complete the online module was 15 minutes. Furthermore, 89% of the participants felt the module provided clarity in the rater training process. There was no significant difference in the number of missing ratings based on the type of orientation that raters received. Our study indicates that online OSCE rater training is comparable to traditional face-to-face orientation.

Citations

Citations to this article as recorded by  
  • Leveraging feedback mechanisms to improve the quality of objective structured clinical examinations in Singapore: an exploratory action research study
    Han Ting Jillian Yeo, Dujeepa Dasharatha Samarasekera, Michael Dean
    Journal of Educational Evaluation for Health Professions.2025; 22: 28.     CrossRef
  • Raters and examinees training for objective structured clinical examination: comparing the effectiveness of three instructional methodologies
    Jefferson Garcia Guerrero, Ayidah Sanad Alqarni, Lorraine Turiano Estadilla, Lizy Sonia Benjamin, Vanitha Innocent Rani
    BMC Nursing.2024;[Epub]     CrossRef
  • Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review
    Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
    Journal of Educational Evaluation for Health Professions.2021; 18: 11.     CrossRef
  • Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia
    Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
    Journal of Educational Evaluation for Health Professions.2021; 18: 23.     CrossRef
  • No observed effect of a student-led mock objective structured clinical examination on subsequent performance scores in medical students in Canada
    Lorenzo Madrazo, Claire Bo Lee, Meghan McConnell, Karima Khamisa, Debra Pugh
    Journal of Educational Evaluation for Health Professions.2019; 16: 14.     CrossRef
  • Objective structured clinical examination as a measure of the practical training of future doctors [Article in Ukrainian]
    M. M. Korda, A. H. Shulhai, N. V. Pasyaka, N. V. Petrenko, N. V. Haliyash, N. A. Bilkevich
    Медична освіта.2019; (3): 19.     CrossRef
Research article
Attitudes to proposed assessment of pharmacy skills in Korean pharmacist licensure examination  
Joo Hee Kim, Ju-Yeun Lee, Young Sook Lee, Chul-Soon Yong, Nayoung Han, Hye Sun Gwak, Jungmi Oh, Byung Koo Lee, Sukhyang Lee
J Educ Eval Health Prof. 2017;14:6.   Published online March 27, 2017
DOI: https://doi.org/10.3352/jeehp.2017.14.6
  • 49,342 View
  • 385 Download
  • 3 Web of Science
  • 4 Crossref
AbstractAbstract PDF
Purpose
The survey aimed to obtain opinions about a proposed implementation of pharmacy skills assessment in the Korean pharmacist licensure examination (KPLE).
Methods
A 16-question survey was distributed electronically to 2,738 people including 570 pharmacy professors of 35 pharmacy schools, 550 preceptors from 865 practice sites and 1,618 students who graduated in 2015. The survey solicited responses concerning the adequacy of the current KPLE in assessing pharmacy knowledge/skills/attitudes, deficiencies of pharmacy skills testing in assessing the professional competencies necessary for pharmacists, plans for pharmacy skills tests in the current KPLE, and subject areas of pharmacy practice.
Results
A total of 466 surveys were returned. The current examination is not adequate for assessing skills and attitudes according to 42%–48% of respondents. Sixty percent felt that a skills test is necessary to assess qualifications and professional competencies. Almost two-thirds of participants stated that such testing should be implemented within 5 years. More than 60% agreed that candidates should be graduates and that written and skills test scores could be combined for pass-fail decisions. About 70% of respondents felt that the test should be less than 2 hours in duration. Over half of the respondents thought that the assessor should be a pharmacy faculty member with at least 5 years of clinical experience. Up to 70% stated that activities related to patient care were appropriate and practical for the scope of a skills test.
Conclusion
Pharmacy skills assessment was supported by the majority of respondents.

Citations

Citations to this article as recorded by  
  • Development of pharmaceutical care competency framework in Chinese hospital pharmacist: a qualitative study in grade a tertiary hospitals
    Wanqing Wang, Yong Wang, Xin Shi, Jianguo Zhu, Rong Chen
    BMC Medical Education.2025;[Epub]     CrossRef
  • A scoping review of the methods and processes used by regulatory bodies to determine pharmacists’ readiness for practice
    Eimear Ni Sheachnasaigh, Cathal Cadogan, Judith Strawbridge, Laura J. Sahm, Cristin Ryan
    Research in Social and Administrative Pharmacy.2022; 18(12): 4028.     CrossRef
  • Development of a Platform to Align Education and Practice: Bridging Academia and the Profession in Portugal
    Filipa Alves da Costa, Ana Paula Martins, Francisco Veiga, Isabel Ramalhinho, José Manuel Sousa Lobo, Luís Rodrigues, Luiza Granadeiro, Matilde Castro, Pedro Barata, Perpétua Gomes, Vítor Seabra, Maria Margarida Caramona
    Pharmacy.2020; 8(1): 11.     CrossRef
  • Selection of Tasks for Assessment of Pharmacy Clinical Performance in Korean Pharmacist Licensure Examination: Results of an Expert Survey
    Nayoung Han, Ju-Yeun Lee, Hye Sun Gwak, Byung Koo Lee, Young Sook Lee, Sukhyang Lee, Chul-Soon Yong, Joo Hee Kim, Jung Mi Oh
    Korean Journal of Clinical Pharmacy.2017; 27(3): 119.     CrossRef
Research Article
Profiling medical school learning environments in Malaysia: a validation study of the Johns Hopkins Learning Environment Scale  
Sean Tackett, Hamidah Abu Bakar, Nicole A. Shilkofski, Niamh Coady, Krishna Rampal, Scott Wright
J Educ Eval Health Prof. 2015;12:39.   Published online July 9, 2015
DOI: https://doi.org/10.3352/jeehp.2015.12.39
  • 33,606 View
  • 187 Download
  • 17 Web of Science
  • 12 Crossref
AbstractAbstract PDF
Purpose
While a strong learning environment is critical to medical student education, the assessment of medical school learning environments has confounded researchers. Our goal was to assess the validity and utility of the Johns Hopkins Learning Environment Scale (JHLES) for preclinical students at three Malaysian medical schools with distinct educational and institutional models. Two schools were new international partnerships, and the third was a school leaver program established without international partnership.
Methods
First- and second-year students responded anonymously to surveys at the end of the academic year. The surveys included the JHLES, a 28-item instrument using five-point Likert scale response options; the Dundee Ready Educational Environment Measure (DREEM), the most widely used method for assessing learning environments internationally; a personal growth scale; and single-item global learning environment assessment variables.
Results
The overall response rate was 369/429 (86%). After adjusting for the medical school year, gender, and ethnicity of the respondents, the JHLES detected differences across institutions in four of seven domains (57%), with each school having a unique domain profile. The DREEM detected differences in one of five categories (20%). The JHLES was more strongly correlated than the DREEM with two-thirds of the single-item variables and the personal growth scale. The JHLES showed high internal reliability for the total score (α=0.92) and the seven domains (α=0.56–0.85).
Conclusion
The JHLES detected variation between learning environment domains across three educational settings, thereby creating unique learning environment profiles. Interpretation of these profiles may allow schools to understand how they are currently supporting trainees and to identify areas needing attention.
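The internal reliability figures reported above (α=0.92 for the total score) are Cronbach's alpha values, computed from the per-item variances and the variance of the summed score. A minimal sketch with hypothetical Likert responses (not the JHLES data):

```python
import numpy as np

def cronbach_alpha(responses):
    # responses: respondents x items matrix of Likert-scale answers
    x = np.asarray(responses, dtype=float)
    k = x.shape[1]                           # number of items
    item_vars = x.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)    # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 items
data = [[4, 5, 4, 5],
        [3, 3, 4, 3],
        [5, 5, 5, 4],
        [2, 3, 2, 2],
        [4, 4, 5, 4],
        [3, 2, 3, 3]]
alpha = cronbach_alpha(data)
```

Alpha approaches 1 as the items covary more strongly relative to their individual variances, which is why it is reported per domain as well as for the total score.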

Citations

Citations to this article as recorded by  
  • Quality of the educational environment in Slovakia and the Czech Republic using the DREEM inventory
    Ľubomíra Lizáková, Daša Stupková, Lucie Libešová, Ivana Lamková, Valéria Horanská, Ľudmila Andráščíková, Alena Lochmannová
    Pielegniarstwo XXI wieku / Nursing in the 21st Century.2025; 24(1): 45.     CrossRef
  • The impact of grade point average on medical students’ perception of the learning environment: a multicenter cross-sectional study across 12 Chinese medical schools
    Yifan Liu, Donghao Lyu, Sujie Xie, Yuntao Yao, Jun Liu, Bingnan Lu, Wei Zhang, Shuyuan Xian, Jiale Yan, Meiqiong Gong, Xinru Wu, Yuanan Li, Haoyu Zhang, Jiajie Zhou, Yibin Zhou, Min Lin, Huabin Yin, Xiaonan Wang, Yue Wang, Wenfang Chen, Chongyou Zhang, Er
    BMC Medical Education.2025;[Epub]     CrossRef
  • A scoping review of the questionnaires used for the assessment of the perception of undergraduate students of the learning environment in healthcare professions education programs
    Banan Mukhalalati, Ola Yakti, Sara Elshami
    Advances in Health Sciences Education.2024; 29(4): 1501.     CrossRef
  • Validation of the Polish version of the Johns Hopkins Learning Environment Scale–a confirmatory factor analysis
    Dorota Wójcik, Leszek Szalewski, Adam Bęben, Iwona Ordyniec-Kwaśnica, Robert B. Shochet
    Scientific Reports.2024;[Epub]     CrossRef
  • Exploring medical students' experience of the learning environment: a mixed methods study in Saudi medical college
    Mohammed Almansour, Noura Abouammoh, Reem Bin Idris, Omar Abdullatif Alsuliman, Renad Abdulrahman Alhomaidi, Mohammed Hamad Alhumud, Hani A. Alghamdi
    BMC Medical Education.2024;[Epub]     CrossRef
  • A multicenter cross-sectional study in China revealing the intrinsic relationship between medical students’ grade and their perceptions of the learning environment
    Runzhi Huang, Weijin Qian, Sujie Xie, Mei Cheng, Meiqiong Gong, Shuyuan Xian, Minghao Jin, Mengyi Zhang, Jieling Tang, Bingnan Lu, Yiting Yang, Zhenglin Liu, Mingyu Qu, Haonan Ma, Xinru Wu, Huabin Yin, Xiaonan Wang, Xin Liu, Yue Wang, Wenfang Chen, Min Li
    BMC Medical Education.2024;[Epub]     CrossRef
  • Learning environment and its relationship with quality of life and burn-out among undergraduate medical students in Pakistan: a cross-sectional study
    Saadia Shahzad, Gohar Wajid
    BMJ Open.2024; 14(8): e080440.     CrossRef
  • Validation of the Polish version of the DREEM questionnaire – a confirmatory factor analysis
    Dorota Wójcik, Leszek Szalewski, Adam Bęben, Iwona Ordyniec-Kwaśnica, Sue Roff
    BMC Medical Education.2023;[Epub]     CrossRef
  • Association between patient care ownership and personal or environmental factors among medical trainees: a multicenter cross-sectional study
    Hirohisa Fujikawa, Daisuke Son, Takuya Aoki, Masato Eto
    BMC Medical Education.2022;[Epub]     CrossRef
  • Measuring Students’ Perceptions of the Medical School Learning Environment: Translation, Transcultural Adaptation, and Validation of 2 Instruments to the Brazilian Portuguese Language
    Rodolfo F Damiano, Aline O Furtado, Betina N da Silva, Oscarina da S Ezequiel, Alessandra LG Lucchetti, Lisabeth F DiLalla, Sean Tackett, Robert B Shochet, Giancarlo Lucchetti
    Journal of Medical Education and Curricular Development.2020;[Epub]     CrossRef
  • Developing an Introductory Radiology Clerkship at Perdana University Graduate School of Medicine in Kuala Lumpur, Malaysia
    Sarah Wallace Cater, Lakshmi Krishnan, Lars Grimm, Brian Garibaldi, Isabel Green
    Health Professions Education.2017; 3(2): 113.     CrossRef
  • Trainers' perception of the learning environment and student competency: A qualitative investigation of midwifery and anesthesia training programs in Ethiopia
    Sharon Kibwana, Rachel Haws, Adrienne Kols, Firew Ayalew, Young-Mi Kim, Jos van Roosmalen, Jelle Stekelenburg
    Nurse Education Today.2017; 55: 5.     CrossRef
Review Article
Proposal of a linear rather than hierarchical evaluation of educational initiatives: the 7Is framework  
Damian Roland
J Educ Eval Health Prof. 2015;12:35.   Published online June 24, 2015
DOI: https://doi.org/10.3352/jeehp.2015.12.35
  • 39,349 View
  • 198 Download
  • 15 Web of Science
  • 16 Crossref
AbstractAbstract PDF
Extensive resources are expended in attempts to change clinical practice; however, determining the effects of these interventions can be challenging. Traditionally, frameworks for examining the impact of educational interventions have been hierarchical in their approach. In this article, existing frameworks for examining medical education initiatives are reviewed and a novel ‘7Is framework’ is discussed. This framework contains seven linearly sequenced domains: interaction, interface, instruction, ideation, integration, implementation, and improvement. The 7Is framework enables the conceptualization of the various effects of an intervention, promoting the development of a set of valid and specific outcome measures and ultimately leading to more robust evaluation.

Citations

Citations to this article as recorded by  
  • Redefining the Role of Medical Affairs Professionals as Innovators and Leaders in Industry-Led Medical Education
    Sajita Setia, Elliot Loo, Salil Prakash Shinde, Manmohan Singh, Chew Hooi Wong, Karan Thakkar
    Pharmaceutical Medicine.2024; 38(3): 167.     CrossRef
  • From not knowing to doing: An interprofessional approach to clinician training in use of Goal Attainment Scaling (GAS) as a recovery‐oriented outcome measure in a rural mental health service
    Kate Furlanetto, Rajlaxmi Khopade, Vivek Phutane, Ravi Bhat, Paul Stolee
    International Journal of Mental Health Nursing.2024; 33(5): 1471.     CrossRef
  • Sharpening the lens to evaluate interprofessional education and interprofessional collaboration by improving the conceptual framework: a critical discussion
    Florian B. Neubauer, Felicitas L. Wagner, Andrea Lörwald, Sören Huwendiek
    BMC Medical Education.2024;[Epub]     CrossRef
  • Healthcare decision-makers’ perspectives on evaluating conflict management training in paediatric healthcare: a utilisation-focused qualitative study
    Juliette Phillipson, Sarah Barclay, Esse Menson, Oscar Lyons
    BMJ Paediatrics Open.2024; 8(1): e003047.     CrossRef
  • Integrating planetary health education into tertiary curricula: a practical toolbox for implementation
    Zerina Lokmic-Tomkins, Liza Barbour, Jessica LeClair, Jeneile Luebke, Sarah L. McGuinness, Vijay S. Limaye, Parvathy Pillai, Maxfield Flynn, Michael A. Kamp, Karin Leder, Jonathan A. Patz
    Frontiers in Medicine.2024;[Epub]     CrossRef
  • A Conceptual Framework for Continuing Medical Education and Population Health
    Abhimanyu Sud, Kate Hodgson, Gary Bloch, Ross Upshur
    Teaching and Learning in Medicine.2022; 34(5): 541.     CrossRef
  • Educational programs and teaching strategies for health professionals responding to women with complex perinatal mental health and psychosocial concerns: A scoping review
    Louise Everitt, Virginia Stulz, Rakime Elmir, Virginia Schmied
    Nurse Education in Practice.2022; 60: 103319.     CrossRef
  • A systematic scoping review of teaching and evaluating communications in the intensive care unit
    Elisha Wan Ying Chia, Huixin Huang, Sherill Goh, Marlyn Tracy Peries, Charlotte Cheuk Yiu Lee, Lorraine Hui En Tan, Michelle Shi Qing Khoo, Kuang Teck Tay, Yun Ting Ong, Wei Qiang Lim, Xiu Hui Tan, Yao Hao Tan, Cheryl Shumin Kow, Annelissa Mien Chew Chi
    The Asia Pacific Scholar.2021; 6(1): 3.     CrossRef
  • 198 A Novel ‘Virtual Simulation’ for The Advanced Life Support Group, Making a Dream a Reality: A Beginner’s Guide
    Carl Leith Van Heyningen, Kate Denning, Sibi Joseph
    International Journal of Healthcare Simulation.2021;[Epub]     CrossRef
  • Changes in academic performance in the online, integrated system-based curriculum implemented due to the COVID-19 pandemic in a medical school in Korea
    Do-Hwan Kim, Hyo Jeong Lee, Yanyan Lin, Ye Ji Kang
    Journal of Educational Evaluation for Health Professions.2021; 18: 24.     CrossRef
  • In situ simulation and its effects on patient outcomes: a systematic review
    Daniel Goldshtein, Cole Krensky, Sachin Doshi, Vsevolod S. Perelman
    BMJ Simulation and Technology Enhanced Learning.2020; 6(1): 3.     CrossRef
  • Critical analysis of evidence about the impacts on surgical teams of ‘mental practice’ in systematic reviews: a systematic rapid evidence assessment (SREA)
    Huon Snelgrove, Ben Gabbott
    BMC Medical Education.2020;[Epub]     CrossRef
  • Interprofessional communication (IPC) for medical students: a scoping review
    Chermaine Bok, Cheng Han Ng, Jeffery Wei Heng Koh, Zhi Hao Ong, Haziratul Zakirah Binte Ghazali, Lorraine Hui En Tan, Yun Ting Ong, Clarissa Wei Shuen Cheong, Annelissa Mien Chew Chin, Stephen Mason, Lalit Kumar Radha Krishna
    BMC Medical Education.2020;[Epub]     CrossRef
  • Have we forgotten to teach how to think?
    Damian Roland
    Emergency Medicine Journal.2017; 34(2): 68.     CrossRef
  • Implementation of effective practices in health facilities: a systematic review of cluster randomised trials
    Emma R Allanson, Özge Tunçalp, Joshua P Vogel, Dina N Khan, Olufemi T Oladapo, Qian Long, Ahmet Metin Gülmezoglu
    BMJ Global Health.2017; 2(2): e000266.     CrossRef
  • A qualitative study of self-evaluation of junior doctor performance: is perceived ‘safeness’ a more useful metric than confidence and competence?
    Damian Roland, David Matheson, Timothy Coats, Graham Martin
    BMJ Open.2015; 5(11): e008521.     CrossRef
Research Articles
Student feedback about the integrated curriculum in a Caribbean medical school  
P. Ravi Shankar, Ramanan Balasubramanium, Neelam R. Dwivedi, Vivek Nuguri
J Educ Eval Health Prof. 2014;11:23.   Published online September 30, 2014
DOI: https://doi.org/10.3352/jeehp.2014.11.23
  • 39,636 View
  • 218 Download
  • 10 Web of Science
  • 7 Crossref
AbstractAbstract PDF
Purpose
Xavier University School of Medicine adopted an integrated, organ system-based curriculum in January 2013. The present study was aimed at determining students’ perceptions of the integrated curriculum and related assessment methods.
Methods
The study was conducted on first- to fourth-semester undergraduate medical students during March 2014. The students were informed of the study and subsequently invited to participate. Focus group discussions were conducted. The curriculum’s level of integration, different courses offered, teaching-learning methods employed, and the advantages and concerns relating to the curriculum were noted. The respondents also provided feedback about the assessment methods used. Deductive content analysis was used to analyze the data.
Results
Twenty-two of the 68 students (32.2%) participated in the study. The respondents expressed generally positive opinions. They felt that the curriculum prepared them well for licensing examinations and future practice. Problem-based learning sessions encouraged active learning and group work among students, thus improving their understanding of the course material. The respondents felt that certain subjects were allocated a larger proportion of time during the sessions, as well as more questions during the integrated assessment. They also expressed an appreciation for medical humanities, and felt that sessions on the appraisal of literature needed modification. Their opinions about assessment of behavior, attitudes, and professionalism varied.
Conclusion
Student opinion was positive, overall. Our findings would be of interest to other medical schools that have recently adopted an integrated curriculum or are in the process of doing so.

Citations

Citations to this article as recorded by  
  • Psychometric properties of a questionnaire assessing the extent of integration in a problem-based learning curriculum
    Marwan F. Abu-Hijleh, Salah Eldin Kassab, Soumaya Allouch, Raja Mahamade Ali, Noor Al-Wattary, Michail Nomikos, Abdel Halim Salem, Mohamed H. Shehata, Reginald P. Sequeira
    BMC Medical Education.2025;[Epub]     CrossRef
  • Adoption of Problem-Based Learning in Medical Schools in Non-Western Countries: A Systematic Review
    See Chai Carol Chan, Anjali Rajendra Gondhalekar, George Choa, Mohammed Ahmed Rashid
    Teaching and Learning in Medicine.2024; 36(2): 111.     CrossRef
  • Perceptions of the learning environment in ophthalmology residency training: A mixed method study
    Muhammad Irfan Kamaruddin, Andi Alfian Zainuddin, Berti Nelwan, Sri Asriyani, Firdaus Hamid, Tenri Esa, Irawan Yusuf
    The Asia Pacific Scholar.2024; 9(2): 39.     CrossRef
  • Promising score for teaching and learning environment: an experience of a fledgling medical college in Saudi Arabia
    Mohammed Almansour, Bader A. AlMehmadi, Nida Gulzar Zeb, Ghassan Matbouly, Waqas Sami, Al-Mamoon Badahdah
    BMC Medical Education.2023;[Epub]     CrossRef
  • Using generalizability analysis to estimate parameters for anatomy assessments: A multi‐institutional study
    Jessica N. Byram, Mark F. Seifert, William S. Brooks, Laura Fraser‐Cotlin, Laura E. Thorp, James M. Williams, Adam B. Wilson
    Anatomical Sciences Education.2017; 10(2): 109.     CrossRef
  • Recall of Theoretical Pharmacology Knowledge by 6th Year Medical Students and Interns of Three Medical Schools in Riyadh, Saudi Arabia
    A. A. Mustafa, H. A. Alassiry, A. Al-Turki, N. Alamri, N. A. Alhamdan, Abdalla Saeed
    Education Research International.2016; 2016: 1.     CrossRef
  • Designing and conducting a two day orientation program for first semester undergraduate medical students
    P. Ravi Shankar
    Journal of Educational Evaluation for Health Professions.2014; 11: 31.     CrossRef
Students’ perception of the learning environment at Xavier University School of Medicine, Aruba  
P. Ravi Shankar, Arun K Dubey, Ramanan Balasubramanium
J Educ Eval Health Prof. 2013;10:8.   Published online September 30, 2013
DOI: https://doi.org/10.3352/jeehp.2013.10.8
  • 50,551 View
  • 220 Download
  • 11 Crossref
PDF

Citations

Citations to this article as recorded by  
  • Study environment and teaching–learning activities: a cross sectional study in Vietnam
    Hoa H. Nguyen, Lam T. Huynh, Quan M. N. Huynh
    Learning Environments Research.2025; 28(3): 835.     CrossRef
  • Evaluation of integrated foundational medical curriculum from Wuhan University: a cross-sectional study based on questionnaires
    Xianlong Zhou, Bingyang Lv, Xingxing He, Dongxu Li, Xiaoyang Zhang, Wei Fan, Ping Wang, Jie Liu, Mingxia Yu
    Cogent Education.2024;[Epub]     CrossRef
  • Medical Students’ Perception of the Educational Environment at College of Medicine: A Prospective Study with a Review of Literature
    Syed Sameer Aga, Muhammad Anwar Khan, Mansour Al Qurashi, Bader Khawaji, Mubarak Al-Mansour, Syed Waqas Shah, Amir Abushouk, Hassan Abdullah Alabdali, Ahmed Sultan Alharbi, Mishal Essam Hawsawi, Osama Ali Alzharani, Ehsan Namaziandost
    Education Research International.2021; 2021: 1.     CrossRef
  • The Use of Clinical PBL in Primary Care in Undergraduate Medical Schools
    Gustavo Salata Romão, Reinaldo Bulgarelli Bestetti, Lucélio Bernardes Couto
    Revista Brasileira de Educação Médica.2020;[Epub]     CrossRef
  • Application of clinical PBL in primary care in medical courses [Article in Portuguese]
    Gustavo Salata Romão, Reinaldo Bulgarelli Bestetti, Lucélio Bernardes Couto
    Revista Brasileira de Educação Médica.2020;[Epub]     CrossRef
  • Understanding the Mentoring Environment Through Thematic Analysis of the Learning Environment in Medical Education: a Systematic Review
    Jia Min Hee, Hong Wei Yap, Zheng Xuan Ong, Simone Qian Min Quek, Ying Pin Toh, Stephen Mason, Lalit Kumar Radha Krishna
    Journal of General Internal Medicine.2019; 34(10): 2190.     CrossRef
  • ASSESSMENT OF STUDENTS’ PERCEPTION ABOUT EDUCATIONAL ENVIRONMENT OF A MEDICAL COLLEGE IN KERALA
    Paul Daniel, Celine Thalappillil Mathew
    Journal of Evidence Based Medicine and Healthcare.2017; 4(51): 3103.     CrossRef
  • Development of an instrument to measure medical students’ perceptions of the assessment environment: initial validation
    Joong Hiong Sim, Wen Ting Tong, Wei-Han Hong, Jamuna Vadivelu, Hamimah Hassan
    Medical Education Online.2015; 20(1): 28612.     CrossRef
  • Veterinary students’ perceptions of their learning environment as measured by the Dundee Ready Education Environment Measure
    Jacquelyn M Pelzer, Jennifer L Hodgson, Stephen R Werre
    BMC Research Notes.2014;[Epub]     CrossRef
  • Designing and conducting a two day orientation program for first semester undergraduate medical students
    P. Ravi Shankar
    Journal of Educational Evaluation for Health Professions.2014; 11: 31.     CrossRef
  • Students’ perception of the learning environment at Xavier University School of Medicine, Aruba: a follow-up study
    P. Ravi Shankar, Rishi Bharti, Ravi Ramireddy, Ramanan Balasubramanium, Vivek Nuguri
    Journal of Educational Evaluation for Health Professions.2014; 11: 9.     CrossRef
United States medical students’ knowledge of Alzheimer disease  
Brian J. Nagle, Paula M. Usita, Steven D. Edland
J Educ Eval Health Prof. 2013;10:4.   Published online May 27, 2013
DOI: https://doi.org/10.3352/jeehp.2013.10.4
  • 49,998 View
  • 238 Download
  • 29 Crossref
AbstractAbstract PDF
Purpose
A knowledge gap exists between general physicians and specialists in diagnosing and managing Alzheimer disease (AD). This gap is concerning due to the estimated rise in the prevalence of AD and its cost to the health care system. Medical school is a viable avenue to decrease the gap, educating future physicians before they specialize. The purpose of this study was to assess the knowledge level of students in their first and final years of medical school.
Methods
Fourteen participating United States medical schools used e-mail student rosters to distribute an online survey of a quantitative cross-sectional assessment of knowledge about AD; 343 students participated. Knowledge was measured using the 12-item University of Alabama at Birmingham AD Knowledge Test for Health Professionals. General linear models were used to examine the effect of demographic variables and previous experience with AD on knowledge scores.
Results
Only 2.5% of first-year and 68.0% of final-year students correctly scored ten or more items on the knowledge scale. Personal experience with AD predicted higher knowledge scores in final-year students (P=0.027).
Conclusion
Knowledge deficiencies were common in final-year medical students. Future studies to identify and evaluate the efficacy of AD education programs in medical schools are warranted. Identifying and disseminating effective programs may help close the knowledge gap.

Citations

Citations to this article as recorded by  
  • Adolescents’ knowledge regarding Alzheimer’s disease is declining: An international study
    Yulia Roitblat, Elijah Faridnia, Antony Morgan, Liliia Nehuliaieva, Kadri Mametov, Nigora Z. Nazarova, Michael Shterenshis
    Educational Gerontology.2025; 51(7): 774.     CrossRef
  • Assessing the knowledge of Alzheimer’s disease among interns from different healthcare professions in Jeddah, Saudi Arabia: a cross-sectional study
    Majed Alharbi, Ahmad J. Almalki, Bayan Alghamdi, Aljoharah Almalki, Nouf Jenaideb, Shahad Khinkar, Reem Alharbi, Mohammad S. Alzahrani
    Frontiers in Public Health.2025;[Epub]     CrossRef
  • Awareness and knowledge of dementia and its communication disorders amongst Brazilian speech and language therapists
    Bárbara Costa Beber, Emily Viega Alves, Natalie Pereira, Maria Isabel d’Ávila Freitas, Marcela Lima Silagi, Márcia Lorena Fagundes Chaves, Brian Lawlor
    International Journal of Language & Communication Disorders.2024; 59(6): 2229.     CrossRef
  • Dil ve Konuşma Terapisi Bölümü Öğrencilerinin Demansa Yönelik Bilgi, Tutum ve Eğitim Gerekliliklerinin Belirlenmesi
    Nazmiye Atila-çağlar, Elife Barmak
    Ergoterapi ve Rehabilitasyon Dergisi.2024;[Epub]     CrossRef
  • The knowledge and attitude of Nepalese nursing students towards dementia
    Ranjana Khatiwada, Siman Lyu, Haocheng Wang, Sushila Devi Bhandari, Yu Liu
    Heliyon.2023; 9(8): e19247.     CrossRef
  • Awareness and Perception Toward Alzheimer’s Disease Among Residents Living in the Jazan Province, Saudi Arabia: A Cross-Sectional Study
    Faisal Hakami, Mohammed Ali Madkhali, Eman Saleh, Raum Ayoub, Sarah Moafa, Akram Moafa, Bushra Alnami, Bushra Maashi, Saad Khubrani, Wafa Busayli, Abdulaziz Alhazmi
    Cureus.2023;[Epub]     CrossRef
  • The attitude and knowledge of medical students regarding dementia
    Josip Stojic, Maja Petrosanec, Milan Milosevic, Marina Boban
    Acta Neurologica Belgica.2022; 122(3): 625.     CrossRef
  • Assessment of dementia knowledge and its associated factors among final year medical undergraduates in selected universities across Malaysia
    Chee Mun Chan, Marjorie Jia Yi Ong, Adam Aiman Zakaria, Monikha Maria Visusasam, Mohd Fairuz Ali, Teh Rohaila Jamil, Azimatun Noor Aizuddin, Aznida Firzah Abdul Aziz
    BMC Geriatrics.2022;[Epub]     CrossRef
  • Providing dementia education with augmented reality: a health sciences and medicine feasibility pilot study
    Cindy Jones, Daniel Khalil, Karanjot Mander, Alexandra Yeoh, Christian Moro
    Research in Learning Technology.2022;[Epub]     CrossRef
  • Psychosocial care in dementia in European higher education: Evidence from the SiDECar (“Skills in DEmentia Care”) project
    G. Ottoboni, I. Chirico, P. Povolná, V. Dostálová, I. Holmerová, N. Janssen, F. Dassen, M. de Vugt, Ma.C. Sánchez-Gómez, F. García-Peñalvo, M.A. Franco-Martin, R. Chattat
    Nurse Education Today.2021; : 104977.     CrossRef
  • Knowledge regarding Alzheimer’s Disease among College Students of Kathmandu, Nepal
    Kushalata Baral, Maginsh Dahal, Shneha Pradhan
    International Journal of Alzheimer's Disease.2020; 2020: 1.     CrossRef
  • Evaluation of knowledge of Alzheimer disease among health university students in Riyadh, Saudi Arabia
    Mohamed N. Al Arifi
    Saudi Pharmaceutical Journal.2020; 28(8): 911.     CrossRef
  • Extent of and influences on knowledge of Alzheimer's disease among undergraduate medical students
    AbdulazizM Shadid, AbdulrahmanYousef Aldayel, Asem Shadid, AliM Alqaraishi, MahaM Gholah, FayA Almughiseeb, YaraAbdullah Alessa, HaimaF Alani, Salah Ud Din Khan, Saleh Algarni
    Journal of Family Medicine and Primary Care.2020; 9(7): 3707.     CrossRef
  • Dementia knowledge, attitudes and training needs of speech–language pathology students and practitioners: A countrywide study
    Nicola Saccasan, Charles Scerri
    International Journal of Language & Communication Disorders.2020; 55(6): 955.     CrossRef
  • Implementation and impact of unforgettable: an interactive art program for people with dementia and their caregivers
    Iris Hendriks, Franka J.M. Meiland, Debby L. Gerritsen, Rose-Marie Dröes
    International Psychogeriatrics.2019; 31(3): 351.     CrossRef
  • Journal of Educational Evaluation for Health Professions will be accepted for inclusion in Scopus
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2019; 16: 2.     CrossRef
  • A qualitative study of the impact of a dementia experiential learning project on pre-medical students: a friend for Rachel
    Jill S. Goldman, Amy E. Trommer
    BMC Medical Education.2019;[Epub]     CrossRef
  • Dementia Care Content in Prelicensure Nursing Curricula: A Pilot Mixed-Methods Study
    Modupe Adewuyi, Laura P. Kimble, Sharon L. Dormire, Tanya Sudia
    Journal of Nursing Education.2018; 57(2): 88.     CrossRef
  • Knowledge and attitudes towards dementia in adolescent students
    Mokhtar G. E. K. N. Isaac, Maria M. Isaac, Nicolas Farina, Naji Tabet
    Journal of Mental Health.2017; 26(5): 419.     CrossRef
  • Communication And Respect for people with Dementia: Student learning – A novel practical experience of undergraduate students interacting with people with dementia in care homes (innovative practice)
    Julia Helen Wood, Ledia Alushi, John A Hammond
    Dementia.2017; 16(2): 243.     CrossRef
  • Encountering aged care: a mixed methods investigation of medical students’ clinical placement experiences
    Michael J. Annear, Emma Lea, Amanda Lo, Laura Tierney, Andrew Robinson
    BMC Geriatrics.2016;[Epub]     CrossRef
  • An Academic Educational Program for Providing Eye Care to Older Individuals
    Hélène Kergoat, Marie-Jeanne Kergoat
    Creative Education.2016; 07(06): 807.     CrossRef
  • Communication and respect for people with dementia: student learning (CARDS) – the development and evaluation of a pilot of an education intervention for pre-qualifying healthcare students
    Julia Helen Wood, Ledia Alushi, John A. Hammond
    International Psychogeriatrics.2016; 28(4): 647.     CrossRef
  • Reconciling surveillance systems with limited resources: an evaluation of passive surveillance for rabies in an endemic setting
    Laura Craighead, William Gilbert, Dynatra Subasinghe, Barbara Häsler
    Preventive Veterinary Medicine.2015; 121(3-4): 206.     CrossRef
  • Uncontrolled Web-Based Administration of Surveys on Factual Health-Related Knowledge: A Randomized Study of Untimed Versus Timed Quizzing
    Alexander Domnich, Donatella Panatto, Alessio Signori, Nicola Luigi Bragazzi, Maria Luisa Cristina, Daniela Amicizia, Roberto Gasparini
    Journal of Medical Internet Research.2015; 17(4): e94.     CrossRef
  • Education Research: Changing medical student perceptions of dementia
    Hannah J. Roberts, James M. Noble
    Neurology.2015; 85(8): 739.     CrossRef
  • Evaluation of dementia education programs for pre-registration healthcare students—A review of the literature
    Ledia Alushi, John A. Hammond, Julia H. Wood
    Nurse Education Today.2015; 35(9): 992.     CrossRef
  • Knowledge about dementia in South Korean nursing students: a cross-sectional survey
    Jung Ha Shin, Hyun-Ju Seo, Kye Ha Kim, Kyoung-Hoon Kim, Youngjin Lee
    BMC Nursing.2015;[Epub]     CrossRef
  • The New Era of : What Should Be Prepared to Be a Top Journal in the Category of Gastroenterology and Hepatology
    Sun Huh
    Journal of Neurogastroenterology and Motility.2013; 19(4): 419.     CrossRef
Review Article
Assessment methods in surgical training in the United Kingdom
Evgenios Evgeniou, Loizou Peter, Maria Tsironi, Srinivasan Iyer
J Educ Eval Health Prof. 2013;10:2.   Published online February 5, 2013
DOI: https://doi.org/10.3352/jeehp.2013.10.2
  • 59,913 View
  • 225 Download
  • 12 Crossref
Abstract PDF
A career in surgery in the United Kingdom demands a commitment to a long journey of assessment. The assessment methods used must ensure that the appropriate candidates are selected into a programme of study or a job and must guarantee public safety by regulating the progression of surgical trainees and the certification of trained surgeons. This review attempts to analyse the psychometric properties of various assessment methods used in the selection of candidates to medical school, job selection, progression in training, and certification. Validity is an indicator of how well an assessment measures what it is designed to measure. Reliability informs us whether a test is consistent in its outcome by measuring the reproducibility and discriminating ability of the test. In the long journey of assessment in surgical training, the same assessment formats are frequently being used for selection into a programme of study, job selection, progression, and certification. Although similar assessment methods are being used for different purposes in surgical training, the psychometric properties of these assessment methods have not been examined separately for each purpose. Because of the significance of these assessments for trainees and patients, their reliability and validity should be examined thoroughly in every context where the assessment method is being used.

Citations

Citations to this article as recorded by  
  • Surgical video-based temporal action analysis algorithm and competency assessment in laparoscopic cholecystectomy: development and exploratory evaluation
    Hung-Hsuan Yen, Ming-Chih Ho, Meng-Han Yang, Yi-Hsiang Hsiao, Hsiang-Wei Huang, Jia-Yuan Huang, Chun-Chieh Huang, Jakey Blue
    Surgical Endoscopy.2026; 40(1): 391.     CrossRef
  • MCQs in-training examination scores of the surgical residency program in Thailand: the relationship between medical school vs public health-based training institutions
    Saritphat Orrapin, Ob-Uea Homchan, Narain Chotirosniramit, Tidarat Jirapongcharoenlap, Chagkrit Ditsatham
    BMC Medical Education.2024;[Epub]     CrossRef
  • Editorial: Otolaryngology training pathways in sub-Saharan Africa
    S.N. Okerosi, Evelyne Diom, Wakisa Mulwafu, Johannes J. Fagan
    Current Opinion in Otolaryngology & Head & Neck Surgery.2022; 30(3): 198.     CrossRef
  • Exploring the surgical residents’ experience of teaching and learning process in the operating room: A grounded theory study
    Leila Sadati, Shahram Yazdani, Peigham Heidarpoor
    Journal of Education and Health Promotion.2021;[Epub]     CrossRef
  • Influence of Trainer Role, Subspecialty and Hospital Status on Consultant Workplace-based Assessment Completion
    Ahmed Latif, Luke Hopkins, David Robinson, Christopher Brown, Tarig Abdelrahman, Richard Egan, Awen Iorwerth, John Pollitt, Wyn G. Lewis
    Journal of Surgical Education.2019; 76(4): 1068.     CrossRef
  • CORTRAK Superuser Competency Assessment and Training Recommendations
    Annette M. Bourgault, Laura Gonzalez, Lillian Aguirre, Joseph A. Ibrahim
    American Journal of Critical Care.2019; 28(1): 30.     CrossRef
  • The Role of Simulation in Microsurgical Training
    Evgenios Evgeniou, Harriet Walker, Sameer Gujral
    Journal of Surgical Education.2018; 75(1): 171.     CrossRef
  • A Shift on the Horizon: A Systematic Review of Assessment Tools for Plastic Surgery Trainees
    Victoria E. McKinnon, Portia Kalun, Mark H. McRae, Ranil R. Sonnadara, Christine Fahim
    Plastic & Reconstructive Surgery.2018; 142(2): 217e.     CrossRef
  • Agreement between 2 raters’ evaluations of a traditional prosthodontic practical exam integrated with directly observed procedural skills in Egypt
    Ahmed Khalifa Khalifa, Salah Hegazy
    Journal of Educational Evaluation for Health Professions.2018; 15: 23.     CrossRef
  • Training, assessment and accreditation in surgery
    Abdullatif Aydin, Rebecca Fisher, Muhammad Shamim Khan, Prokar Dasgupta, Kamran Ahmed
    Postgraduate Medical Journal.2017; 93(1102): 441.     CrossRef
  • Methodological shortcomings in the literature evaluating the role and applications of 3D training for surgical trainees
    Milosz Kostusiak, Michael Hart, Damiano Giuseppe Barone, Riikka Hofmann, Ramez Kirollos, Thomas Santarius, Rikin Trivedi
    Medical Teacher.2017; 39(11): 1168.     CrossRef
  • The current status and development of a skill examination for the Korean speciality certification examination
    Jung Jin Cho, Hye Mi Noh, Seung Ho Kim, Ho Kwon, Young Min Park, Byung Min Choi
    Journal of the Korean Medical Association.2014; 57(5): 444.     CrossRef
Research Articles
Assessment of structured physical examination skills training using a retro-pre-questionnaire  
Mal Piryani Rano, Shankar P. Ravi, Piryani Suneel, Thapa Trilok Pati, Karki Balmansingh, Khakurel Mahesh Prasad, Bhandary Shital
J Educ Eval Health Prof. 2013;10:13.   Published online December 4, 2012
DOI: https://doi.org/10.3352/jeehp.2013.10.13
  • 30,128 View
  • 171 Download
  • 4 Web of Science
  • 1 Crossref
PDF

Citations

Citations to this article as recorded by  
  • How Far Has the International Neurourology Journal Progressed Since Its Transformation Into an English Language Journal?
    Sun Huh
    International Neurourology Journal.2014; 18(1): 3.     CrossRef
Improved quality and quantity of written feedback is associated with a structured feedback proforma
Philip M. Newton, Melisa J. Wallace, Judy McKimm
J Educ Eval Health Prof. 2012;9:10.   Published online August 13, 2012
DOI: https://doi.org/10.3352/jeehp.2012.9.10
  • 57,141 View
  • 216 Download
  • 19 Crossref
Abstract PDF
Facilitating the provision of detailed, deep and useful feedback is an important design feature of any educational programme. Here we evaluate feedback provided to medical students completing short transferable skills projects. Feedback quantity and depth were evaluated before and after a simple intervention to change the structure of the feedback-provision form from a blank free-text form to a structured proforma that asked a pair of short questions for each of the six domains being assessed. Each pair of questions asked the marker "What was done well?" and "What changes would improve the assignment?" Changing the form was associated with a significant increase in both the quantity and the quality of feedback provided to students. We also observed that, for these double-marked projects, the marker designated as "Marker 1" consistently wrote more feedback than the marker designated "Marker 2".

Citations

Citations to this article as recorded by  
  • Animated process-transparency in student evaluation of teaching: effects on the quality and quantity of student feedback
    Marloes Nederhand, Bas Giesbers, Judith Auer, Ad Scheepers
    Assessment & Evaluation in Higher Education.2024; 49(3): 288.     CrossRef
  • Development and evaluation of two interventions to improve students’ reflection on feedback
    Richard Harris, Pam Blundell-Birtill, Madeleine Pownall
    Assessment & Evaluation in Higher Education.2023; 48(5): 672.     CrossRef
  • How an EPA-based curriculum supports professional identity formation
    Anne E. Bremer, Marjolein H. J. van de Pol, Roland F. J. M. Laan, Cornelia R. M. G. Fluit
    BMC Medical Education.2022;[Epub]     CrossRef
  • Narrative Assessments in Higher Education: A Scoping Review to Identify Evidence-Based Quality Indicators
    Molk Chakroun, Vincent R Dion, Kathleen Ouellet, Ann Graillon, Valérie Désilets, Marianne Xhignesse, Christina St-Onge
    Academic Medicine.2022; 97(11): 1699.     CrossRef
  • Teaching in Geriatrics: The Potential of a Structured Written Feedback for the Improvement of Lectures
    Theresa Pohlmann, Volker Paulmann, Sandra Steffens, Klaus Hager
    European Journal of Geriatrics and Gerontology.2022; 4(3): 123.     CrossRef
  • Enhancing written feedback: The use of a cover sheet influences feedback quality
    J.G. Arts, M. Jaspers, D. Joosten-ten Brinke, Sammy King Fai Hui
    Cogent Education.2021;[Epub]     CrossRef
  • Implementation of written structured feedback into a surgical OSCE
    J. Sterz, S. Linßen, M. C. Stefanescu, T. Schreckenbach, L. B. Seifert, M. Ruesseler
    BMC Medical Education.2021;[Epub]     CrossRef
  • Eliciting student feedback for course development: the application of a qualitative course evaluation tool among business research students
    Carly Steyn, Clint Davies, Adeel Sambo
    Assessment & Evaluation in Higher Education.2019; 44(1): 11.     CrossRef
  • Diş Hekimliği Eğitiminde Beceri ve Yeterliğin Değerlendirilmesi II: Değerlendirme Yöntemleri
    Kadriye Funda AKALTAN
    Selcuk Dental Journal.2019; 6(5): 72.     CrossRef
  • Effect of individual structured and qualified feedback on improving clinical performance of dental students in clinical courses‐randomised controlled study
    I. M. Schüler, R. Heinrich‐Weltzien, M. Eiselt
    European Journal of Dental Education.2018;[Epub]     CrossRef
  • The Effect of High-Frequency, Structured Expert Feedback on the Learning Curves of Basic Interventional Ultrasound Skills Applied to Regional Anesthesia
    Getúlio Rodrigues de Oliveira Filho, Francisco de Assis Caire Mettrau
    Anesthesia & Analgesia.2018; 126(3): 1028.     CrossRef
  • A case study on written comments as a form of feedback in teacher education: so much to gain
    Jorik Gerardus Arts, Mieke Jaspers, Desiree Joosten-ten Brinke
    European Journal of Teacher Education.2016; 39(2): 159.     CrossRef
  • Medical students’ satisfaction with the Applied Basic Clinical Seminar with Scenarios for Students, a novel simulation-based learning method in Greece
    Panteleimon Pantelidis, Nikolaos Staikoglou, Georgios Paparoidamis, Christos Drosos, Stefanos Karamaroudis, Athina Samara, Christodoulos Keskinis, Michail Sideris, George Giannakoulas, Georgios Tsoulfas, Asterios Karagiannis
    Journal of Educational Evaluation for Health Professions.2016; 13: 13.     CrossRef
  • The effect of written standardized feedback on the structure and quality of surgical lectures: A prospective cohort study
    Jasmina Sterz, Sebastian H. Höfer, Bernd Bender, Maren Janko, Farzin Adili, Miriam Ruesseler
    BMC Medical Education.2016;[Epub]     CrossRef
  • Group Peer Teaching: A Strategy for Building Confidence in Communication and Teamwork Skills in Physical Therapy Students
    Christopher Seenan, Sivaramkumar Shanmugam, Jennie Stewart
    Journal of Physical Therapy Education.2016; 30(3): 40.     CrossRef
  • Does Reflective Learning with Feedback Improve Dental Students’ Self‐Perceived Competence in Clinical Preparedness?
    Jung-Joon Ihm, Deog-Gyu Seo
    Journal of Dental Education.2016; 80(2): 173.     CrossRef
  • Encouraging formative assessments of leadership for foundation doctors
    Lindsay Hadley, David Black, Jan Welch, Peter Reynolds, Clare Penlington
    The Clinical Teacher.2015; 12(4): 231.     CrossRef
  • Use of the ‘Stop, Start, Continue’ method is associated with the production of constructive qualitative feedback by students in higher education
    Alice Hoon, Emily Oliver, Kasia Szpakowska, Philip Newton
    Assessment & Evaluation in Higher Education.2015; 40(5): 755.     CrossRef
  • The New Era of : What Should Be Prepared to Be a Top Journal in the Category of Gastroenterology and Hepatology
    Sun Huh
    Journal of Neurogastroenterology and Motility.2013; 19(4): 419.     CrossRef
Brief Report
An objective structured biostatistics examination: a pilot study based on computer-assisted evaluation for undergraduates
Abdul Sattar Khan, Hamit Acemoglu, Zekeriya Akturk
J Educ Eval Health Prof. 2012;9:9.   Published online July 17, 2012
DOI: https://doi.org/10.3352/jeehp.2012.9.9
  • 29,058 View
  • 168 Download
  • 1 Crossref
Abstract PDF
We designed and evaluated an objective structured biostatistics examination (OSBE) on a trial basis to determine whether it was feasible for formative or summative assessment. At Ataturk University, the curriculum for every cohort across all five years of undergraduate education follows a seminar system. Each seminar integrates several subjects; three to six seminars of six to eight weeks are held each year, and at the end of each seminar term an examination is conducted as a formative assessment. In 2010, 201 students took the OSBE, and in 2011, 211 students took the same examination at the end of a seminar that included biostatistics as one module. The examination was conducted in four groups, with two groups examined together. Each group completed 5 stations in each row; because the two parallel lines followed different instructions, 10 students could be examined simultaneously across the two lines. After the examination, students were invited to receive feedback from the examiners and to provide their reflections. There was a significant difference between male and female scores among the 2010 students (P=0.004), but no gender difference was found in 2011. Comparisons between the parallel lines and among the four groups showed that groups A and B did not differ significantly (P>0.05) in either class. Nonetheless, among the four groups, there was a significant difference in both 2010 (P=0.001) and 2011 (P=0.001). The inter-rater reliability coefficient was 0.60. Overall, the students were satisfied with the testing method, although they felt some stress. The overall experience of the OSBE was useful for learning as well as for assessment.

Citations

Citations to this article as recorded by  
  • THE COMPARISON OF DIFFERENT ASSESSMENT TECHNIQUES USED IN PHYSIOLOGY PRACTICAL ASSESSMENT
    Ksh. Lakshmikumari, Sarada N, Lalit Kumar L
    INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH.2022; : 7.     CrossRef
Research Article
Patient Simulation: A Literary Synthesis of Assessment Tools in Anesthesiology
Alice A. Edler, Ruth G. Fanning, Michael. I. Chen, Rebecca Claure, Dondee Almazan, Brain Struyk, Samuel C. Seiden
J Educ Eval Health Prof. 2009;6:3.   Published online December 20, 2009
DOI: https://doi.org/10.3352/jeehp.2009.6.3
  • 42,064 View
  • 180 Download
  • 14 Crossref
Abstract PDF
High-fidelity patient simulation (HFPS) has been hypothesized as a modality for assessing competency of knowledge and skill, but uniform methods for HFPS performance assessment (PA) have not yet been fully achieved. Anesthesiology founded the HFPS discipline and also leads in its PA. This project reviews the types, quality, and designated purpose of HFPS PA tools in anesthesiology. Using the systematic review method, we reviewed the anesthesiology literature referenced in PubMed to assess the quality and reliability of available PA tools in HFPS. Of 412 articles identified, 50 met our inclusion criteria. Seventy-seven percent of the studies had been published since 2000, and more recent studies demonstrated higher quality. Investigators reported a variety of test construction and validation methods. The most commonly reported test construction methods included "Modified Delphi Techniques" for item selection, reliability measurement using inter-rater agreement, and intra-class correlations between test items or subtests. Modern test theory, in particular generalizability theory, was used in nine (18%) of the studies. Test score validity has been addressed in multiple investigations, which have shown a significant improvement in reporting accuracy; however, the assessment of predictive validity has been low across the majority of studies. The usability and practicality of testing occasions and tools was only anecdotally reported. To comply more completely with the gold standards for PA design, both the shared experience of experts and recognition of test construction standards, including reliability and validity measurements, instrument piloting, rater training, and explicit identification of the purpose and proposed use of the assessment tool, are required.

Citations

Citations to this article as recorded by  
  • Simulation-based summative assessment in healthcare: an overview of key principles for practice
    Clément Buléon, Laurent Mattatia, Rebecca D. Minehart, Jenny W. Rudolph, Fernande J. Lois, Erwan Guillouet, Anne-Laure Philippon, Olivier Brissaud, Antoine Lefevre-Scelles, Dan Benhamou, François Lecomte, the SoFraSimS Assessment with simul group, Anne Be
    Advances in Simulation.2022;[Epub]     CrossRef
  • Computer-Based Case Simulations for Assessment in Health Care: A Literature Review of Validity Evidence
    Robyn C. Ward, Timothy J. Muckle, Michael J. Kremer, Mary Anne Krogh
    Evaluation & the Health Professions.2019; 42(1): 82.     CrossRef
  • Competence Assessment Instruments in Perianesthesia Nursing Care: A Scoping Review of the Literature
    Yunsuk Jeon, Riitta-Liisa Lakanmaa, Riitta Meretoja, Helena Leino-Kilpi
    Journal of PeriAnesthesia Nursing.2017; 32(6): 542.     CrossRef
  • Training and Competency in Sedation Practice in Gastrointestinal Endoscopy
    Ben Da, James Buxbaum
    Gastrointestinal Endoscopy Clinics of North America.2016; 26(3): 443.     CrossRef
  • Linking Simulation-Based Educational Assessments and Patient-Related Outcomes
    Ryan Brydges, Rose Hatala, Benjamin Zendejas, Patricia J. Erwin, David A. Cook
    Academic Medicine.2015; 90(2): 246.     CrossRef
  • Simulation With PARTS (Phase-Augmented Research and Training Scenarios)
    Carl J. Schick, Mona Weiss, Michaela Kolbe, Adrian Marty, Micha Dambach, Axel Knauth, Donat R. Spahn, Gudela Grote, Bastian Grande
    Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare.2015; 10(3): 178.     CrossRef
  • Endoscopy nurse-administered propofol sedation performance. Development of an assessment tool and a reliability testing model
    Jeppe Thue Jensen, Lars Konge, Ann Møller, Pernille Hornslet, Peter Vilmann
    Scandinavian Journal of Gastroenterology.2014; 49(8): 1014.     CrossRef
  • Exploration of Specialty Certification for Nurse Anesthetists: Nonsurgical Pain Management as a Test Case
    Steven Wooden, Sharron Docherty, Karen Plaus, Anthony Kusek, Charles Vacchiano
    Pain Management Nursing.2014; 15(4): 789.     CrossRef
  • What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment
    David A. Cook, Benjamin Zendejas, Stanley J. Hamstra, Rose Hatala, Ryan Brydges
    Advances in Health Sciences Education.2014; 19(2): 233.     CrossRef
  • Technology-Enhanced Simulation to Assess Health Professionals
    David A. Cook, Ryan Brydges, Benjamin Zendejas, Stanley J. Hamstra, Rose Hatala
    Academic Medicine.2013; 88(6): 872.     CrossRef
  • Review article: Assessment in anesthesiology education
    John R. Boulet, David Murray
    Canadian Journal of Anesthesia/Journal canadien d'anesthésie.2012; 59(2): 182.     CrossRef
  • Review article: Simulation in anesthesia: state of the science and looking forward
    Vicki R. LeBlanc
    Canadian Journal of Anesthesia/Journal canadien d'anesthésie.2012; 59(2): 193.     CrossRef
  • Simulation and Quality Improvement in Anesthesiology
    Christine S. Park
    Anesthesiology Clinics.2011; 29(1): 13.     CrossRef
  • Performance in assessment: Consensus statement and recommendations from the Ottawa conference
    Katharine Boursicot, Luci Etheridge, Zeryab Setna, Alison Sturrock, Jean Ker, Sydney Smee, Elango Sambandam
    Medical Teacher.2011; 33(5): 370.     CrossRef
