JEEHP : Journal of Educational Evaluation for Health Professions

Search results for "Checklist": 3 articles
Research articles
Correlation between task-based checklists and global rating scores in undergraduate objective structured clinical examinations in Saudi Arabia: a 1-year comparative study  
Uzma Khan, Yasir Naseem Khan
J Educ Eval Health Prof. 2025;22:19.   Published online June 19, 2025
DOI: https://doi.org/10.3352/jeehp.2025.22.19
  • 4,493 View
  • 255 Download
  • 1 Web of Science
  • 2 Crossref
Abstract
Purpose
This study investigated the correlation between task-based checklist scores and global rating scores (GRS) in objective structured clinical examinations (OSCEs) for fourth-year undergraduate medical students and aimed to determine whether both methods can be reliably used in a standard setting.
Methods
A comparative observational study was conducted at Al Rayan College of Medicine, Saudi Arabia, involving 93 fourth-year students during the 2023–2024 academic year. OSCEs from 2 General Practice courses were analyzed, each comprising 10 stations assessing clinical competencies. Students were scored using both task-specific checklists and holistic 5-point GRS. Reliability was evaluated using Cronbach’s α, and the relationship between the 2 scoring methods was assessed using the coefficient of determination (R2). Ethical approval and informed consent were obtained.
Results
The mean OSCE score was 76.7 in Course 1 (Cronbach’s α=0.85) and 73.0 in Course 2 (Cronbach’s α=0.81). R2 values varied by station and competency. Strong correlations were observed in procedural and management skills (R2 up to 0.87), while weaker correlations appeared in history-taking stations (R2 as low as 0.35). The variability across stations highlighted the context-dependence of alignment between checklist and GRS methods.
Conclusion
Both checklists and GRS exhibit reliable psychometric properties. Their combined use improves validity in OSCE scoring, but station-specific application is recommended. Checklists may anchor pass/fail decisions, while GRS may assist in assessing borderline performance. This hybrid model increases fairness and reflects clinical authenticity in competency-based assessment.
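The two statistics this study leans on, Cronbach's α for reliability and R² for checklist–GRS agreement, reduce to short formulas. The sketch below is not the authors' code; it is a minimal pure-Python illustration, using hypothetical score data, of how both quantities are computed from an examinees-by-stations score table.

```python
from math import sqrt
from statistics import variance

def cronbach_alpha(items):
    # items: one row per examinee, one column per station.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = len(items[0])
    item_vars = [variance(col) for col in zip(*items)]
    total_var = variance([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def r_squared(x, y):
    # Coefficient of determination as the squared Pearson correlation
    # between checklist scores (x) and global ratings (y).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return (cov / (sx * sy)) ** 2
```

For example, `cronbach_alpha([[1, 2], [2, 1], [3, 4]])` returns 0.75, and perfectly linearly related score vectors give an R² of 1.0; real OSCE data would of course sit between these extremes, as the station-level R² range of 0.35–0.87 reported above shows.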

Citations to this article as recorded by Crossref
  • The effectiveness of the “Early clinical exposure” course based on narrative medicine in cultivating the professional qualities of undergraduates in clinical medicine: a mixed-methods study
    Zhao Li, Huijuan Cai, Xiaolin Yang, Jingsong Lin, Wenhua Cao, Peng Zhang, Jing Ren, Dayong Zheng
    BMC Medical Education.2026;[Epub]     CrossRef
  • Agreement and reliability of global rating versus checklist scores in a high-stakes undergraduate OSCE in Rwanda
    Olayinka Rasheed Ibrahim, Natalie McCall, Abebe Bekele, Biniam Ewnte Zelelew, Oluwaseun Ojomo, Anteneh Gadisa Belachew, Equlinet Misganaw Amare, Zelalem Mengistu Gashaw, Birhanu Abera Ayana, Ariane Nina Ndayikeje
    BMC Medical Education.2026;[Epub]     CrossRef
Content validity test of a safety checklist for simulated participants in simulation-based education in the United Kingdom: a methodological study
Matthew Bradley
J Educ Eval Health Prof. 2022;19:21.   Published online August 25, 2022
DOI: https://doi.org/10.3352/jeehp.2022.19.21
  • 4,885 View
  • 202 Download
  • 2 Web of Science
  • 2 Crossref
Abstract
Purpose
Simulation training is an increasingly widespread mode of healthcare education and often involves simulated participants (SPs), commonly known as actors. Simulation-based education (SBE) can sometimes endanger SPs, so we created a safety checklist for them to follow. This study describes how we developed the checklist through a quality improvement project and then evaluated feedback responses to assess whether SPs felt safe when using it.
Methods
The checklist was provided to SPs working in an acute trust simulation service when delivering multidisciplinary SBE over 4 months. Using multiple plan–do–study–act cycles, the checklist was refined by reflecting on SP feedback to ensure that the standards of the safe simulation were met. We collected 21 responses from September to December 2021 after SPs completed an SBE event.
Results
The responses showed that 100% of SPs felt safe during SBE when using our checklist. The average “confidence in safety” rating before using the checklist was 6.8/10, which increased significantly to 9.2/10 after using the checklist (P<0.0005). The checklist was refined throughout the 4 months and implemented in adult and pediatric SBE as a standard operating procedure.
Conclusion
We recommend adopting our safety checklist as a standard operating procedure to improve the confidence and safety of SPs during simulation-based education.
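The pre/post confidence comparison above (6.8/10 rising to 9.2/10, P<0.0005) is a paired comparison. Assuming a paired t-test was the analysis used (the abstract does not name the test), the t statistic is the mean of the per-participant differences divided by its standard error. The sketch below uses hypothetical ratings, not the study's data.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    # Paired t statistic: mean per-participant difference divided by
    # its standard error (sample stdev of differences / sqrt(n)).
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom
```

Converting t to a P-value requires the t-distribution CDF (e.g., a statistical table or `scipy.stats.t.sf`), which is omitted here to keep the sketch dependency-free.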

Citations to this article as recorded by Crossref
  • The effect of structured checklist-assisted multimedia interactive education on postoperative pain management and quality of life in patients with lower extremity varicose veins: a randomized controlled trial with 1-year follow-up
    Jing Huang, Ling Li, Min Li, Li Ren, Yukui Ma, Huanrui Hu
    Frontiers in Public Health.2026;[Epub]     CrossRef
  • Addressing under-appreciated risk in healthcare simulation
    Paul O’Connor, Angela O’Dea, Dara Byrne
    Advances in Simulation.2026;[Epub]     CrossRef
Comparing the cut score for the borderline group method and borderline regression method with norm-referenced standard setting in an objective structured clinical examination in medical school in Korea  
Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
J Educ Eval Health Prof. 2021;18:25.   Published online September 27, 2021
DOI: https://doi.org/10.3352/jeehp.2021.18.25
  • 9,198 View
  • 339 Download
  • 3 Web of Science
  • 5 Crossref
Abstract
Purpose
Setting standards is critical in the health professions. However, appropriate standard-setting methods are not always applied when establishing cut scores in performance assessments. The aim of this study was to compare the cut scores obtained when the standard-setting method is changed from the norm-referenced method to the borderline group method (BGM) and the borderline regression method (BRM) in an objective structured clinical examination (OSCE) in medical school.
Methods
This was an explorative study to model the implementation of the BGM and BRM. A total of 107 fourth-year medical students attended the OSCE at 7 stations for encountering standardized patients (SPs) and at 1 station for performing skills on a manikin on July 15th, 2021. Thirty-two physician examiners evaluated the performance by completing a checklist and global rating scales.
Results
The cut score of the norm-referenced method was lower than that of the BGM (P<0.01) and BRM (P<0.02). There was no significant difference in the cut score between the BGM and BRM (P=0.40). The station with the highest standard deviation and the highest proportion of the borderline group showed the largest cut score difference in standard setting methods.
Conclusion
Cut scores prefixed by the norm-referenced method, set without considering station content or examinee performance, can vary with station difficulty and content, undermining the appropriateness of standard-setting decisions. If there is adequate consensus on the criteria for the borderline group, standard setting with the BRM could be applied as a practical and defensible method for determining the cut score of an OSCE.
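The borderline regression method compared above reduces to a simple least-squares fit: regress examinees' checklist scores on their global ratings at a station, then take the predicted score at the rating defined as "borderline" as the cut score. The sketch below is an illustration, not the authors' code; the borderline rating value (2 on a 5-point scale) and the score ranges are hypothetical.

```python
def brm_cut_score(global_ratings, checklist_scores, borderline=2):
    # Borderline regression method: fit checklist_score ~ global_rating
    # by ordinary least squares, then predict at the borderline rating.
    n = len(global_ratings)
    mx = sum(global_ratings) / n
    my = sum(checklist_scores) / n
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(global_ratings, checklist_scores))
    sxx = sum((x - mx) ** 2 for x in global_ratings)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept + slope * borderline
```

Because every examinee at the station contributes to the fit, the BRM does not depend on how many examinees fall exactly in the borderline group, which is one reason it is often preferred over the BGM when that group is small.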

Citations to this article as recorded by Crossref
  • Refining competency benchmarks: a scoping review of Angoff standard-setting in dental education
    Galvin Sim Siang Lin, Abdul Rauf Badrul Hisham, Muhammad Nazmi Abdul Majid, Chan Choong Foong, Ting Khee Ho, Lara T. Friedlander
    BMC Oral Health.2026;[Epub]     CrossRef
  • Standard setting methods in objective structured clinical examination (OSCE): A comparative study of five methods
    Reshma Ansari, Norhafizah Ab Manan, Nur Ain Mahat, Norfaizatul Shalida Omar, Atikah Abdul Latiff, Sara Idris, Azli Shahril Othman
    Journal of Medical Education Development.2024; 17(56): 87.     CrossRef
  • Analyzing the Quality of Objective Structured Clinical Examination in Alborz University of Medical Sciences
    Suleiman Ahmadi, Amin Habibi, Mitra Rahimzadeh, Shahla Bahrami
    Alborz University Medical Journal.2023; 12(4): 485.     CrossRef
  • Possibility of using the yes/no Angoff method as a substitute for the percent Angoff method for estimating the cutoff score of the Korean Medical Licensing Examination: a simulation study
    Janghee Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 23.     CrossRef
  • Newly appointed medical faculty members’ self-evaluation of their educational roles at the Catholic University of Korea College of Medicine in 2020 and 2021: a cross-sectional survey-based study
    Sun Kim, A Ra Cho, Chul Woon Chung
    Journal of Educational Evaluation for Health Professions.2021; 18: 28.     CrossRef
