
JEEHP : Journal of Educational Evaluation for Health Professions

8 "Objective Structured clinical examination"
Research articles
Is online objective structured clinical examination teaching an acceptable replacement in post-COVID-19 medical education in the United Kingdom?: a descriptive study  
Vashist Motkur, Aniket Bharadwaj, Nimalesh Yogarajah
J Educ Eval Health Prof. 2022;19:30.   Published online November 7, 2022
DOI: https://doi.org/10.3352/jeehp.2022.19.30
  • 2,349 Views
  • 147 Downloads
  • 2 Web of Science
  • 2 Crossref
Abstract
Purpose
Coronavirus disease 2019 (COVID-19) restrictions resulted in an increased emphasis on virtual communication in medical education. This study assessed the acceptability of virtual teaching in an online objective structured clinical examination (OSCE) series and its role in future education.
Methods
Six surgical OSCE stations were designed, covering common surgical topics, with specific tasks testing data interpretation, clinical knowledge, and communication skills. These were delivered via Zoom to students who participated in student/patient/examiner role-play. Feedback was collected by asking students to compare online teaching with previous experiences of in-person teaching. Descriptive statistics were used for Likert response data, and thematic analysis for free-text items.
Results
Sixty-two students provided feedback, with 81% of respondents finding online instructions preferable to paper equivalents. Furthermore, 65% and 68% found online teaching more efficient and more accessible, respectively, than in-person teaching. Only 34% found communication with each other easier online, and 40% preferred online OSCE teaching to in-person teaching. Students also provided both positive and negative free-text comments.
Conclusion
The data suggested that students were generally unwilling to have online teaching completely replace in-person teaching. The success of online teaching depended on the clinical skill being addressed, as some skills were less amenable to a virtual setting. However, online OSCE teaching could play a role alongside in-person teaching.

Citations

Citations to this article as recorded by  
  • Feasibility and reliability of the pandemic-adapted online-onsite hybrid graduation OSCE in Japan
    Satoshi Hara, Kunio Ohta, Daisuke Aono, Toshikatsu Tamai, Makoto Kurachi, Kimikazu Sugimori, Hiroshi Mihara, Hiroshi Ichimura, Yasuhiko Yamamoto, Hideki Nomura
    Advances in Health Sciences Education.2024; 29(3): 949.     CrossRef
  • Should Virtual Objective Structured Clinical Examination (OSCE) Teaching Replace or Complement Face-to-Face Teaching in the Post-COVID-19 Educational Environment: An Evaluation of an Innovative National COVID-19 Teaching Programme
    Charles Gamble, Alice Oatham, Raj Parikh
    Cureus.2023;[Epub]     CrossRef
Comparing the cut score for the borderline group method and borderline regression method with norm-referenced standard setting in an objective structured clinical examination in medical school in Korea  
Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
J Educ Eval Health Prof. 2021;18:25.   Published online September 27, 2021
DOI: https://doi.org/10.3352/jeehp.2021.18.25
  • 6,231 Views
  • 313 Downloads
  • 3 Web of Science
  • 3 Crossref
Abstract
Purpose
Standard setting is critical in the health professions; however, appropriate standard setting methods are not always applied when determining cut scores in performance assessments. The aim of this study was to compare the cut scores obtained when the standard setting was changed from the norm-referenced method to the borderline group method (BGM) and the borderline regression method (BRM) in an objective structured clinical examination (OSCE) at a medical school in Korea.
Methods
This was an explorative study to model the implementation of the BGM and BRM. A total of 107 fourth-year medical students attended the OSCE at 7 stations for encountering standardized patients (SPs) and at 1 station for performing skills on a manikin on July 15th, 2021. Thirty-two physician examiners evaluated the performance by completing a checklist and global rating scales.
Results
The cut score of the norm-referenced method was lower than that of the BGM (P<0.01) and the BRM (P<0.02). There was no significant difference in the cut score between the BGM and the BRM (P=0.40). The station with the highest standard deviation and the highest proportion of borderline examinees showed the largest difference in cut scores across the standard setting methods.
Conclusion
Cut scores prefixed by the norm-referenced method, without considering station content or examinee performance, can vary with station difficulty and content, affecting the appropriateness of standard setting decisions. If there is adequate consensus on the criteria for identifying the borderline group, standard setting with the BRM could be applied as a practical and defensible method for determining the cut score of an OSCE.
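
As an illustration of the two borderline methods named above, the following is a minimal sketch of how a BGM and a BRM cut score might be computed for a single OSCE station in their standard forms; the checklist scores, global ratings, and the coding of the borderline category are invented for illustration and are not the study's data.

# Borderline group method (BGM) and borderline regression method (BRM)
# cut scores for one OSCE station. All data below are illustrative.
import numpy as np

checklist_scores = np.array([42, 55, 61, 58, 70, 75, 80, 66, 90, 85], dtype=float)
global_ratings   = np.array([ 0,  1,  1,  2,  2,  2,  3,  2,  4,  3], dtype=float)  # 1 = borderline (assumed coding)

# BGM: mean checklist score of examinees rated exactly "borderline".
bgm_cut = checklist_scores[global_ratings == 1].mean()

# BRM: regress checklist score on global rating (ordinary least squares)
# and take the predicted checklist score at the borderline rating.
slope, intercept = np.polyfit(global_ratings, checklist_scores, deg=1)
brm_cut = intercept + slope * 1

print(f"BGM cut score: {bgm_cut:.1f}, BRM cut score: {brm_cut:.1f}")

In practice, such station-level cut scores are typically aggregated across stations to set the examination-level standard.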

Citations

Citations to this article as recorded by  
  • Analyzing the Quality of Objective Structured Clinical Examination in Alborz University of Medical Sciences
    Suleiman Ahmadi, Amin Habibi, Mitra Rahimzadeh, Shahla Bahrami
    Alborz University Medical Journal.2023; 12(4): 485.     CrossRef
  • Possibility of using the yes/no Angoff method as a substitute for the percent Angoff method for estimating the cutoff score of the Korean Medical Licensing Examination: a simulation study
    Janghee Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 23.     CrossRef
  • Newly appointed medical faculty members’ self-evaluation of their educational roles at the Catholic University of Korea College of Medicine in 2020 and 2021: a cross-sectional survey-based study
    Sun Kim, A Ra Cho, Chul Woon Chung
    Journal of Educational Evaluation for Health Professions.2021; 18: 28.     CrossRef
Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia  
Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
J Educ Eval Health Prof. 2021;18:23.   Published online September 23, 2021
DOI: https://doi.org/10.3352/jeehp.2021.18.23
  • 5,513 Views
  • 233 Downloads
  • 4 Web of Science
  • 5 Crossref
Abstract
Purpose
This study aimed to compare the use of the tele-objective structured clinical examination (teleOSCE) with in-person assessment in a high-stakes clinical examination, in order to determine the impact of the teleOSCE on the assessment undertaken. The discussion addresses which skills and domains can be effectively assessed in a teleOSCE.
Methods
This study is a retrospective observational analysis. It compares the results achieved by final year medical students in their clinical examination, assessed using the teleOSCE in 2020 (n=285), with those who were examined using the traditional in-person format in 2019 (n=280). The study was undertaken at the University of New South Wales, Australia.
Results
In the domain of physical examination, students in 2020 scored 0.277 points higher than those in 2019 (mean difference=–0.277, P<0.001, effect size=0.332). Across all other domains, there was no significant difference in mean scores between 2019 and 2020.
Conclusion
The teleOSCE did not negatively impact assessment in any domain except physical examination. If the teleOSCE is the future of clinical skills examination, assessment of physical examination will require concomitant workplace-based assessment.
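
For readers unfamiliar with the effect size reported above, the following minimal sketch shows how a standardized mean difference (Cohen's d, one common effect size measure) between two independent cohorts can be computed; the score arrays are placeholders, not the study data.

# Cohen's d for two independent cohorts; all scores are placeholders.
import numpy as np

scores_2019 = np.array([6.1, 6.4, 5.9, 6.8, 6.2, 6.5])  # in-person OSCE cohort
scores_2020 = np.array([6.4, 6.7, 6.3, 7.0, 6.6, 6.8])  # teleOSCE cohort

mean_diff = scores_2020.mean() - scores_2019.mean()

# Pooled standard deviation for two independent samples.
n1, n2 = len(scores_2019), len(scores_2020)
pooled_sd = np.sqrt(((n1 - 1) * scores_2019.var(ddof=1)
                     + (n2 - 1) * scores_2020.var(ddof=1)) / (n1 + n2 - 2))

cohens_d = mean_diff / pooled_sd
print(f"mean difference = {mean_diff:.3f}, Cohen's d = {cohens_d:.2f}")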

Citations

Citations to this article as recorded by  
  • Feasibility and reliability of the pandemic-adapted online-onsite hybrid graduation OSCE in Japan
    Satoshi Hara, Kunio Ohta, Daisuke Aono, Toshikatsu Tamai, Makoto Kurachi, Kimikazu Sugimori, Hiroshi Mihara, Hiroshi Ichimura, Yasuhiko Yamamoto, Hideki Nomura
    Advances in Health Sciences Education.2024; 29(3): 949.     CrossRef
  • Feasibility of an online clinical assessment of competence in physiotherapy students
    Brooke Flew, Lucy Chipchase, Darren Lee, Jodie A. McClelland
    Physiotherapy Theory and Practice.2024; : 1.     CrossRef
  • Navigating digital assessments in medical education: Findings from a scoping review
    Chin-Siang Ang, Sakura Ito, Jennifer Cleland
    Medical Teacher.2024; : 1.     CrossRef
  • Radiography education in 2022 and beyond - Writing the history of the present: A narrative review
    Y.X. Tay, J.P. McNulty
    Radiography.2023; 29(2): 391.     CrossRef
  • Newly appointed medical faculty members’ self-evaluation of their educational roles at the Catholic University of Korea College of Medicine in 2020 and 2021: a cross-sectional survey-based study
    Sun Kim, A Ra Cho, Chul Woon Chung
    Journal of Educational Evaluation for Health Professions.2021; 18: 28.     CrossRef
Review
Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review  
Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
J Educ Eval Health Prof. 2021;18:11.   Published online June 1, 2021
DOI: https://doi.org/10.3352/jeehp.2021.18.11
  • 7,568 Views
  • 416 Downloads
  • 12 Web of Science
  • 12 Crossref
Abstract
The coronavirus disease 2019 (COVID-19) pandemic has required educators to adapt the in-person objective structured clinical examination (OSCE) to online settings in order for it to remain a critical component of the multifaceted assessment of a student’s competency. This systematic scoping review aimed to summarize the assessment methods and the validity and reliability of the measurement tools used in current online OSCE (hereafter referred to as teleOSCE) approaches. A comprehensive literature review was undertaken following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews guidelines. Articles were eligible if they reported any form of performance assessment, in any field of healthcare, delivered in an online format. Two reviewers independently screened the results and analyzed relevant studies. Eleven articles were included in the analysis. Pre-recorded videos were used in 3 studies, while observations by remote examiners through an online platform were used in 7 studies. Acceptability as perceived by students was reported in 2 studies. This systematic scoping review identified several insights garnered from implementing teleOSCEs, the components transferable from telemedicine, and the need for systemic research to establish the ideal teleOSCE framework. TeleOSCEs may be able to improve the accessibility and reproducibility of clinical assessments and equip students with the requisite skills to effectively practice telemedicine in the future.

Citations

Citations to this article as recorded by  
  • Feasibility and reliability of the pandemic-adapted online-onsite hybrid graduation OSCE in Japan
    Satoshi Hara, Kunio Ohta, Daisuke Aono, Toshikatsu Tamai, Makoto Kurachi, Kimikazu Sugimori, Hiroshi Mihara, Hiroshi Ichimura, Yasuhiko Yamamoto, Hideki Nomura
    Advances in Health Sciences Education.2024; 29(3): 949.     CrossRef
  • A level playing field? Evaluation of the virtual Objective Structured Clinical Examination in Psychiatry and Addiction Medicine: A mixed methods study
    Rebecca E Reay, Paul A Maguire, Jeffrey CL Looi
    Australasian Psychiatry.2024; 32(4): 359.     CrossRef
  • Conducting an objective structured clinical examination under COVID-restricted conditions
    Andrea Gotzmann, John Boulet, Yichi Zhang, Judy McCormick, Mathieu Wojcik, Ilona Bartman, Debra Pugh
    BMC Medical Education.2024;[Epub]     CrossRef
  • The virtual Clinical Assessment of Skills and Competence: the impact and challenges of a digitised final examination
    Kenny Chu, Shivanthi Sathanandan
    BJPsych Bulletin.2023; 47(2): 110.     CrossRef
  • Virtual Learning and Assessment in Rheumatology Fellowship Training: Objective Structured Clinical Examination Revisited
    Rachel M. Wolfe, Faye N. Hant, Rumey C. Ishizawar, Lisa G. Criscione‐Schreiber, Beth L. Jonas, Kenneth S. O'Rourke, Marcy B. Bolster
    Arthritis Care & Research.2023; 75(12): 2435.     CrossRef
  • Innovations in assessment in health professions education during the COVID‐19 pandemic: A scoping review
    Jamal Giri, Claire Stewart
    The Clinical Teacher.2023;[Epub]     CrossRef
  • Evaluation of the Utility of Online Objective Structured Clinical Examination Conducted During the COVID-19 Pandemic
    Mona Arekat, Mohamed Hany Shehata, Abdelhalim Deifalla, Ahmed Al-Ansari, Archana Kumar, Mohamed Alsenbesy, Hamdi Alshenawi, Amgad El-Agroudy, Mariwan Husni, Diaa Rizk, Abdelaziz Elamin, Afif Ben Salah, Hani Atwa
    Advances in Medical Education and Practice.2022; Volume 13: 407.     CrossRef
  • Comparison of student pharmacists' performance on in-person vs. virtual OSCEs in a pre-APPE capstone course
    Justine S. Gortney, Joseph P. Fava, Andrew D. Berti, Brittany Stewart
    Currents in Pharmacy Teaching and Learning.2022; 14(9): 1116.     CrossRef
  • Is online objective structured clinical examination teaching an acceptable replacement in post-COVID-19 medical education in the United Kingdom?: a descriptive study
    Vashist Motkur, Aniket Bharadwaj, Nimalesh Yogarajah
    Journal of Educational Evaluation for Health Professions.2022; 19: 30.     CrossRef
  • Equal Z standard-setting method to estimate the minimum number of panelists for a medical school’s objective structured clinical examination in Taiwan: a simulation study
    Ying-Ying Yang, Pin-Hsiang Huang, Ling-Yu Yang, Chia-Chang Huang, Chih-Wei Liu, Shiau-Shian Huang, Chen-Huan Chen, Fa-Yauh Lee, Shou-Yen Kao, Boaz Shulruf
    Journal of Educational Evaluation for Health Professions.2022; 19: 27.     CrossRef
  • Applying the Student Response System in the Online Dermatologic Video Curriculum on Medical Students' Interaction and Learning Outcomes during the COVID-19 Pandemic
    Chih-Tsung Hung, Shao-An Fang, Feng-Cheng Liu, Chih-Hsiung Hsu, Ting-Yu Yu, Wei-Ming Wang
    Indian Journal of Dermatology.2022; 67(4): 477.     CrossRef
  • Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia
    Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
    Journal of Educational Evaluation for Health Professions.2021; 18: 23.     CrossRef
Research article
Sequential Objective Structured Clinical Examination based on item response theory in Iran  
Sara Mortaz Hejri, Mohammad Jalili
J Educ Eval Health Prof. 2017;14:19.   Published online September 8, 2017
DOI: https://doi.org/10.3352/jeehp.2017.14.19
  • 29,991 Views
  • 268 Downloads
  • 2 Web of Science
  • 1 Crossref
Abstract
Purpose
In a sequential objective structured clinical examination (OSCE), all students initially take a short screening OSCE. Examinees who pass are excused from further testing, but an additional OSCE is administered to the remaining examinees. Previous investigations of sequential OSCE were based on classical test theory. We aimed to design and evaluate screening OSCEs based on item response theory (IRT).
Methods
We carried out a retrospective observational study. At each station of a 10-station OSCE, the students’ performance was graded on a Likert-type scale. Since the data were polytomous, the difficulty parameters, discrimination parameters, and students’ ability were calculated using a graded response model. To design several screening OSCEs, we identified the 5 most difficult stations and the 5 most discriminative ones. For each test, 5, 4, or 3 stations were selected. Normal and stringent cut-scores were defined for each test. We compared the results of each of the 12 screening OSCEs to the main OSCE and calculated the positive and negative predictive values (PPV and NPV), as well as the exam cost.
Results
A total of 253 students (95.1%) passed the main OSCE, while 72.6% to 94.4% of examinees passed the screening tests. The PPV values ranged from 0.98 to 1.00, and the NPV values ranged from 0.18 to 0.59. Two tests effectively predicted the results of the main exam, resulting in financial savings of 34% to 40%.
Conclusion
If stations with the highest IRT-based discrimination values and stringent cut-scores are utilized in the screening test, sequential OSCE can be an efficient and convenient way to conduct an OSCE.
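
The screening logic described above can be summarized with a small sketch of how the positive and negative predictive values (PPV and NPV) of a screening OSCE are computed against the main examination result; the pass/fail vectors below are invented for illustration and are not the study data.

# PPV and NPV of a screening OSCE relative to the main OSCE.
# The pass/fail vectors are illustrative, not the study data.
import numpy as np

screen_pass = np.array([1, 1, 1, 0, 1, 0, 1, 1, 0, 1], dtype=bool)
main_pass   = np.array([1, 1, 1, 0, 1, 1, 1, 1, 0, 1], dtype=bool)

# PPV: among examinees who passed the screening OSCE, the proportion
# who also passed the main OSCE.
ppv = (screen_pass & main_pass).sum() / screen_pass.sum()

# NPV: among examinees who failed the screening OSCE, the proportion
# who also failed the main OSCE.
npv = (~screen_pass & ~main_pass).sum() / (~screen_pass).sum()

print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")

A high PPV means that nearly all examinees passed by the short screening test would also have passed the full OSCE, which is what makes the sequential design defensible and cost-saving.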

Citations

Citations to this article as recorded by  
  • Utility of eye-tracking technology for preparing medical students in Spain for the summative objective structured clinical examination
    Francisco Sánchez-Ferrer, J.M. Ramos-Rincón, M.D. Grima-Murcia, María Luisa Sánchez-Ferrer, Francisco Sánchez-del Campo, Antonio F. Compañ-Rosique, Eduardo Fernández-Jover
    Journal of Educational Evaluation for Health Professions.2017; 14: 27.     CrossRef
Technical Report
Introduction and Administration of the Clinical Skill Test of the Medical Licensing Examination, Republic of Korea (2009)
Kun Sang Kim
J Educ Eval Health Prof. 2010;7:4.   Published online December 3, 2010
DOI: https://doi.org/10.3352/jeehp.2010.7.4
  • 65,535 Views
  • 221 Downloads
  • 14 Crossref
Abstract
The first administration of the clinical skill test as part of the Korean Medical Licensing Examination took place from September 23 to December 1, 2009, in the clinical skill test center located in the National Health Personnel Licensing Examination Board (NHPLEB) building in Seoul. Korea is the first country in Asia to introduce a clinical skill test as part of its medical licensing examination. This is a report on the introduction and administration of the test. In 2000, the NHPLEB launched research on the validity of introducing the clinical skill test and on the best implementation methods. Since 2006, lists of test item subjects for the clinical skill test have been developed. The test consisted of two types of evaluation: a clinical performance examination (CPX) with a standardized patient (SP) and an objective structured clinical examination (OSCE). The proctor (a medical faculty member) and the SP rate examinees' proficiency for the OSCE and the CPX, respectively. Out of 3,456 applicants, 3,289 examinees (95.2%) passed the test. Of the 167 examinees who failed the clinical skill test, 142 passed the written test, which means that the clinical skill test showed characteristics independent of the written test. The successful implementation of the clinical skill test is expected to improve medical graduates' performance of clinical skills.

Citations

Citations to this article as recorded by  
  • Presidential address: improving item validity and adopting computer-based testing, clinical skills assessments, artificial intelligence, and virtual reality in health professions licensing examinations in Korea
    Hyunjoo Pai
    Journal of Educational Evaluation for Health Professions.2023; 20: 8.     CrossRef
  • Implementation strategy for introducing a clinical skills examination to the Korean Oriental Medicine Licensing Examination: a mixed-method modified Delphi study
    Chan-Young Kwon, Sanghoon Lee, Min Hwangbo, Chungsik Cho, Sangwoo Shin, Dong-Hyeon Kim, Aram Jeong, Hye-Yoon Lee
    Journal of Educational Evaluation for Health Professions.2023; 20: 23.     CrossRef
  • Authenticity, acceptability, and feasibility of a hybrid gynecology station for the Papanicolaou test as part of a clinical skills examination in Korea
    Ji-Hyun Seo, Younglim Oh, Sunju Im, Do-Kyong Kim, Hyun-Hee Kong, HyeRin Roh
    Journal of Educational Evaluation for Health Professions.2018; 15: 4.     CrossRef
  • A one-day surgical-skill training course for medical students’ improved surgical skills and increased interest in surgery as a career
    Ho Seok Seo, Yong Hwa Eom, Min Ki Kim, Young-Min Kim, Byung Joo Song, Kyo Young Song
    BMC Medical Education.2017;[Epub]     CrossRef
  • Presidential address: launching the Korea Health Personnel Licensing Examination Institute, a government-supported special foundation from December 23, 2015
    Chang Hwi Kim
    Journal of Educational Evaluation for Health Professions.2016; 13: 20.     CrossRef
  • Reforms of the Korean Medical Licensing Examination regarding item development and performance evaluation
    Mi Kyoung Yim
    Journal of Educational Evaluation for Health Professions.2015; 12: 6.     CrossRef
  • Educational intervention as an effective step for reducing blood culture contamination: a prospective cohort study
    W.B. Park, S.J. Myung, M.-d. Oh, J. Lee, N.-J. Kim, E.-C. Kim, J.S. Park
    Journal of Hospital Infection.2015; 91(2): 111.     CrossRef
  • Impact of Clinical Performance Examination on Incoming Interns' Clinical Competency in Differential Diagnosis of Headache
    Seong-Min Park, Yun-Mi Song, Bo-Kyoung Kim, Hyoeun Kim
    Korean Journal of Family Medicine.2014; 35(2): 56.     CrossRef
  • Can a medical regulatory system be implemented in Korea?
    Sun Huh, Myung-Hyun Chung
    Journal of the Korean Medical Association.2013; 56(3): 158.     CrossRef
  • Power of the policy: how the announcement of high-stakes clinical examination altered OSCE implementation at institutional level
    Chi-Wei Lin, Tsuen-Chiuan Tsai, Cheuk-Kwan Sun, Der-Fang Chen, Keh-Min Liu
    BMC Medical Education.2013;[Epub]     CrossRef
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association.2012; 55(2): 124.     CrossRef
  • How can high stakes examination in Korean medical society be improved to the international level?
    Sun Huh
    Journal of the Korean Medical Association.2012; 55(2): 114.     CrossRef
  • The impact of introducing the Korean Medical Licensing Examination clinical skills assessment on medical education
    Hoon-Ki Park
    Journal of the Korean Medical Association.2012; 55(2): 116.     CrossRef
  • Failed Examinees' Legal Challenge over the Clinical Skill Test in the Korean Medical Licensing Examination
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2010; 7: 5.     CrossRef
Review Article
The New Horizon for Evaluations in Medical Education in Korea
Sang-Ho Baik
J Educ Eval Health Prof. 2005;2(1):7-22.   Published online June 30, 2005
DOI: https://doi.org/10.3352/jeehp.2005.2.1.7
  • 34,425 Views
  • 219 Downloads
  • 7 Crossref
Abstract
Over the last two decades, there have been a number of significant changes in the evaluation system in medical education in Korea. One major improvement in this respect has been the listing of learning objectives at medical schools and the construction of a content outline for the Korean Medical Licensing Examination that can be used as a basis of evaluation. Item analysis has become a routine method for obtaining information that often provides valuable feedback concerning test items after the completion of a written test. The use of item response theory in analyzing test items has been spreading in medical schools as a way to evaluate performance tests and computerized adaptive testing. A series of recent studies have documented an upward trend in the adoption of the objective structured clinical examination (OSCE) and clinical practice examination (CPX) for measuring skill and attitude domains, in addition to tests of the knowledge domain. There has been an obvious increase in regional consortiums involving neighboring medical schools that share the planning and administration of the OSCE and CPX; this includes recruiting and training standardized patients. Such consortiums share common activities, such as case development and program evaluation. A short history and the pivotal roles of four organizations that have brought about significant changes in the examination system are discussed briefly.

Citations

Citations to this article as recorded by  
  • Presidential address: Adoption of a clinical skills examination for dental licensing, implementation of computer-based testing for the medical licensing examination, and the 30th anniversary of the Korea Health Personnel Licensing Examination Institute
    Yoon-Seong Lee
    Journal of Educational Evaluation for Health Professions.2022; 19: 1.     CrossRef
  • Effectiveness of Medical Education Assessment Consortium Clinical Knowledge Mock Examination (2011‐2016)
    Sang Yeoup Lee, Yeli Lee, Mi Kyung Kim
    Korean Medical Education Review.2018; 20(1): 20.     CrossRef
  • Long for wonderful leadership in a new era of the Korean Association of Medical Colleges
    Young Hwan Lee
    Korean Journal of Medical Education.2014; 26(3): 163.     CrossRef
  • Major Reforms and Issues of the Medical Licensing Examination Systems in Korea
    Sang-Ho Baik
    Korean Medical Education Review.2013; 15(3): 125.     CrossRef
  • A Study on the Feasibility of a National Practical Examination in the Radiologic Technologist
    Soon-Yong Son, Tae-Hyung Kim, Jung-Whan Min, Dong-Kyoon Han, Sung-Min Ahn
    Journal of the Korea Academia-Industrial cooperation Society.2011; 12(5): 2149.     CrossRef
  • The Relationship between Senior Year Examinations at a Medical School and the Korean Medical Licensing Examination
    Ki Hoon Jung, Ho Keun Jung, Kwan Lee
    Korean Journal of Medical Education.2009; 21(1): 17.     CrossRef
  • What Qualities Do Medical School Applicants Need to Have? - Secondary Publication
    Yera Hur, Sun Kim
    Yonsei Medical Journal.2009; 50(3): 427.     CrossRef
Research article
Examiner seniority and experience are associated with bias when scoring communication, but not examination, skills in objective structured clinical examinations in Australia  
Lauren Chong, Silas Taylor, Matthew Haywood, Barbara-Ann Adelstein, Boaz Shulruf
J Educ Eval Health Prof. 2018;15:17.
DOI: https://doi.org/10.3352/jeehp.2018.15.17
  • 26,071 Views
  • 285 Downloads
  • 22 Web of Science
  • 19 Crossref
Abstract
Purpose
The biases that may influence objective structured clinical examination (OSCE) scoring are well understood, and recent research has attempted to establish the magnitude of their impact. However, the influence of examiner experience, clinical seniority, and occupation on communication and physical examination scores in OSCEs has not yet been clearly established.
Methods
We compared the mean scores awarded for generic and clinical communication and physical examination skills in 2 undergraduate medicine OSCEs in relation to examiner characteristics (gender, examining experience, occupation, seniority, and speciality). The statistical significance of the differences was calculated using the 2-tailed independent t-test and analysis of variance.
Results
Five hundred and seventeen students were examined by 237 examiners at the University of New South Wales in 2014 and 2016. Examiner gender, occupation (academic, clinician, or clinical tutor), and job type (specialist or generalist) did not significantly impact scores. Junior doctors gave consistently higher scores than senior doctors in all domains, and this difference was statistically significant for generic and clinical communication scores. Examiner experience was significantly inversely correlated with generic communication scores.
Conclusion
We suggest that the assessment of examination skills may be less susceptible to bias because this process is fairly prescriptive, affording greater scoring objectivity. We recommend training to define the marking criteria, teaching curriculum, and expected level of performance in communication skills to reduce bias in OSCE assessment.
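
As an illustration of the comparisons described in the Methods, here is a minimal sketch of a two-tailed independent t-test on communication scores awarded by junior versus senior examiners, together with a correlation between examining experience and the scores awarded; all values are invented, the use of a Pearson correlation is an assumption (the abstract does not specify the correlation statistic), and scipy is assumed to be available.

# Two-tailed independent t-test comparing mean communication scores
# awarded by junior vs. senior examiners. All values are illustrative.
import numpy as np
from scipy import stats

junior_scores = np.array([7.8, 8.1, 7.6, 8.4, 7.9, 8.2])
senior_scores = np.array([7.1, 7.4, 7.0, 7.6, 7.2, 7.5])

t_stat, p_value = stats.ttest_ind(junior_scores, senior_scores)
print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.4f}")

# A correlation between years of examining experience and the scores
# awarded addresses the experience effect reported above (Pearson r is
# used here as an assumption; the study does not state the statistic).
experience_years = np.array([2, 3, 1, 4, 2, 15, 12, 20, 10, 18])
scores_awarded   = np.array([8.1, 7.9, 8.3, 7.7, 8.0, 7.3, 7.5, 7.0, 7.6, 7.2])
r, p_corr = stats.pearsonr(experience_years, scores_awarded)
print(f"Pearson r = {r:.2f}, p = {p_corr:.4f}")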

Citations

Citations to this article as recorded by  
  • Analyse systématique des évaluations de circuits multiples d’examen clinique objectif structuré (ECOS) : variables explicatives et corrélations inter-évaluateurs
    E. Ollier, C. Pelissier, C. Boissier, T. Barjat, P. Berthelot, C. Boutet, X. Gocko, C. Le Hello, S. Perinel
    La Revue de Médecine Interne.2024;[Epub]     CrossRef
  • Variance due to the examination conditions and factors associated with success in objective structured clinical examinations (OSCEs): first experiences at Paris-Saclay medical school
    Coralie Amadou, Raphael Veil, Antonia Blanié, Claire Nicaise, Alexandra Rouquette, Vincent Gajdos
    BMC Medical Education.2024;[Epub]     CrossRef
  • Assessment during clinical education among nursing students using two different assessment instruments
    Nilsson Tomas, Masiello Italo, Broberger Eva, Lindström Veronica
    BMC Medical Education.2024;[Epub]     CrossRef
  • Association entre les performances cliniques des étudiants et leur réussite aux Épreuves classantes nationales informatisées : une étude de cohorte rétrospective monocentrique
    L. Azoyan, Y. Lombardi, M.C. Renaud, A. Duguet, S. Georgin-Lavialle, F. Cohen-Aubart, G. Ibanez, O. Steichen
    La Revue de Médecine Interne.2023; 44(1): 5.     CrossRef
  • Bias in Medical School Clerkship Grading: Is It Time for a Change?
    Rachel A. Russo, Dana M. Raml, Anna J. Kerlek, Martin Klapheke, Katherine B. Martin, Jeffrey J. Rakofsky
    Academic Psychiatry.2023; 47(4): 428.     CrossRef
  • Are we ready yet for digital transformation? Virtual versus on-campus OSCE as assessment tools in pharmacy education. A randomized controlled head-to-head comparative assessment
    Zelal Kharaba, Mohammad M. AlAhmad, Asim Ahmed Elnour, Abdallah Abou Hajal, Suhad Abumweis, Mohammad A. Ghattas
    Saudi Pharmaceutical Journal.2023; 31(3): 359.     CrossRef
  • Comparing Entrustable Professional Activity Scores Given by Faculty Physicians and Senior Trainees to First-Year Residents
    Steven J Katz, Dennis Wang
    Cureus.2022;[Epub]     CrossRef
  • eOSCE stations live versus remote evaluation and scores variability
    Donia Bouzid, Jimmy Mullaert, Aiham Ghazali, Valentine Marie Ferré, France Mentré, Cédric Lemogne, Philippe Ruszniewski, Albert Faye, Alexy Tran Dinh, Tristan Mirault, Nathan Peiffer Smadja, Léonore Muller, Laure Falque Pierrotin, Michael Thy, Maksud Assa
    BMC Medical Education.2022;[Epub]     CrossRef
  • Development and Evaluation of an Online Exam for Exercise Physiology During the COVID-19 Pandemic
    Amanda L Burdett, Nancy van Doorn, Matthew D Jones, Natalie CG Kwai, Rachel E Ward, Silas Taylor, Boaz Shulruf
    Journal of Clinical Exercise Physiology.2022; 11(4): 122.     CrossRef
  • Equal Z standard-setting method to estimate the minimum number of panelists for a medical school’s objective structured clinical examination in Taiwan: a simulation study
    Ying-Ying Yang, Pin-Hsiang Huang, Ling-Yu Yang, Chia-Chang Huang, Chih-Wei Liu, Shiau-Shian Huang, Chen-Huan Chen, Fa-Yauh Lee, Shou-Yen Kao, Boaz Shulruf
    Journal of Educational Evaluation for Health Professions.2022; 19: 27.     CrossRef
  • How biased are you? The effect of prior performance information on attending physician ratings and implications for learner handover
    Tammy Shaw, Timothy J. Wood, Claire Touchie, Debra Pugh, Susan M. Humphrey-Murto
    Advances in Health Sciences Education.2021; 26(1): 199.     CrossRef
  • Does objective structured clinical examination examiners’ backgrounds influence the score agreement?
    Oscar Gilang Purnajati, Rachmadya Nur Hidayah, Gandes Retno Rahayu
    The Asia Pacific Scholar.2021; 6(2): 48.     CrossRef
  • Ethnic and Gender Bias in Objective Structured Clinical Examination
    Iris C.I Chao, Efrem Violato, Brendan Concannon, Charlotte McCartan, Sharla King, Mary Roduta Roberts
    Education in the Health Professions.2021; 4(2): 37.     CrossRef
  • Tutor–Student Partnership in Practice OSCE to Enhance Medical Education
    Eve Cosker, Valentin Favier, Patrice Gallet, Francis Raphael, Emmanuelle Moussier, Louise Tyvaert, Marc Braun, Eva Feigerlova
    Medical Science Educator.2021; 31(6): 1803.     CrossRef
  • Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia
    Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
    Journal of Educational Evaluation for Health Professions.2021; 18: 23.     CrossRef
  • Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review
    Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
    Journal of Educational Evaluation for Health Professions.2021; 18: 11.     CrossRef
  • Is There Variability in Scoring of Student Surgical OSCE Performance Based on Examiner Experience and Expertise?
    Claire L. Donohoe, Frank Reilly, Suzanne Donnelly, Ronan A. Cahill
    Journal of Surgical Education.2020; 77(5): 1202.     CrossRef
  • The role of training in student examiner rating performance in a student-led mock OSCE
    Jian Hui Koo, Kim Yao Ong, Yun Ting Yap, Kum Ying Tham
    Perspectives on Medical Education.2020; 10(5): 293.     CrossRef
  • Insights into student assessment outcomes in rural clinical campuses
    Boaz Shulruf, Gary Velan, Lesley Forster, Anthony O’Sullivan, Peter Harris, Silas Taylor
    BMC Medical Education.2019;[Epub]     CrossRef
