
JEEHP : Journal of Educational Evaluation for Health Professions


Author index

Duck Sun Ahn (2 articles)
Similarity of the cut score in test sets with different item amounts using the modified Angoff, modified Ebel, and Hofstee standard-setting methods for the Korean Medical Licensing Examination  
Janghee Park, Mi Kyoung Yim, Na Jin Kim, Duck Sun Ahn, Young-Min Kim
J Educ Eval Health Prof. 2020;17:28.   Published online October 5, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.28
  • 6,602 View
  • 189 Download
  • 7 Web of Science
  • 6 Crossref
Abstract
Purpose
The Korea Medical Licensing Examination (KMLE) typically contains a large number of items. The purpose of this study was to investigate whether the cut score differs when standard setting is conducted on all items of the examination versus on only a subset of items.
Methods
We divided the item sets from the 3 most recent KMLEs (2017-2019) into 4 subsets per year, each containing 25% of the items, stratified by item content category, discrimination index, and difficulty index. The entire panel of 15 members assessed all 360 items (100%) of the 2017 examination. Using the same stratification, each set in split-half set 1 contained 184 items (51%) from the 2018 examination, and each set in split-half set 2 contained 182 items (51%) from the 2019 examination. We used the modified Angoff, modified Ebel, and Hofstee methods in the standard-setting process.
Results
A cut score difference of less than 1% was observed when the same standard-setting method was applied to stratified item subsets containing 25%, 51%, or 100% of the entire item set. Rater reliability was higher when fewer items were rated.
Conclusion
When the entire item set was divided into equivalent subsets, assessing the examination using a portion of the item set (90 of 360 items) yielded cut scores similar to those derived from the entire item set, and panelists' individual assessments correlated more highly with the overall assessments.
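In the modified Angoff method named above, each panelist estimates, for every item, the probability that a borderline (minimally competent) examinee would answer it correctly; the panel cut score is then derived by aggregating these estimates. A minimal sketch of that aggregation is below. The function name and the example ratings are illustrative only and are not drawn from the study's data.

```python
import statistics

def angoff_cut_score(ratings):
    """Aggregate panelists' Angoff ratings into a raw-score cut score.

    ratings[p][i] is panelist p's estimated probability that a
    borderline examinee answers item i correctly.
    """
    # Each panelist's implied cut score is the expected score of the
    # borderline examinee: the sum of that panelist's item probabilities.
    panelist_cuts = [sum(items) for items in ratings]
    # The panel cut score is the mean across panelists.
    return statistics.mean(panelist_cuts)

# Hypothetical ratings: 3 panelists rating a 4-item test
ratings = [
    [0.6, 0.7, 0.5, 0.8],
    [0.5, 0.8, 0.6, 0.7],
    [0.7, 0.6, 0.5, 0.9],
]
print(round(angoff_cut_score(ratings), 2))  # cut score on the 0-4 raw-score scale
```

Dividing the result by the number of items gives the cut score as a proportion, which is how differences such as "less than 1%" can be compared across test sets of different lengths.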

Citations

Citations to this article as recorded by  
  • Application of computer-based testing in the Korean Medical Licensing Examination, the emergence of the metaverse in medical education, journal metrics and statistics, and appreciation to reviewers and volunteers
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2022; 19: 2.     CrossRef
  • Possibility of using the yes/no Angoff method as a substitute for the percent Angoff method for estimating the cutoff score of the Korean Medical Licensing Examination: a simulation study
    Janghee Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 23.     CrossRef
  • Equal Z standard-setting method to estimate the minimum number of panelists for a medical school’s objective structured clinical examination in Taiwan: a simulation study
    Ying-Ying Yang, Pin-Hsiang Huang, Ling-Yu Yang, Chia-Chang Huang, Chih-Wei Liu, Shiau-Shian Huang, Chen-Huan Chen, Fa-Yauh Lee, Shou-Yen Kao, Boaz Shulruf
    Journal of Educational Evaluation for Health Professions.2022; 19: 27.     CrossRef
  • Possibility of independent use of the yes/no Angoff and Hofstee methods for the standard setting of the Korean Medical Licensing Examination written test: a descriptive study
    Do-Hwan Kim, Ye Ji Kang, Hoon-Ki Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 33.     CrossRef
  • Presidential address: Quarantine guidelines to protect examinees from coronavirus disease 2019, clinical skills examination for dental licensing, and computer-based testing for medical, dental, and oriental medicine licensing
    Yoon-Seong Lee
    Journal of Educational Evaluation for Health Professions.2021; 18: 1.     CrossRef
  • Comparing the cut score for the borderline group method and borderline regression method with norm-referenced standard setting in an objective structured clinical examination in medical school in Korea
    Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
    Journal of Educational Evaluation for Health Professions.2021; 18: 25.     CrossRef
Reconsidering the Cut Score of Korean National Medical Licensing Examination
Duck Sun Ahn, Sowon Ahn
J Educ Eval Health Prof. 2007;4:1.   Published online April 28, 2007
DOI: https://doi.org/10.3352/jeehp.2007.4.1
  • 42,362 View
  • 178 Download
  • 4 Crossref
Abstract
After briefly reviewing theories of standard setting, we analyzed the problems of the current cut scores. We then reported the results of a needs assessment on standard setting among medical educators and psychometricians, along with analyses of the standard-setting methods used in developed countries. Based on these findings, we suggested the Bookmark and modified Angoff methods as alternative methods for setting the standard. Possible problems and challenges of applying these methods to the National Medical Licensing Examination were discussed.

Citations

Citations to this article as recorded by  
  • Predicting medical graduates’ clinical performance using national competency examination results in Indonesia
    Prattama Santoso Utomo, Amandha Boy Timor Randita, Rilani Riskiyana, Felicia Kurniawan, Irwin Aras, Cholis Abrori, Gandes Retno Rahayu
    BMC Medical Education.2022;[Epub]     CrossRef
  • Possibility of independent use of the yes/no Angoff and Hofstee methods for the standard setting of the Korean Medical Licensing Examination written test: a descriptive study
    Do-Hwan Kim, Ye Ji Kang, Hoon-Ki Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 33.     CrossRef
  • Applying the Bookmark method to medical education: Standard setting for an aseptic technique station
    Monica L. Lypson, Steven M. Downing, Larry D. Gruppen, Rachel Yudkowsky
    Medical Teacher.2013; 35(7): 581.     CrossRef
  • Standard Setting in Student Assessment: Is a Defensible Method Yet to Come?
    A Barman
    Annals of the Academy of Medicine, Singapore.2008; 37(11): 957.     CrossRef
