
JEEHP : Journal of Educational Evaluation for Health Professions

2 "Ebel"
Research articles
Similarity of the cut score in test sets with different item amounts using the modified Angoff, modified Ebel, and Hofstee standard-setting methods for the Korean Medical Licensing Examination  
Janghee Park, Mi Kyoung Yim, Na Jin Kim, Duck Sun Ahn, Young-Min Kim
J Educ Eval Health Prof. 2020;17:28.   Published online October 5, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.28
  • 6,090 View
  • 188 Download
  • 7 Web of Science
  • 6 Crossref
Abstract
Purpose
The Korean Medical Licensing Examination (KMLE) typically contains a large number of items. The purpose of this study was to investigate whether the cut score differs when standard-setting is conducted on all items of the exam versus only a subset of items.
Methods
We divided the item sets from the 3 most recent KMLEs (2017–2019) into 4 subsets per year, each containing 25% of the items, based on item content categories, discrimination indices, and difficulty indices. The entire panel of 15 members assessed all items of the 2017 exam (360 items, 100%). Constructed in the same way, each item set in split-half set 1 contained 184 items (51%) of the 2018 exam, and each item set in split-half set 2 contained 182 items (51%) of the 2019 exam. We used the modified Angoff, modified Ebel, and Hofstee methods in the standard-setting process.
Results
A cut score difference of less than 1% was observed when the same standard-setting method was applied to stratified item subsets containing 25%, 51%, or 100% of the entire set. Rater reliability was higher when fewer items were rated.
Conclusion
When the entire item set was divided into equivalent subsets, standard-setting based on a portion of the item set (90 out of 360 items) yielded cut scores similar to those derived from the entire item set, and panelists’ individual assessments correlated more highly with the overall assessments.
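
The Methods above describe splitting each year's item pool into equivalent subsets by item content category, difficulty index, and discrimination index before standard-setting. As an illustration only, the Python sketch below shows one plausible way such stratified 25% subsets could be built; the item data, category labels, and round-robin allocation rule are assumptions made for the sketch, not the authors' actual procedure.

    # A minimal sketch, assuming hypothetical item data: one plausible way to split
    # a 360-item exam into 4 equivalent 25% subsets by content category, difficulty,
    # and discrimination, as the Methods describe. Not the authors' actual procedure.
    import random
    from collections import defaultdict

    random.seed(0)

    # Hypothetical item bank: each item gets a content category and psychometric indices.
    items = [
        {"id": i,
         "category": random.choice(["A", "B", "C", "D"]),
         "difficulty": random.uniform(0.2, 0.9),
         "discrimination": random.uniform(0.1, 0.5)}
        for i in range(360)
    ]

    def split_into_subsets(items, n_subsets=4):
        """Within each content category, sort items by difficulty then discrimination
        and deal them out round-robin so every subset receives a comparable mix.
        Subset sizes may differ slightly when category sizes are not multiples of 4."""
        by_category = defaultdict(list)
        for item in items:
            by_category[item["category"]].append(item)

        subsets = [[] for _ in range(n_subsets)]
        for category_items in by_category.values():
            category_items.sort(key=lambda it: (it["difficulty"], it["discrimination"]))
            for rank, item in enumerate(category_items):
                subsets[rank % n_subsets].append(item)
        return subsets

    for k, subset in enumerate(split_into_subsets(items), start=1):
        mean_difficulty = sum(it["difficulty"] for it in subset) / len(subset)
        print(f"subset {k}: {len(subset)} items, mean difficulty {mean_difficulty:.2f}")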

Citations

Citations to this article as recorded by Crossref:
  • Application of computer-based testing in the Korean Medical Licensing Examination, the emergence of the metaverse in medical education, journal metrics and statistics, and appreciation to reviewers and volunteers
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2022; 19: 2.     CrossRef
  • Possibility of using the yes/no Angoff method as a substitute for the percent Angoff method for estimating the cutoff score of the Korean Medical Licensing Examination: a simulation study
    Janghee Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 23.     CrossRef
  • Equal Z standard-setting method to estimate the minimum number of panelists for a medical school’s objective structured clinical examination in Taiwan: a simulation study
    Ying-Ying Yang, Pin-Hsiang Huang, Ling-Yu Yang, Chia-Chang Huang, Chih-Wei Liu, Shiau-Shian Huang, Chen-Huan Chen, Fa-Yauh Lee, Shou-Yen Kao, Boaz Shulruf
    Journal of Educational Evaluation for Health Professions.2022; 19: 27.     CrossRef
  • Possibility of independent use of the yes/no Angoff and Hofstee methods for the standard setting of the Korean Medical Licensing Examination written test: a descriptive study
    Do-Hwan Kim, Ye Ji Kang, Hoon-Ki Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 33.     CrossRef
  • Presidential address: Quarantine guidelines to protect examinees from coronavirus disease 2019, clinical skills examination for dental licensing, and computer-based testing for medical, dental, and oriental medicine licensing
    Yoon-Seong Lee
    Journal of Educational Evaluation for Health Professions.2021; 18: 1.     CrossRef
  • Comparing the cut score for the borderline group method and borderline regression method with norm-referenced standard setting in an objective structured clinical examination in medical school in Korea
    Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
    Journal of Educational Evaluation for Health Professions.2021; 18: 25.     CrossRef
Comparison of standard-setting methods for the Korean Radiological Technologist Licensing Examination: Angoff, Ebel, bookmark, and Hofstee  
Janghee Park, Duck-Sun Ahn, Mi Kyoung Yim, Jaehyoung Lee
J Educ Eval Health Prof. 2018;15:32.   Published online December 26, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.32
  • 18,865 View
  • 247 Download
  • 11 Web of Science
  • 8 Crossref
Abstract
Purpose
This study aimed to compare the possible standard-setting methods for the Korean Radiological Technologist Licensing Examination, which has a fixed cut score, and to suggest the most appropriate method.
Methods
Six radiological technology professors set standards for 250 items on the Korean Radiological Technologist Licensing Examination administered in December 2016 using the Angoff, Ebel, bookmark, and Hofstee methods.
Results
With a maximum percentile score of 100, the cut score for the examination was 71.27 using the Angoff method, 62.2 using the Ebel method, 64.49 using the bookmark method, and 62 using the Hofstee method. Based on the Hofstee method, an acceptable cut score for the examination would be between 52.83 and 70; the cut score derived using the Angoff method (71.27) fell outside this acceptable range.
Conclusion
The above results suggest that the best approach to determining the cut score would be a panel discussion using the modified Angoff or Ebel method, with the rated results verified by the Hofstee method. Since no standard-setting method has yet been adopted for the Korean Radiological Technologist Licensing Examination, this study can provide practical guidance for introducing a standard-setting process.
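
For readers unfamiliar with how the panel-based methods compared above convert judgments into a cut score, the Python sketch below illustrates the usual arithmetic of the Angoff and Ebel methods using invented panelist data; it is not the study's code, and the Hofstee and bookmark methods are omitted because they additionally require the examinee score distribution or an ordered item booklet.

    # A minimal sketch with invented ratings (not the study's data) of the arithmetic
    # behind two of the methods compared above. The Hofstee and bookmark methods are
    # omitted: they also require an examinee score distribution or an ordered booklet.

    def angoff_cut_score(ratings):
        """ratings[p][i]: panelist p's estimated probability that a borderline
        examinee answers item i correctly. Returns the cut score as a percentage."""
        per_panelist = [100 * sum(r) / len(r) for r in ratings]
        return sum(per_panelist) / len(per_panelist)

    def ebel_cut_score(cell_counts, expected_percent_correct, total_items):
        """cell_counts[cell]: number of items in each relevance-by-difficulty cell.
        expected_percent_correct[cell]: panel's judged percentage of items in that
        cell a borderline examinee would answer correctly."""
        expected_correct = sum(
            count * expected_percent_correct[cell] / 100
            for cell, count in cell_counts.items()
        )
        return 100 * expected_correct / total_items

    # Hypothetical example: 3 panelists rating 4 items (Angoff) ...
    angoff = angoff_cut_score([
        [0.7, 0.6, 0.8, 0.5],
        [0.6, 0.7, 0.9, 0.5],
        [0.8, 0.5, 0.7, 0.4],
    ])
    # ... and a 2 x 2 relevance-by-difficulty grid covering 250 items (Ebel).
    ebel = ebel_cut_score(
        cell_counts={("essential", "easy"): 80, ("essential", "hard"): 70,
                     ("important", "easy"): 60, ("important", "hard"): 40},
        expected_percent_correct={("essential", "easy"): 90, ("essential", "hard"): 60,
                                  ("important", "easy"): 75, ("important", "hard"): 45},
        total_items=250,
    )
    print(f"Angoff cut score: {angoff:.1f}%, Ebel cut score: {ebel:.1f}%")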

Citations

Citations to this article as recorded by Crossref:
  • Setting standards for a diagnostic test of aviation English for student pilots
    Maria Treadaway, John Read
    Language Testing.2024;[Epub]     CrossRef
  • The challenges inherent with anchor-based approaches to the interpretation of important change in clinical outcome assessments
    Kathleen W. Wyrwich, Geoffrey R. Norman
    Quality of Life Research.2023; 32(5): 1239.     CrossRef
  • Possibility of independent use of the yes/no Angoff and Hofstee methods for the standard setting of the Korean Medical Licensing Examination written test: a descriptive study
    Do-Hwan Kim, Ye Ji Kang, Hoon-Ki Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 33.     CrossRef
  • Comparison of the validity of bookmark and Angoff standard setting methods in medical performance tests
    Majid Yousefi Afrashteh
    BMC Medical Education.2021;[Epub]     CrossRef
  • Comparing the cut score for the borderline group method and borderline regression method with norm-referenced standard setting in an objective structured clinical examination in medical school in Korea
    Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
    Journal of Educational Evaluation for Health Professions.2021; 18: 25.     CrossRef
  • Using the Angoff method to set a standard on mock exams for the Korean Nursing Licensing Examination
    Mi Kyoung Yim, Sujin Shin
    Journal of Educational Evaluation for Health Professions.2020; 17: 14.     CrossRef
  • Performance of the Ebel standard-setting method for the spring 2019 Royal College of Physicians and Surgeons of Canada internal medicine certification examination consisting of multiple-choice questions
    Jimmy Bourque, Haley Skinner, Jonathan Dupré, Maria Bacchus, Martha Ainslie, Irene W. Y. Ma, Gary Cole
    Journal of Educational Evaluation for Health Professions.2020; 17: 12.     CrossRef
  • Similarity of the cut score in test sets with different item amounts using the modified Angoff, modified Ebel, and Hofstee standard-setting methods for the Korean Medical Licensing Examination
    Janghee Park, Mi Kyoung Yim, Na Jin Kim, Duck Sun Ahn, Young-Min Kim
    Journal of Educational Evaluation for Health Professions.2020; 17: 28.     CrossRef
