
JEEHP : Journal of Educational Evaluation for Health Professions

Technical report
Item development process and analysis of 50 case-based items for implementation on the Korean Nursing Licensing Examination  
In Sook Park, Yeon Ok Suh, Hae Sook Park, So Young Kang, Kwang Sung Kim, Gyung Hee Kim, Yeon-Hee Choi, Hyun-Ju Kim
J Educ Eval Health Prof. 2017;14:20. Published online September 11, 2017
DOI: https://doi.org/10.3352/jeehp.2017.14.20
Abstract
Purpose
The purpose of this study was to improve the quality of items on the Korean Nursing Licensing Examination by developing and evaluating case-based items that reflect integrated nursing knowledge.
Methods
We conducted a cross-sectional observational study to develop new case-based items. The item development methods included expert workshops, brainstorming, and verification of content validity. After administering a mock examination composed of the newly developed case-based items to undergraduate nursing students, we evaluated the appropriateness of the items using classical test theory and item response theory.
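As an illustration of the classical test theory statistics referred to here, the sketch below computes item difficulty (proportion correct), point-biserial discrimination, and KR-20 reliability from a hypothetical 0/1 response matrix; the data and all names are assumptions for demonstration only, not the authors' analysis.

import numpy as np

# Minimal sketch, assuming a hypothetical 0/1 response matrix with 741 examinees
# and 50 items (the counts reported in the abstract); not the authors' code.
rng = np.random.default_rng(0)
responses = rng.binomial(1, 0.6, size=(741, 50))     # simulated answers

total = responses.sum(axis=1)                         # total score per examinee
difficulty = responses.mean(axis=0)                   # proportion correct per item

# Point-biserial discrimination: correlation of each item with the total score.
discrimination = np.array(
    [np.corrcoef(responses[:, i], total)[0, 1] for i in range(responses.shape[1])]
)

# KR-20, a common reliability estimate for dichotomously scored items.
k = responses.shape[1]
item_variance = difficulty * (1 - difficulty)
kr20 = (k / (k - 1)) * (1 - item_variance.sum() / total.var(ddof=0))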
Results
A total of 50 case-based items were developed for the mock examination, and their content validity was evaluated. The items covered 34 discrete elements of integrated nursing knowledge. The mock examination was taken by 741 baccalaureate nursing students in their fourth year of study at 13 universities. Their average score on the mock examination was 57.4, and the examination showed a reliability of 0.40. According to classical test theory, the average item difficulty (percentage of correct answers) was 57.4% (80%–100% for 12 items, 60%–80% for 13 items, and less than 60% for 25 items). The mean discrimination index was 0.19; it was above 0.30 for 11 items and 0.20 to 0.29 for 15 items. According to item response theory, the item discrimination parameter of the logistic model was none for 10 items (0.00), very low for 20 items (0.01 to 0.34), low for 12 items (0.35 to 0.64), moderate for 6 items (0.65 to 1.34), high for 1 item (1.35 to 1.69), and very high for 1 item (above 1.70). The item difficulty parameter was very easy for 24 items (below −2.0), easy for 8 items (−2.0 to −0.5), medium for 6 items (−0.5 to 0.5), hard for 3 items (0.5 to 2.0), and very hard for 9 items (2.0 or above). A goodness-of-fit test based on the 2-parameter item response model, within the range of 0.5 to 2.0, revealed that 12 items had an ideal correct answer rate.
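For reference, the discrimination (a) and difficulty (b) parameters reported above come from a 2-parameter logistic (2PL) item response model; the short sketch below shows the form of that model with illustrative values, not estimates from the study.

import numpy as np

# 2PL model: probability that an examinee of ability theta answers correctly
# an item with discrimination a and difficulty b. Values here are illustrative.
def p_correct(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# A "moderate" discrimination item (a = 1.0) of "medium" difficulty (b = 0.0)
# gives an average-ability examinee (theta = 0.0) a 50% chance of success.
print(p_correct(0.0, 1.0, 0.0))    # 0.5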
Conclusion
We surmised that the low reliability of the mock examination was influenced by the timing of the test for the examinees and by the inappropriate difficulty of the items. Our study suggests a methodology for developing future case-based items for the Korean Nursing Licensing Examination.

Citations

Citations to this article, as recorded by Crossref:
  • Suggestion for item allocation to 8 nursing activity categories of the Korean Nursing Licensing Examination: a survey-based descriptive study
    Kyunghee Kim, So Young Kang, Younhee Kang, Youngran Kweon, Hyunjung Kim, Youngshin Song, Juyeon Cho, Mi-Young Choi, Hyun Su Lee
    Journal of Educational Evaluation for Health Professions. 2023;20:18.
  • Automating assessment of Design exams: A case study of novelty evaluation
    Nandita Bhanja Chaudhuri, Debayan Dhar, Pradeep G. Yammiyavar
    Expert Systems with Applications. 2021: 116108.
  • Levels, antecedents, and consequences of critical thinking among clinical nurses: a quantitative literature review
    Yongmi Lee, Younjae Oh
    Journal of Educational Evaluation for Health Professions. 2020;17:26.
  • Factors Influencing the Success of the National Nursing Competency Examination taken by the Nursing Diploma Students in Yogyakarta
    Yulia Wardani
    Jurnal Ners. 2020;14(2):172.
