
JEEHP : Journal of Educational Evaluation for Health Professions

Search results for "Educational evaluation": 2 articles
Review
How to execute Context, Input, Process, and Product evaluation model in medical health education  
So young Lee, Jwa-Seop Shin, Seung-Hee Lee
J Educ Eval Health Prof. 2019;16:40.   Published online December 28, 2019
DOI: https://doi.org/10.3352/jeehp.2019.16.40
  • 12,795 View
  • 552 Download
  • 7 Web of Science
  • 11 Crossref
Abstract
Improvements to education are needed to keep pace with today's educational requirements. The Context, Input, Process, and Product (CIPP) evaluation model was created to support decision-making for educational improvement, making it well suited to this purpose. However, applying the model in the actual context of medical health education is considered difficult. This study therefore surveyed the previous literature to establish an execution procedure showing how the CIPP model can be applied in practice. In this procedure, criteria and indicators are determined from analysis results, a data-collection method is set, and data are collected. The collected data are then analyzed for each CIPP element, and finally the relationships among the CIPP elements are analyzed to reach the final improvement decision. Following these steps, this study organized the methods employed in previous studies. In particular, determining the criteria and indicators was important and required significant effort: a content analysis of the surveyed literature identified the most widely used criteria, yielding a total of 12. The importance of criteria selection deserves additional emphasis when the CIPP model is actually applied. A diverse range of information can also be obtained through qualitative as well as quantitative methods. Above all, since the results of executing the CIPP evaluation model become the basis for further improved evaluations, making the first attempt without hesitation is essential.
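The execution procedure described in the abstract (determine criteria and indicators, collect data, analyze each CIPP element, then relate the elements for the improvement decision) can be sketched as a minimal data structure. This is an illustrative sketch only; the element names come from the CIPP model, but the criteria, indicators, and summary logic here are hypothetical, not the authors' instrument.

```python
# Minimal sketch of the CIPP execution procedure: each element carries its
# criteria and its collected indicator values, and a summary step relates
# the elements for the improvement decision. All example data are hypothetical.
from dataclasses import dataclass, field

@dataclass
class CIPPElement:
    name: str                                       # "Context", "Input", "Process", or "Product"
    criteria: list = field(default_factory=list)    # evaluation criteria chosen for this element
    indicators: dict = field(default_factory=dict)  # indicator -> collected result (True = met)

def evaluate(elements):
    """Analyze each element, then summarize across elements for decision-making."""
    summary = {}
    for e in elements:
        met = sum(1 for v in e.indicators.values() if v)
        summary[e.name] = f"{met}/{len(e.indicators)} indicators met"
    return summary

# Hypothetical example: two of the four elements, one indicator each
context = CIPPElement("Context", ["needs assessment"], {"needs survey completed": True})
product = CIPPElement("Product", ["learning outcomes"], {"exam pass rate target reached": False})
print(evaluate([context, product]))
# {'Context': '1/1 indicators met', 'Product': '0/1 indicators met'}
```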

Citations

Citations to this article as recorded by  
  • Evaluation of the Maryland Next Gen Test Bank Project: Implications and Recommendations
    Desirée Hensel, Diane M. Billings, Rebecca Wiseman
    Nursing Education Perspectives.2024;[Epub]     CrossRef
  • Development of a blended teaching quality evaluation scale (BTQES) for undergraduate nursing based on the Context, Input, Process and Product (CIPP) evaluation model: A cross-sectional survey
    Yue Zhao, Weijuan Li, Hong Jiang, Mohedesi Siyiti, Meng Zhao, Shuping You, Yinglan Li, Ping Yan
    Nurse Education in Practice.2024; : 103976.     CrossRef
  • Self-care educational guide for mothers with gestational diabetes mellitus: A systematic review on identifying self-care domains, approaches, and their effectiveness
    Zarina Haron, Rosnah Sutan, Roshaya Zakaria, Zaleha Abdullah Mahdy
    Belitung Nursing Journal.2023; 9(1): 6.     CrossRef
  • Evaluation of the Smart Indonesia Program as a Policy to Improve Equality in Education
    Patni Ninghardjanti, Wiedy Murtini, Aniek Hindrayani, Khresna B. Sangka
    Sustainability.2023; 15(6): 5114.     CrossRef
  • Exploring Perceptions of Competency-Based Medical Education in Undergraduate Medical Students and Faculty: A Program Evaluation
    Erica Ai Li, Claire A Wilson, Jacob Davidson, Aaron Kwong, Amrit Kirpalani, Peter Zhan Tao Wang
    Advances in Medical Education and Practice.2023; Volume 14: 381.     CrossRef
  • The Evaluation of China's Double Reduction Policy: A Case Study in Dongming County Mingde Primary School
    Danyang Li , Chaimongkhon Supromin, Supit Boonlab
    International Journal of Sociologies and Anthropologies Science Reviews.2023; 3(6): 437.     CrossRef
  • Exploring the Components of the Research Empowerment Program of the Faculty Members of Kermanshah University of Medical Sciences, Iran Based on the CIPP Model: A Qualitative Study
    Mostafa Jafari, Susan Laei, Elham Kavyani, Rostam Jalali
    Educational Research in Medical Sciences.2021;[Epub]     CrossRef
  • Adapting an Integrated Program Evaluation for Promoting Competency‐Based Medical Education
    Hyunjung Ju, Minkyung Oh, Jong-Tae Lee, Bo Young Yoon
    Korean Medical Education Review.2021; 23(1): 56.     CrossRef
  • Changes in the accreditation standards of medical schools by the Korean Institute of Medical Education and Evaluation from 2000 to 2019
    Hyo Hyun Yoo, Mi Kyung Kim, Yoo Sang Yoon, Keun Mi Lee, Jong Hun Lee, Seung-Jae Hong, Jung-Sik Huh, Won Kyun Park
    Journal of Educational Evaluation for Health Professions.2020; 17: 2.     CrossRef
  • Human Resources Development via Higher Education Scholarships: A Case Study of a Ministry of Public Works and Housing Scholarship Program
    Abdullatif Setiabudi, Muchlis R. Luddin, Yuli Rahmawati
    International e-Journal of Educational Studies.2020; 4(8): 209.     CrossRef
  • Exploring Components, Barriers, and Solutions for Faculty Members’ Research Empowerment Programs Based on the CIPP Model: A Qualitative Study
    Mostafa Jafari, Soosan Laei, Elham Kavyani, Rostam Jalali
    Journal of Occupational Health and Epidemiology.2020; 9(4): 213.     CrossRef
Original Article
Comparison of item analysis results of Korean Medical Licensing Examination according to classical test theory and item response theory
Eun Young Lim, Jang Hee Park, Il Kwon, Gue Lim Song, Sun Huh
J Educ Eval Health Prof. 2004;1(1):67-76.   Published online January 31, 2004
DOI: https://doi.org/10.3352/jeehp.2004.1.1.67
  • 30,591 View
  • 214 Download
  • 3 Crossref
Abstract
The results of the 64th and 65th Korean Medical Licensing Examination were analyzed according to both classical test theory and item response theory, to assess the feasibility of applying item response theory to item analysis and to suggest its applicability to computerized adaptive testing. Correlation coefficients of the difficulty index, discrimination index, and ability parameter between the two kinds of analysis were obtained using the computer programs Analyst 4.0, Bilog, and Xcalibre. Correlation coefficients of the difficulty index were 0.75 or higher; those of the discrimination index ranged from -0.023 to 0.753; and those of the ability parameter were 0.90 or higher. These results suggest that item analysis according to item response theory yields results comparable to those of classical test theory, except for the discrimination index. Since the ability parameter is most widely used in criterion-referenced testing, the high correlation between the ability parameter and the total score supports the validity of computerized adaptive testing based on item response theory.
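The classical-test-theory indices compared in this abstract can be illustrated with a short sketch: the difficulty index is the proportion of examinees answering an item correctly, and a common discrimination index is the point-biserial correlation between the 0/1 item score and the total test score. The response data below are hypothetical; the study's own analyses used Analyst 4.0, Bilog, and Xcalibre.

```python
# Minimal classical-test-theory item analysis on hypothetical 0/1 response data.
import math

def difficulty(item_scores):
    """Difficulty index (P-value): proportion of examinees answering correctly."""
    return sum(item_scores) / len(item_scores)

def point_biserial(item_scores, total_scores):
    """Discrimination index: correlation between 0/1 item score and total score."""
    n = len(item_scores)
    mi = sum(item_scores) / n
    mt = sum(total_scores) / n
    cov = sum((i - mi) * (t - mt) for i, t in zip(item_scores, total_scores)) / n
    si = math.sqrt(sum((i - mi) ** 2 for i in item_scores) / n)
    st = math.sqrt(sum((t - mt) ** 2 for t in total_scores) / n)
    return cov / (si * st)

# Hypothetical responses: rows = examinees, columns = items (1 = correct)
responses = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0]]
totals = [sum(r) for r in responses]
item1 = [r[0] for r in responses]
print(round(difficulty(item1), 2))             # 0.75
print(round(point_biserial(item1, totals), 2)) # 0.77
```

A high point-biserial value means examinees with high total scores tend to answer the item correctly, which is what the discrimination index captures under classical test theory.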

Citations

Citations to this article as recorded by  
  • Analysis on Validity and Academic Competency of Mock Test for Korean Medicine National Licensing Examination Using Item Response Theory
    Han Chae, Eunbyul Cho, SeonKyoung Kim, DaHye Choi, Seul Lee
    Keimyung Medical Journal.2023; 42(1): 7.     CrossRef
  • Item difficulty index, discrimination index, and reliability of the 26 health professions licensing examinations in 2022, Korea: a psychometric study
    Yoon Hee Kim, Bo Hyun Kim, Joonki Kim, Bokyoung Jung, Sangyoung Bae
    Journal of Educational Evaluation for Health Professions.2023; 20: 31.     CrossRef
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association.2012; 55(2): 124.     CrossRef
