JEEHP : Journal of Educational Evaluation for Health Professions

Search results for "Evaluation": 32 articles
Research article
Decline in attrition rates in United States pediatric residency and fellowship programs, 2007–2020: a repeated cross-sectional study  
Emma Omoruyi, Greg Russell, Kimberly Montez
J Educ Eval Health Prof. 2025;22:24.   Published online September 5, 2025
DOI: https://doi.org/10.3352/jeehp.2025.22.24
  • 1,452 View
  • 153 Download
Abstract
Purpose
Declining fill rates in US pediatric residency and subspecialty programs make trainee retention essential. Attrition, defined as transfers, withdrawals, dismissals, unsuccessful completions, or deaths, disrupts program function and weakens the pediatric workforce pipeline. This study aimed to evaluate attrition trends among pediatric residents and fellows in Accreditation Council for Graduate Medical Education (ACGME)-accredited programs from 2007 to 2020.
Methods
This repeated cross-sectional study analyzed publicly available ACGME Data Resource Book records. Attrition rates and 95% confidence intervals (CIs) were calculated overall and by subspecialty. Logistic regression assessed temporal changes; odds ratios (ORs) compared 2020 to 2007.
Results
From 2007 to 2020, pediatric residents increased from 8,145 to 9,419 and fellows from 2,875 to 4,279. Aggregate annual resident attrition averaged 1.71% (range, 0.93%–2.64%), and fellow attrition ranged from 12.39% to 30.87%. Transfer rates declined from 18.05 to 5.20 per 1,000 trainees (P<0.0001), withdrawals from 5.65 to 2.76 (P=0.030), and dismissals from 3.14 in 2010 to 1.27 in 2020 (P=0.0068). Odds of unsuccessful completion decreased significantly in categorical pediatrics (OR, 0.41; 95% CI, 0.29–0.58), pediatric cardiology (OR, 0.08; 95% CI, 0.01–0.64), and pediatric critical care (OR, 0.14; 95% CI, 0.06–0.35); neonatal-perinatal medicine showed a decrease that did not reach significance (OR, 0.46; 95% CI, 0.20–1.08).
Conclusion
Although attrition has improved, premature trainee loss can still disrupt program operations and threaten workforce development. Attrition may reflect educational environment quality, support structures, or selection processes. Greater data transparency is needed to understand demographic trends and inform equitable retention strategies, ultimately strengthening training programs and sustaining the United States pediatric workforce.
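The abstract above reports attrition rates with 95% confidence intervals and odds ratios comparing 2020 to 2007. The authors' exact estimation method is not given in the abstract, so the following is only an illustrative sketch of the two standard computations (a normal-approximation CI for a proportion, and a simple 2×2 odds ratio):

```python
from math import sqrt

def rate_with_ci(events, total, z=1.96):
    """Attrition proportion with a normal-approximation 95% CI."""
    p = events / total
    se = sqrt(p * (1 - p) / total)
    return p, p - z * se, p + z * se

def odds_ratio(events_a, total_a, events_b, total_b):
    """Odds ratio comparing group A (e.g., 2020) to group B (e.g., 2007)."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b
```

In the paper itself the ORs come from logistic regression, which reduces to this 2×2 form only in the unadjusted single-predictor case.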
Educational/Faculty development material
Radiotorax.es: a web-based tool for formative self-assessment in chest X-ray interpretation  
Verónica Illescas-Megías, Jorge Manuel Maqueda-Pérez, Dolores Domínguez-Pinos, Teodoro Rudolphi Solero, Francisco Sendra-Portero
J Educ Eval Health Prof. 2025;22:17.   Published online June 9, 2025
DOI: https://doi.org/10.3352/jeehp.2025.22.17
  • 4,493 View
  • 185 Download
Abstract
Radiotorax.es is a free, non-profit web-based tool designed to support formative self-assessment in chest X-ray interpretation. This article presents its structure, educational applications, and usage data from 11 years of continuous operation. Users complete interpretation rounds of 20 clinical cases, compare their reports with expert evaluations, and conduct a structured self-assessment. From 2011 to 2022, 14,389 users registered, and 7,726 completed at least one session. Most were medical students (75.8%), followed by residents (15.2%) and practicing physicians (9.0%). The platform has been integrated into undergraduate medical curricula and used in various educational contexts, including tutorials, peer and expert review, and longitudinal tracking. Its flexible design supports self-directed learning, instructor-guided use, and multicenter research. As a freely accessible resource based on real clinical cases, Radiotorax.es provides a scalable, realistic, and well-received training environment that promotes diagnostic skill development, reflection, and educational innovation in radiology education.
Research articles
Performance of large language models on Thailand’s national medical licensing examination: a cross-sectional study  
Prut Saowaprut, Romen Samuel Wabina, Junwei Yang, Lertboon Siriwat
J Educ Eval Health Prof. 2025;22:16.   Published online May 12, 2025
DOI: https://doi.org/10.3352/jeehp.2025.22.16
  • 5,593 View
  • 338 Download
  • 4 Web of Science
  • 3 Crossref
Abstract
Purpose
This study aimed to evaluate the feasibility of general-purpose large language models (LLMs) in addressing inequities in medical licensure exam preparation for Thailand’s National Medical Licensing Examination (ThaiNLE), which currently lacks standardized public study materials.
Methods
We assessed 4 multi-modal LLMs (GPT-4, Claude 3 Opus, Gemini 1.0/1.5 Pro) using a 304-question ThaiNLE Step 1 mock examination (10.2% image-based), applying deterministic API configurations and 5 inference repetitions per model. Performance was measured via micro- and macro-accuracy metrics compared against historical passing thresholds.
Results
All models exceeded passing scores, with GPT-4 achieving the highest accuracy (88.9%; 95% confidence interval, 88.7–89.1), surpassing Thailand’s national average by more than 2 standard deviations. Claude 3.5 Sonnet (80.1%) and Gemini 1.5 Pro (72.8%) followed in that order. Models were robust across 17 of 20 medical domains, but accuracy dropped in genetics (74.0%) and cardiovascular topics (58.3%). Although the models handled image-based items competently (Gemini 1.0 Pro: +9.9% vs. text), text-only accuracy remained superior (GPT-4o: 90.0% vs. 82.6%).
Conclusion
General-purpose LLMs show promise as equitable preparatory tools for ThaiNLE Step 1. However, domain-specific knowledge gaps and inconsistent multi-modal integration warrant refinement before clinical deployment.
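The study measures both micro- and macro-accuracy. Micro-accuracy pools all questions, while macro-accuracy averages per-domain accuracies, which is why weakness in a single domain (e.g., cardiovascular items) pulls macro-accuracy down disproportionately. A minimal sketch of the distinction (data structure is illustrative, not the authors' pipeline):

```python
from collections import defaultdict

def micro_macro_accuracy(records):
    """records: iterable of (domain, correct) pairs, correct being 1 or 0.
    Returns (micro, macro): pooled accuracy vs. mean of per-domain accuracies."""
    total = correct = 0
    per_domain = defaultdict(lambda: [0, 0])  # domain -> [correct, total]
    for domain, ok in records:
        total += 1
        correct += ok
        per_domain[domain][0] += ok
        per_domain[domain][1] += 1
    micro = correct / total
    macro = sum(c / t for c, t in per_domain.values()) / len(per_domain)
    return micro, macro
```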

Citations

Citations to this article as recorded by  
  • Performance of GPT-4o and o1-Pro on United Kingdom Medical Licensing Assessment-style items: a comparative study
    Behrad Vakili, Aadam Ahmad, Mahsa Zolfaghari
    Journal of Educational Evaluation for Health Professions.2025; 22: 30.     CrossRef
  • Large Language Models for the National Radiological Technologist Licensure Examination in Japan: Cross-Sectional Comparative Benchmarking and Evaluation of Model-Generated Items Study
    Toshimune Ito, Toru Ishibashi, Tatsuya Hayashi, Shinya Kojima, Kazumi Sogabe
    JMIR Medical Education.2025; 11: e81807.     CrossRef
  • Technologies, opportunities, challenges, and future directions for integrating generative artificial intelligence into medical education: a narrative review
    Junseok Kang, Jihyun Ahn
    Ewha Medical Journal.2025; 48(4): e53.     CrossRef
Empirical effect of the Dr LEE Jong-wook Fellowship Program to empower sustainable change for the health workforce in Tanzania: a mixed-methods study  
Masoud Dauda, Swabaha Aidarus Yusuph, Harouni Yasini, Issa Mmbaga, Perpetua Mwambinngu, Hansol Park, Gyeongbae Seo, Kyoung Kyun Oh
J Educ Eval Health Prof. 2025;22:6.   Published online January 20, 2025
DOI: https://doi.org/10.3352/jeehp.2025.22.6
  • 4,982 View
  • 341 Download
Abstract
Purpose
This study evaluated the Dr LEE Jong-wook Fellowship Program’s impact on Tanzania’s health workforce, focusing on relevance, effectiveness, efficiency, impact, and sustainability in addressing healthcare gaps.
Methods
A mixed-methods research design was employed. Data were collected from 97 out of 140 alumni through an online survey, 35 in-depth interviews, and one focus group discussion. The study was conducted from November to December 2023 and included alumni from 2009 to 2022. Measurement instruments included structured questionnaires for quantitative data and semi-structured guides for qualitative data. Quantitative analysis involved descriptive and inferential statistics (Spearman’s rank correlation, non-parametric tests) using Python ver. 3.11.0 and Stata ver. 14.0. Thematic analysis was employed to analyze qualitative data using NVivo ver. 12.0.
Results
Findings indicated high relevance (mean=91.6, standard deviation [SD]=8.6), effectiveness (mean=86.1, SD=11.2), efficiency (mean=82.7, SD=10.2), and impact (mean=87.7, SD=9.9), with improved skills, confidence, and institutional service quality. However, sustainability had a lower score (mean=58.0, SD=11.1), reflecting challenges in follow-up support and resource allocation. Effectiveness strongly correlated with impact (ρ=0.746, P<0.001). The qualitative findings revealed that participants valued tailored training but highlighted barriers, such as language challenges and insufficient practical components. Alumni-led initiatives contributed to knowledge sharing, but limited resources constrained sustainability.
Conclusion
The Fellowship Program enhanced Tanzania’s health workforce capacity, but it requires localized curricula and strengthened alumni networks for sustainability. These findings provide actionable insights for improving similar programs globally, confirming the hypothesis that tailored training positively influences workforce and institutional outcomes.
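The quantitative analysis above reports Spearman's rank correlation between effectiveness and impact (ρ=0.746). In practice this would be `scipy.stats.spearmanr`; the dependency-free sketch below shows the underlying computation (Pearson correlation of average ranks, with ties handled):

```python
def rankdata(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```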
Reliability and construct validation of the Blended Learning Usability Evaluation–Questionnaire with interprofessional clinicians in Canada: a methodological study  
Anish Kumar Arora, Jeff Myers, Tavis Apramian, Kulamakan Kulasegaram, Daryl Bainbridge, Hsien Seow
J Educ Eval Health Prof. 2025;22:5.   Published online January 16, 2025
DOI: https://doi.org/10.3352/jeehp.2025.22.5
  • 4,879 View
  • 330 Download
  • 1 Web of Science
  • 3 Crossref
Abstract
Purpose
This study aimed to generate Cronbach’s alpha reliability estimates and further mixed-methods construct validity evidence for the Blended Learning Usability Evaluation–Questionnaire (BLUE-Q).
Methods
Forty interprofessional clinicians completed the BLUE-Q after finishing a 3-month blended learning professional development program in Ontario, Canada. Reliability was assessed with Cronbach’s α for each of the 3 sections of the BLUE-Q and for all quantitative items together. Construct validity was evaluated through the framework of Grand-Guillaume-Perrenoud et al., which consists of 3 elements: congruence, convergence, and credibility. To compare quantitative and qualitative results, descriptive statistics, including means and standard deviations for each Likert-scale item of the BLUE-Q, were calculated.
Results
Cronbach’s α was 0.95 for the pedagogical usability section, 0.85 for the synchronous modality section, 0.93 for the asynchronous modality section, and 0.96 for all quantitative items together. Mean ratings (with standard deviations) were 4.77 (0.506) for pedagogy, 4.64 (0.654) for synchronous learning, and 4.75 (0.536) for asynchronous learning. Of the 239 qualitative comments received, 178 were identified as substantive, of which 88% were considered congruent and 79% were considered convergent with the high means. Among all congruent responses, 69% were considered confirming statements and 31% were considered clarifying statements, suggesting appropriate credibility. Analysis of the clarifying statements assisted in identifying 5 categories of suggestions for program improvement.
Conclusion
The BLUE-Q demonstrates high reliability and appropriate construct validity in the context of a blended learning program with interprofessional clinicians, making it a valuable tool for comprehensive program evaluation, quality improvement, and evaluative research in health professions education.
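Cronbach's α, reported above per BLUE-Q section, is computed from item-level scores as k/(k−1)·(1 − Σ item variances / variance of total scores). A minimal sketch (population variances are used; sample variances give the same α when applied consistently, since the scaling cancels in the ratio):

```python
def cronbach_alpha(items):
    """items: list of per-item score lists, each of length n_respondents.
    Returns Cronbach's alpha for the item set."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(item[j] for item in items) for j in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))
```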

Citations

Citations to this article as recorded by  
  • Multi-methods development and validation of a tool for use in measuring serious illness communication competence: Assessment of clinical encounters – Communication tool (ACE-CT)
    Anish K. Arora, Hsien Seow, Daryl Bainbridge, Kulamakan Kulasegaram, Tavis Apramian, Nadia Incardona, Leah Steinberg, Justin Sanders, Zhimeng Jia, Oren Levine, Jessica Simon, Karen Zhang, Zelda Freitas, Clare Fuller, Amanda Lee Roze des Ordons, Jill Dombr
    Patient Education and Counseling.2026; 144: 109465.     CrossRef
  • Utilizing cognitive interview in the item refinement of the Blended Teaching Assessment Tool (BTAT) for Health Professions Education
    Maria Teresita B. Dalusong, Glenda Sanggalang Ogerio, Valentin C. Dones, Maria Elizabeth M. Grageda
    Philippine Journal of Health Research and Development.2025; 29(2): 54.     CrossRef
  • All providers Better Communication Skills (ABCs) program: protocol for a randomized controlled trial assessing communication training effectiveness with interprofessional clinicians
    Hsien Seow, Anish K. Arora, Daryl Bainbridge, Zhimeng Jia, Leah Steinberg, Nadia Incardona, Oren Levine, Justin J. Sanders, Jessica Simon, Amanda Roze des Ordons, Karen Zhang, Jeff Myers
    BMC Palliative Care.2025;[Epub]     CrossRef
Validation of the Blended Learning Usability Evaluation–Questionnaire (BLUE-Q) through an innovative Bayesian questionnaire validation approach  
Anish Kumar Arora, Charo Rodriguez, Tamara Carver, Hao Zhang, Tibor Schuster
J Educ Eval Health Prof. 2024;21:31.   Published online November 7, 2024
DOI: https://doi.org/10.3352/jeehp.2024.21.31
  • 3,328 View
  • 234 Download
  • 1 Web of Science
  • 3 Crossref
Abstract
Purpose
The primary aim of this study is to validate the Blended Learning Usability Evaluation–Questionnaire (BLUE-Q) for use in the field of health professions education through a Bayesian approach. As guidance on Bayesian questionnaire validation remains scarce, a secondary aim of this article is to serve as a simplified tutorial for engaging in such validation practices in health professions education.
Methods
A total of 10 health education-based experts in blended learning were recruited to participate in a 30-minute interviewer-administered survey. On a 5-point Likert scale, experts rated how well they perceived each item of the BLUE-Q to reflect its underlying usability domain (i.e., effectiveness, efficiency, satisfaction, accessibility, organization, and learner experience). Ratings were descriptively analyzed and converted into beta prior distributions. Participants were also given the option to provide qualitative comments for each item.
Results
After reviewing the computed expert prior distributions, 31 quantitative items were identified as having a probability of “low endorsement” and were thus removed from the questionnaire. Additionally, qualitative comments were used to revise the phrasing and order of items to ensure clarity and logical flow. The BLUE-Q’s final version comprises 23 Likert-scale items and 6 open-ended items.
Conclusion
Questionnaire validation can generally be a complex, time-consuming, and costly process, inhibiting many from engaging in proper validation practices. In this study, we demonstrate that a Bayesian questionnaire validation approach can be a simple, resource-efficient, yet rigorous solution to validating a tool for content and item-domain correlation through the elicitation of domain expert endorsement ratings.
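The Bayesian step above converts expert Likert ratings into beta prior distributions and removes items with a high probability of "low endorsement." The abstract does not specify the exact mapping, so the following is one plausible sketch: ratings ≥4 are dichotomized as endorsements, a uniform Beta(1,1) base prior is updated with the counts, and an item is flagged when its prior mean falls below a threshold (all of these choices are assumptions for illustration):

```python
def beta_prior_from_ratings(ratings, endorse_at=4):
    """Map 5-point Likert ratings to a Beta(a, b) prior.
    Ratings >= endorse_at count as endorsements (assumed dichotomization)."""
    successes = sum(r >= endorse_at for r in ratings)
    failures = len(ratings) - successes
    return 1 + successes, 1 + failures  # update a Beta(1,1) base prior

def low_endorsement(a, b, threshold=0.5):
    """Flag an item whose prior mean endorsement is below the threshold."""
    return a / (a + b) < threshold
```

A fuller treatment would integrate the Beta(a, b) density below the threshold (e.g., with `scipy.stats.beta.cdf`) rather than comparing only the mean.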

Citations

Citations to this article as recorded by  
  • Reliability and construct validation of the Blended Learning Usability Evaluation–Questionnaire with interprofessional clinicians in Canada: a methodological study
    Anish Kumar Arora, Jeff Myers, Tavis Apramian, Kulamakan Kulasegaram, Daryl Bainbridge, Hsien Seow
    Journal of Educational Evaluation for Health Professions.2025; 22: 5.     CrossRef
  • Utilizing cognitive interview in the item refinement of the Blended Teaching Assessment Tool (BTAT) for Health Professions Education
    Maria Teresita B. Dalusong, Glenda Sanggalang Ogerio, Valentin C. Dones, Maria Elizabeth M. Grageda
    Philippine Journal of Health Research and Development.2025; 29(2): 54.     CrossRef
  • All providers Better Communication Skills (ABCs) program: protocol for a randomized controlled trial assessing communication training effectiveness with interprofessional clinicians
    Hsien Seow, Anish K. Arora, Daryl Bainbridge, Zhimeng Jia, Leah Steinberg, Nadia Incardona, Oren Levine, Justin J. Sanders, Jessica Simon, Amanda Roze des Ordons, Karen Zhang, Jeff Myers
    BMC Palliative Care.2025;[Epub]     CrossRef
A new performance evaluation indicator for the LEE Jong-wook Fellowship Program of Korea Foundation for International Healthcare to better assess its long-term educational impacts: a Delphi study  
Minkyung Oh, Bo Young Yoon
J Educ Eval Health Prof. 2024;21:27.   Published online October 2, 2024
DOI: https://doi.org/10.3352/jeehp.2024.21.27
  • 3,493 View
  • 287 Download
  • 1 Web of Science
  • 1 Crossref
Abstract
Purpose
The Dr. LEE Jong-wook Fellowship Program, established by the Korea Foundation for International Healthcare (KOFIH), aims to strengthen healthcare capacity in partner countries. The aim of the study was to develop new performance evaluation indicators for the program to better assess long-term educational impact across various courses and professional roles.
Methods
A 3-stage process was employed. First, a literature review of established evaluation models (Kirkpatrick’s 4 levels, context/input/process/product evaluation model, Organization for Economic Cooperation and Development Assistance Committee criteria) was conducted to devise evaluation criteria. Second, these criteria were validated via a 2-round Delphi survey with 18 experts in training projects from May 2021 to June 2021. Third, the relative importance of the evaluation criteria was determined using the analytic hierarchy process (AHP), calculating weights and ensuring consistency through the consistency index and consistency ratio (CR), with CR values below 0.1 indicating acceptable consistency.
Results
The literature review led to a combined evaluation model, resulting in 4 evaluation areas, 20 items, and 92 indicators. The Delphi surveys confirmed the validity of these indicators, with content validity ratio values exceeding 0.444. The AHP analysis assigned weights to each indicator, and CR values below 0.1 indicated consistency. The final set of evaluation indicators was confirmed through a workshop with KOFIH and adopted as the new evaluation tool.
Conclusion
The developed evaluation framework provides a comprehensive tool for assessing the long-term outcomes of the Dr. LEE Jong-wook Fellowship Program. It enhances evaluation capabilities and supports improvements in the training program’s effectiveness and international healthcare collaboration.
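The AHP step described above derives indicator weights from pairwise-comparison matrices and accepts them when the consistency ratio CR = CI/RI stays below 0.1, with CI = (λmax − n)/(n − 1). A sketch using the common column-normalization approximation for the weights and Saaty's published random-index values (the study's actual matrices and software are not given, so this is illustrative):

```python
def ahp_weights_cr(matrix):
    """Pairwise-comparison matrix -> (weights, consistency ratio).
    Weights: column-normalized row averages; lambda_max approximated
    from A.w; RI from Saaty's table (defined here for n <= 7)."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    norm = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    w = [sum(norm[i]) / n for i in range(n)]
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    return w, (ci / ri if ri else 0.0)
```

A perfectly consistent matrix (every entry a_ij = w_i/w_j) yields λmax = n and CR = 0.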

Citations

Citations to this article as recorded by  
  • Halted medical education and medical residents’ training in Korea, journal metrics, and appreciation to reviewers and volunteers
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2025; 22: 1.     CrossRef
Challenges and potential improvements in the Accreditation Standards of the Korean Institute of Medical Education and Evaluation 2019 (ASK2019) derived through meta-evaluation: a cross-sectional study  
Yoonjung Lee, Min-jung Lee, Junmoo Ahn, Chungwon Ha, Ye Ji Kang, Cheol Woong Jung, Dong-Mi Yoo, Jihye Yu, Seung-Hee Lee
J Educ Eval Health Prof. 2024;21:8.   Published online April 2, 2024
DOI: https://doi.org/10.3352/jeehp.2024.21.8
  • 4,860 View
  • 349 Download
  • 2 Web of Science
  • 3 Crossref
Abstract
Purpose
This study aimed to identify challenges and potential improvements in Korea's medical education accreditation process according to the Accreditation Standards of the Korean Institute of Medical Education and Evaluation 2019 (ASK2019). Meta-evaluation was conducted to survey the experiences and perceptions of stakeholders, including self-assessment committee members, site visit committee members, administrative staff, and medical school professors.
Methods
A cross-sectional study was conducted using surveys sent to 40 medical schools. The 332 participants included self-assessment committee members, site visit team members, administrative staff, and medical school professors. The t-test, one-way analysis of variance, and the chi-square test were used to analyze and compare opinions on medical education accreditation between the categories of participants.
Results
Site visit committee members placed greater importance on the necessity of accreditation than faculty members. A shared positive view on accreditation’s role in improving educational quality was seen among self-evaluation committee members and professors. Administrative staff highly regarded the Korean Institute of Medical Education and Evaluation’s reliability and objectivity, unlike the self-evaluation committee members. Site visit committee members positively perceived the clarity of accreditation standards, differing from self-assessment committee members. Administrative staff were most optimistic about implementing standards. However, the accreditation process encountered challenges, especially in duplicating content and preparing self-evaluation reports. Finally, perceptions regarding the accuracy of final site visit reports varied significantly between the self-evaluation committee members and the site visit committee members.
Conclusion
This study revealed diverse views on medical education accreditation, highlighting the need for improved communication, expectation alignment, and stakeholder collaboration to refine the accreditation process and quality.

Citations

Citations to this article as recorded by  
  • Beyond intentions: a critical narrative review of accreditation’s planned and emergent impact on medical schools
    Do-Hwan Kim, Roghayeh Gandomkar, Hyo Hyun Yoo, David Rojas
    Advances in Health Sciences Education.2026;[Epub]     CrossRef
  • Evaluation of a Vietnamese medical school using Korean medical school accreditation standards
    Bo-Young Yoon, Yon-Chul Park, Keunmi Lee, Hee-Je Lee, Jung-Sook Ha, Seung-Jae Hong, Nguyen Hoang Minh, Jung-Sik Huh
    Journal of Medicine and Life Science.2026; 23(1): 24.     CrossRef
  • The new placement of 2,000 entrants at Korean medical schools in 2025: is the government’s policy evidence-based?
    Sun Huh
    The Ewha Medical Journal.2024;[Epub]     CrossRef
Effect of an interprofessional simulation program on patient safety competencies of healthcare professionals in Switzerland: a before and after study  
Sylvain Boloré, Thomas Fassier, Nicolas Guirimand
J Educ Eval Health Prof. 2023;20:25.   Published online August 28, 2023
DOI: https://doi.org/10.3352/jeehp.2023.20.25
  • 5,987 View
  • 325 Download
  • 4 Web of Science
  • 7 Crossref
Abstract
Purpose
This study aimed to identify the effects of a 12-week interprofessional simulation program, operated between February 2020 and January 2021, on the patient safety competencies of healthcare professionals in Switzerland.
Methods
The simulation training was based on 2 scenarios of hospitalized patients with septic shock and respiratory failure, and trainees were expected to demonstrate patient safety competencies. A single-group before-and-after study of the simulation program was conducted using the Health Professional Education in Patient Safety Survey to measure the perceived competencies of physicians, nurses, and nursing assistants. Of 57 participants, 37 answered the questionnaire 4 times: 48 hours before the training, then 24 hours, 6 weeks, and 12 weeks after the training. A linear mixed-effects model was applied for the analysis.
Results
Four of 6 perceived patient safety competencies improved at 6 weeks but returned to near pre-training levels at 12 weeks. Improvements in “communicating effectively,” “managing safety risks,” “understanding human and environmental factors that influence patient safety,” and “recognizing and responding to remove immediate risks of harm” were statistically significant both overall and between baseline and 6 weeks after the training.
Conclusion
Interprofessional simulation programs contributed to developing some areas of patient safety competencies of healthcare professionals, but only for a limited time. Interprofessional simulation programs should be repeated and combined with other forms of support, including case discussions and debriefings, to ensure lasting effects.

Citations

Citations to this article as recorded by  
  • Sustaining emergency team competencies: Impact of interprofessional simulation with team resource management and structured debriefing
    Yihsuan Tsai, Chunwen Chiu, Shouchuan Sun, Hsingju Lin, Chihhao Lin, Yawen Lee
    Clinical Simulation in Nursing.2026; 110: 101880.     CrossRef
  • Interprofessional education in healthcare settings: are healthcare professionals translating learning into practice? An integrated mixed methods systematic review
    Rebecca Field, Claire Palermo, Jane Kellett, Thomas Bevitt, Krishna Lambert, Rachel Bacon
    Journal of Interprofessional Care.2026; : 1.     CrossRef
  • Expressive pragmatic language in mood and psychotic disorders: a systematic review and meta-analysis
    Fiona Meister, Martin Sellier Silva, Gleb Melshin, Chaimaa El Mouslih, Farida Zaher, Roozbeh Sattari, Hsi T. Wei, Neyra Mekideche, Valentina Bambini, Alban Voppel, Lena Palaniyappan
    Schizophrenia.2026;[Epub]     CrossRef
  • Midwifery Students’ and Obstetricians’ Perception of Training in Non-Technical Skills
    Coralie Fregonese, Paul Guerby, Gilles Vallade, Régis Fuzier
    Simulation & Gaming.2025; 56(4): 374.     CrossRef
  • Cenário de simulação realística sobre respostas não punitivas aos erros no Centro Cirúrgico
    Francyne Sequeira Lopes Martins, Dionisia Oliveira de Oliveira, Cassiana Gil Prates, Kaihara Freitas Furtado, Rita Catalina Aquino Caregnato
    Revista Gaúcha de Enfermagem.2025;[Epub]     CrossRef
  • Realistic simulation scenario on non-punitive responses to errors in the Surgical Center
    Francyne Sequeira Lopes Martins, Dionisia Oliveira de Oliveira, Cassiana Gil Prates, Kaihara Freitas Furtado, Rita Catalina Aquino Caregnato
    Revista Gaúcha de Enfermagem.2025;[Epub]     CrossRef
  • Interprofessional education interventions for healthcare professionals to improve patient safety: a scoping review
    Yan Jiang, Yan Cai, Xue Zhang, Cong Wang
    Medical Education Online.2024;[Epub]     CrossRef
Case report
Successful pilot application of multi-attribute utility analysis concepts in evaluating academic-clinical partnerships in the United States: a case report  
Sara Elizabeth North, Amanda Nicole Sharp
J Educ Eval Health Prof. 2022;19:18.   Published online August 19, 2022
DOI: https://doi.org/10.3352/jeehp.2022.19.18
  • 3,881 View
  • 162 Download
  • 2 Web of Science
  • 3 Crossref
Abstract
Strong partnerships between academic health professions programs and clinical practice settings, termed academic-clinical partnerships, are essential in providing quality clinical training experiences. However, the literature does not operationalize a model by which an academic program may identify priority attributes and evaluate its partnerships. This study aimed to develop a values-based academic-clinical partnership evaluation approach, rooted in methodologies from the field of evaluation and implemented in the context of an academic Doctor of Physical Therapy clinical education program. The authors developed a semi-quantitative evaluation approach incorporating concepts from multi-attribute utility analysis (MAUA) that enabled consistent, values-based partnership evaluation. Data-informed actions led to improved overall partnership effectiveness. Pilot outcomes support the feasibility and desirability of moving toward MAUA as a potential methodological framework. Further research may lead to the development of a standardized process for any academic health profession program to perform a values-based evaluation of their academic-clinical partnerships to guide decision-making.
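Multi-attribute utility analysis of the kind piloted above scores each partnership as a weighted sum of attribute ratings, with weights reflecting the program's priority values. A minimal sketch (attribute names, scales, and weights are illustrative, not those used by the authors):

```python
def maua_rank(partnerships, weights):
    """partnerships: name -> {attribute: rating on a common scale};
    weights: attribute -> importance (should sum to 1).
    Returns (utility, name) pairs, highest utility first."""
    def utility(ratings):
        return sum(weights[a] * ratings[a] for a in weights)
    return sorted(((utility(r), name) for name, r in partnerships.items()),
                  reverse=True)
```

Ranking partnerships this way makes the value trade-offs explicit: a site strong on a heavily weighted attribute can outrank one with higher raw ratings elsewhere.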

Citations

Citations to this article as recorded by  
  • Advancing Value-Based Academic–Clinical Partnership Evaluation in Physical Therapy Education: Multiattribute Utility Analysis as a Contextual Methodological Approach
    Sara North
    Journal of Physical Therapy Education.2025; 39(2): 186.     CrossRef
  • Application of Multi-Attribute Utility Analysis as a Methodological Framework in Academic–Clinical Partnership Evaluation
    Sara E. North
    American Journal of Evaluation.2024; 45(4): 562.     CrossRef
  • Multi-attribute monitoring applications in biopharmaceutical analysis
    Anurag S. Rathore, Deepika Sarin, Sanghati Bhattacharya, Sunil Kumar
    Journal of Chromatography Open.2024; 6: 100166.     CrossRef
Reviews
Is accreditation in medical education in Korea an opportunity or a burden?  
Hanna Jung, Woo Taek Jeon, Shinki An
J Educ Eval Health Prof. 2020;17:31.   Published online October 21, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.31
  • 9,487 View
  • 176 Download
  • 13 Web of Science
  • 17 Crossref
Abstract
The accreditation process is both an opportunity and a burden for medical schools in Korea. The line that separates the two is based on how medical schools recognize and utilize the accreditation process. In other words, accreditation is a burden for medical schools if they view the accreditation process as merely a formal procedure or a means to maintain accreditation status for medical education. However, if medical schools acknowledge the positive value of the accreditation process, accreditation can be both an opportunity and a tool for developing medical education. The accreditation process has educational value by catalyzing improvements in the quality, equity, and efficiency of medical education and by increasing the available options. For the accreditation process to contribute to medical education development, accrediting agencies and medical schools must first be recognized as partners of an educational alliance working together towards common goals. Secondly, clear guidelines on accreditation standards should be periodically reviewed and shared. Finally, a formative self-evaluation process must be introduced for institutions to utilize the accreditation process as an opportunity to develop medical education. This evaluation system could be developed through collaboration among medical schools, academic societies for medical education, and the accrediting authority.

Citations

Citations to this article as recorded by  
  • Curriculum development as a governance process: insights from a new PharmD program
    Geneviève Gauthier, Françoise Moreau-Johnson, Christine Landry, Tim Dubé, Claire Touchie
    Advances in Health Sciences Education.2026;[Epub]     CrossRef
  • Equity in Basic Medical Education accreditation standards: a scoping review protocol
    Neelofar Shaheen, Usman Mahboob, Ahsan Sethi, Muhammad Irfan
    BMJ Open.2025; 15(1): e086661.     CrossRef
  • Making Medical Education Socially Accountable in Australia and Southeast Asia: A Systematic Review
    Jyotsna Rimal, Ashish Shrestha, Elizabeth Cardell, Stephen Billett, Alfred King-yin Lam
    Medical Science Educator.2025; 35(3): 1767.     CrossRef
  • Impacts of the Accreditation Process for Undergraduate Medical Schools: A Scoping Review
    Leticia C. Girotto, Karynne B. Machado, Roberta F. C. Moreira, Milton A. Martins, Patrícia Z. Tempski
    The Clinical Teacher.2025;[Epub]     CrossRef
  • Benefits, Challenges, and Opportunities of LCME Accreditation: A National Survey
    Colleen Hayden, David Muller, Allan R. Tunkel, Reena Karani, Jennifer Christner, Abbas Hyderi, Robert Fallar
    Medical Science Educator.2025; 35(5): 2533.     CrossRef
  • Impact of accreditation on medical education quality improvement in 82 medical schools in Japan: a descriptive study
    Nobuo Nara
    Journal of Educational Evaluation for Health Professions.2025; 22: 22.     CrossRef
  • Two decades of accreditation in Chilean medical education: outcomes and lessons learned
    Oscar Jerez Yañez, Carlos Schade Carter, Miguel Altamirano Rivas, Bárbara Carrasco García
    BMC Medical Education.2025;[Epub]     CrossRef
  • Global research agenda for medical education regulation: findings from a nominal group consensus exercise
    Valdes Roberto Bollela, Vanessa Burch, Kadambari Dharanipragada, Janneke Frambach, Janet Grant, Lois Haruna-Cooper, Homa Kabiri, James Kelly, Maria-Athina Martimianakis, Fernando Menezes da Silva, Lamiaa Mohsen, John-george Nicholson, Mohammed Ahmed Rashi
    BMJ Global Health.2025; 10(12): e016014.     CrossRef
  • To prove or improve? Examining how paradoxical tensions shape evaluation practices in accreditation contexts
    Betty Onyura, Abigail J. Fisher, Qian Wu, Shrutikaa Rajkumar, Sarick Chapagain, Judith Nassuna, David Rojas, Latika Nirula
    Medical Education.2024; 58(3): 354.     CrossRef
  • ASPIRE for excellence in curriculum development
    John Jenkins, Sharon Peters, Peter McCrorie
    Medical Teacher.2024; 46(5): 633.     CrossRef
  • Challenges and potential improvements in the Accreditation Standards of the Korean Institute of Medical Education and Evaluation 2019 (ASK2019) derived through meta-evaluation: a cross-sectional study
    Yoonjung Lee, Min-jung Lee, Junmoo Ahn, Chungwon Ha, Ye Ji Kang, Cheol Woong Jung, Dong-Mi Yoo, Jihye Yu, Seung-Hee Lee
    Journal of Educational Evaluation for Health Professions.2024; 21: 8.     CrossRef
  • Accreditation standards items of post-2nd cycle related to the decision of accreditation of medical schools by the Korean Institute of Medical Education and Evaluation
    Kwi Hwa Park, Geon Ho Lee, Su Jin Chae, Seong Yong Kim
    Korean Journal of Medical Education.2023; 35(1): 1.     CrossRef
  • The Need for the Standards for Anatomy Labs in Medical School Evaluation and Accreditation
    Yu-Ran Heo, Jae-Ho Lee
    Anatomy & Biological Anthropology.2023; 36(3): 81.     CrossRef
  • Seal of Approval or Ticket to Triumph? The Impact of Accreditation on Medical Student Performance in Foreign Medical Council Examinations
    Saurabh RamBihariLal Shrivastava, Titi Savitri Prihatiningsih, Kresna Lintang Pratidina
    Indian Journal of Medical Specialities.2023; 14(4): 249.     CrossRef
  • Internal evaluation in the faculties affiliated to zanjan university of medical sciences: Quality assurance of medical science education based on institutional accreditation
    Alireza Abdanipour, Farhad Ramezani‐Badr, Ali Norouzi, Mehdi Ghaemi
    Journal of Medical Education Development.2022; 15(46): 61.     CrossRef
  • Development of Mission and Vision of College of Korean Medicine Using the Delphi Techniques and Big-Data Analysis
    Sanghee Yeo, Seong Hun Choi, Su Jin Chae
    Journal of Korean Medicine.2021; 42(4): 176.     CrossRef
  • Special reviews on the history and future of the Korean Institute of Medical Education and Evaluation to memorialize its collaboration with the Korea Health Personnel Licensing Examination Institute to designate JEEHP as a co-official journal
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2020; 17: 33.     CrossRef
How to execute Context, Input, Process, and Product evaluation model in medical health education  
So young Lee, Jwa-Seop Shin, Seung-Hee Lee
J Educ Eval Health Prof. 2019;16:40.   Published online December 28, 2019
DOI: https://doi.org/10.3352/jeehp.2019.16.40
  • 21,229 View
  • 904 Download
  • 22 Web of Science
  • 29 Crossref
AbstractAbstract PDFSupplementary Material
Improvements to education are necessary to keep pace with today's educational requirements. The Context, Input, Process, and Product (CIPP) evaluation model was created to support decision-making for educational improvement, making it appropriate for this purpose. However, applying the model in the actual context of medical health education is considered difficult. Thus, this study surveyed previous literature to examine an execution procedure for how the CIPP model can actually be applied. In this procedure, criteria and indicators are first determined from the analysis results; material is then collected after a collection method has been set; the collected material is analyzed for each CIPP element; and finally, the relationships among the CIPP elements are analyzed to reach the final improvement decision. In this study, these steps were followed and the methods employed in previous studies were organized. In particular, the process of determining the criteria and indicators was important and required significant effort. A literature survey with content analysis identified the most widely used criteria, yielding a total of 12. The importance of criteria selection for the actual application of the CIPP model deserves additional emphasis. A diverse range of information can also be obtained through qualitative as well as quantitative methods. Above all, since the results of executing the CIPP evaluation model become the basis for further improved evaluations, making a first attempt without hesitation is essential.
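The execution procedure described above — set criteria and indicators per CIPP element, collect material, analyze each element, then relate the elements for the improvement decision — can be sketched in a few lines of Python. All criteria names, ratings, and the threshold below are hypothetical illustrations, not data from the study.

```python
# Minimal sketch of the CIPP execution procedure:
# (1) set criteria/indicators per element, (2) collect material,
# (3) analyze each element, (4) relate elements for a decision.
# Every criterion name and score here is a hypothetical example.

from statistics import mean

# Step 1: criteria and indicators chosen for each CIPP element
criteria = {
    "Context": ["needs assessment", "goal relevance"],
    "Input": ["curriculum plan", "resources"],
    "Process": ["implementation fidelity"],
    "Product": ["learning outcomes"],
}

# Step 2: collected material (e.g., survey ratings on a 5-point scale)
collected = {
    "Context": {"needs assessment": [4, 5, 4], "goal relevance": [3, 4, 4]},
    "Input": {"curriculum plan": [3, 3, 4], "resources": [2, 3, 3]},
    "Process": {"implementation fidelity": [4, 4, 5]},
    "Product": {"learning outcomes": [3, 4, 3]},
}

# Step 3: analyze the material for each element
element_scores = {
    element: mean(score for ratings in indicators.values() for score in ratings)
    for element, indicators in collected.items()
}

# Step 4: relate the elements to reach an improvement decision,
# flagging any element whose mean falls below a chosen threshold
THRESHOLD = 3.5
needs_improvement = [e for e, s in element_scores.items() if s < THRESHOLD]

print(element_scores)
print("Improve:", needs_improvement)
```

A real evaluation would weigh qualitative material alongside such scores, but the flagged elements illustrate how the per-element analyses feed the final improvement decision.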

Citations

Citations to this article as recorded by  
  • Beyond Surveys: Unveiling the Impact of Focus Group Interviews in Medical Education Evaluation
    Selçuk Akturan, Ayşenur Duman Dilbaz, Yasemin Güner, Melek Üçüncüoğlu, Neşe Kaklıkkaya
    Genel Tıp Dergisi.2026;[Epub]     CrossRef
  • The Correctional Nursing Workforce Crisis: An Innovative Solution to Meet the Challenge
    Deborah Shelton, Lori E. Roscoe, Theresa A. Kapetanovic, Sue Smith
    Journal of Correctional Health Care.2025;[Epub]     CrossRef
  • A Novel Implementation of the CIPP Model in Undergraduate Medical Education: The Karadeniz Technical University Faculty of Medicine Experience
    Selçuk Akturan
    Tıp Eğitimi Dünyası.2025; 24(72): 5.     CrossRef
  • The structural effects of evaluation types in the implementation process of the independent learning program in higher education
    Bambang Budi Wiyono, Aan Komariah, Hendra Hidayat, Desi Eri Kusumaningrum
    Discover Sustainability.2025;[Epub]     CrossRef
  • Implementation of the CIPP Model for Assessing the Elements of Archive Management Based on The Independent Curriculum
    Ananda Destya Selviana, Ismiyati, Mar'atus Sholikah
    Jurnal Pendidikan dan Pengajaran.2025; 58(1): 172.     CrossRef
  • Enhancing Health Policy Administration in LMICs: Dr. LJW Fellowship Program Insights (2021–2023)
    Bomgyeol Kim, Yejin Kim, Jun Su Park, Soo Hyeok Choi, Su Hyun Kim, Vasuki Rajaguru, Hyejin Jung, Tae Hyun Kim
    Annals of Global Health.2025; 91(1): 34.     CrossRef
  • Evaluation of Self-Directed Learning Activities at King Abdulaziz University: A Qualitative Study of Faculty Perceptions
    Ammar A Balkheyour, Michal Tombs
    Cureus.2025;[Epub]     CrossRef
  • Educational and research utility of the registrar clinical encounters in training (ReCEnT) project: an exploration of mechanisms using the context, input, process and product (CIPP) framework
    Michael Tran, Susan Wearne, Andrew Davey, Parker Magin
    Family Medicine and Community Health.2025; 13(3): e003289.     CrossRef
  • Raising expectations and defining boundaries in quality management scholarship: Strengthening the design, methodology, and reporting of quality-focused work
    Paul Simpson, Walter Tavares
    Paramedicine.2025; 22(5): 221.     CrossRef
  • Application of multidisciplinary collaborative education in general clinical training based on the CIPP model: a comparative study
    Xia Li, Qinggan Ni, Jing Zhou, Shanshan Pang, Yingling Zhu, Wenchun Song
    Critical Public Health.2025;[Epub]     CrossRef
  • A structured, mentored experiential clinical research programme for undergraduate students
    MINAKSHI PARIKH, PURVI BHAGAT, AMITA HIMANSHU SUTARIA, SAMYAK BHARWAD, HARIOM VAJA, AHAN BANKER, KHUSHI SHAH, HANSA MOTIGIRI GOSWAMI
    The National Medical Journal of India.2025; 0: 1.     CrossRef
  • A satisfaction-focused CIPP evaluation of Mongolia’s undergraduate occupational therapy program: a cross-sectional study
    Bulganchimeg Sanjmyatav, Erdenetsetseg Myagmar, Karen P. Y. Liu, Janet Lok Chun Lee, Munkh-Erdene Bayartai, Batgerel Oidov, Solongo Bandi, Oyungoo Badamdorj
    Korean Journal of Medical Education.2025; 37(4): 391.     CrossRef
  • Klinik Beceri Eğitiminin Değerlendirilmesinde CIPP Modeli: Teoriden Uygulamaya
    Selcen Öncü, Özlem Sürel Karabilgin Öztürkçü
    Tıp Eğitimi Dünyası.2025; 24(74): 153.     CrossRef
  • Evaluation of the pencak silat coaching program in East Java: A study using the CIPP model
    Heri Wahyudi, Roy Januardi Irawan, Achmad Widodo, Mokhamad Nur Bawono, Himawan Wismanadi, Shidqi Hamdi Pratama Putera, Anindya Mar’atus Sholikhah, Dhananjaya Sutanto
    SPORT TK-Revista EuroAmericana de Ciencias del Deporte.2025; 14: 77.     CrossRef
  • Evaluation of the Maryland Next Gen Test Bank Project: Implications and Recommendations
    Desirée Hensel, Diane M. Billings, Rebecca Wiseman
    Nursing Education Perspectives.2024; 45(4): 225.     CrossRef
  • Development of a blended teaching quality evaluation scale (BTQES) for undergraduate nursing based on the Context, Input, Process and Product (CIPP) evaluation model: A cross-sectional survey
    Yue Zhao, Weijuan Li, Hong Jiang, Mohedesi Siyiti, Meng Zhao, Shuping You, Yinglan Li, Ping Yan
    Nurse Education in Practice.2024; 77: 103976.     CrossRef
  • Internal evaluation of medical programs is more than housework: A scoping review
    Sujani Kodagoda Gamage, Tanisha Jowsey, Jo Bishop, Melanie Forbes, Lucy-Jane Grant, Patricia Green, Helen Houghton, Matthew Links, Mark Morgan, Joan Roehl, Jessica Stokes-Parish, Rano Mal Piryani
    PLOS ONE.2024; 19(10): e0305996.     CrossRef
  • Effectiveness of Peer-Assisted Learning in health professional education: a scoping review of systematic reviews
    Hanbo Feng, Ziyi Luo, Zijing Wu, Xiaohan Li
    BMC Medical Education.2024;[Epub]     CrossRef
  • Pengembangan Budaya Integritas Melalui Pendekatan Sufistik pada Perguruan Tinggi Berbasis Pesantren
    Imaduddin Imaduddin
    Nidhomiyyah: Jurnal Manajemen Pendidikan Islam.2024; 5(1): 66.     CrossRef
  • Evaluating the Economics Education Framework Within the Merdeka Curriculum at the Ecd Level: Case Study at Kharisma Kindergarten
    Fariha Nuraini, Annisa Mar’atus Sholikhah, Wahjoedi, Farida Rahmawati
    Eduscape : Journal of Education Insight.2024; 2(3): 186.     CrossRef
  • Self-care educational guide for mothers with gestational diabetes mellitus: A systematic review on identifying self-care domains, approaches, and their effectiveness
    Zarina Haron, Rosnah Sutan, Roshaya Zakaria, Zaleha Abdullah Mahdy
    Belitung Nursing Journal.2023; 9(1): 6.     CrossRef
  • Evaluation of the Smart Indonesia Program as a Policy to Improve Equality in Education
    Patni Ninghardjanti, Wiedy Murtini, Aniek Hindrayani, Khresna B. Sangka
    Sustainability.2023; 15(6): 5114.     CrossRef
  • Exploring Perceptions of Competency-Based Medical Education in Undergraduate Medical Students and Faculty: A Program Evaluation
    Erica Ai Li, Claire A Wilson, Jacob Davidson, Aaron Kwong, Amrit Kirpalani, Peter Zhan Tao Wang
    Advances in Medical Education and Practice.2023; Volume 14: 381.     CrossRef
  • The Evaluation of China's Double Reduction Policy: A Case Study in Dongming County Mingde Primary School
    Danyang Li , Chaimongkhon Supromin, Supit Boonlab
    International Journal of Sociologies and Anthropologies Science Reviews.2023; 3(6): 437.     CrossRef
  • Exploring the Components of the Research Empowerment Program of the Faculty Members of Kermanshah University of Medical Sciences, Iran Based on the CIPP Model: A Qualitative Study
    Mostafa Jafari, Susan Laei, Elham Kavyani, Rostam Jalali
    Educational Research in Medical Sciences.2021;[Epub]     CrossRef
  • Adapting an Integrated Program Evaluation for Promoting Competency‐Based Medical Education
    Hyunjung Ju, Minkyung Oh, Jong-Tae Lee, Bo Young Yoon
    Korean Medical Education Review.2021; 23(1): 56.     CrossRef
  • Changes in the accreditation standards of medical schools by the Korean Institute of Medical Education and Evaluation from 2000 to 2019
    Hyo Hyun Yoo, Mi Kyung Kim, Yoo Sang Yoon, Keun Mi Lee, Jong Hun Lee, Seung-Jae Hong, Jung –Sik Huh, Won Kyun Park
    Journal of Educational Evaluation for Health Professions.2020; 17: 2.     CrossRef
  • Human Resources Development via Higher Education Scholarships: A Case Study of a Ministry of Public Works and Housing Scholarship Program
    Abdullatif SETİABUDİ, Muchlis. R. LUDDIN, Yuli RAHMAWATI
    International e-Journal of Educational Studies.2020; 4(8): 209.     CrossRef
  • Exploring Components, Barriers, and Solutions for Faculty Members’ Research Empowerment Programs Based on the CIPP Model: A Qualitative Study
    Mostafa Jafari, Soosan Laei, Elham Kavyani, Rostam Jalali
    Journal of Occupational Health and Epidemiology.2020; 9(4): 213.     CrossRef
Brief report
Benefits of focus group discussions beyond online surveys in course evaluations by medical students in the United States: a qualitative study  
Katharina Brandl, Soniya V. Rabadia, Alexander Chang, Jess Mandel
J Educ Eval Health Prof. 2018;15:25.   Published online October 16, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.25
  • 25,260 View
  • 379 Download
  • 7 Web of Science
  • 10 Crossref
AbstractAbstract PDFSupplementary Material
In addition to online questionnaires, many medical schools use supplemental evaluation tools such as focus groups to evaluate their courses. Although some benefits of using focus groups in program evaluation have been described, it is unknown whether these in-person data collection methods provide enough additional information beyond online evaluations to justify their use. In this study, we analyzed recommendations gathered from student evaluation team (SET) focus group meetings and assessed whether those items were captured in the open-ended comments of the online evaluations. Online evaluations captured only 49% of the recommendations identified via SETs. Surveys of course directors indicated that 74% of the recommendations identified exclusively via SETs were implemented within their courses. These results indicate that SET meetings provided information not easily captured in online evaluations and that their recommendations resulted in actual course changes.

Citations

Citations to this article as recorded by  
  • Beyond Surveys: Unveiling the Impact of Focus Group Interviews in Medical Education Evaluation
    Selçuk Akturan, Ayşenur Duman Dilbaz, Yasemin Güner, Melek Üçüncüoğlu, Neşe Kaklıkkaya
    Genel Tıp Dergisi.2026;[Epub]     CrossRef
  • Effects of building resilience skills among undergraduate medical students in a multi-cultural, multi-ethnic setting in the United Arab Emirates: A convergent mixed methods study
    Farah Otaki, Samuel B. Ho, Bhavana Nair, Reem AlGurg, Adrian Stanley, Amar Hassan Khamis, Agnes Paulus, Laila Alsuwaidi, Ashraf Atta Mohamed Safein Salem
    PLOS ONE.2025; 20(2): e0308774.     CrossRef
  • “What Did You Learn?” - An Alternative Narrative Approach to Student Evaluations of Teaching
    Gunnar Tschudi Bondevik, Eivind Alexander Valestrand, Monika Kvernenes
    Journal of Medical Education and Curricular Development.2025;[Epub]     CrossRef
  • Overreliance on student satisfaction surveys in medical education
    Anthony R Artino, H Carrie Chen, Sally A Santen, Richard J Simons, Jennifer G Christner, Arianne Teherani
    Academic Medicine.2025;[Epub]     CrossRef
  • Assessing the Utility of Oral and Maxillofacial Surgery Posters as Educational Aids in Dental Education for Undergraduate Students: Is it Useless or Helpful?
    Seyed Mohammad Ali Seyedi, Navid Kazemian, Omid Alizadeh, Zeinab Mohammadi, Maryam Jamali, Reza Shahakbari, Sahand Samieirad
    WORLD JOURNAL OF PLASTIC SURGERY.2024; 13(1): 57.     CrossRef
  • Grupos focais como ferramenta de pesquisa qualitativa na fisioterapia: implicações e expectativas
    Dartel Ferrari de Lima, Adelar Aparecido Sampaio
    Revista Pesquisa Qualitativa.2023; 11(27): 361.     CrossRef
  • Educational attainment for at-risk high school students: closing the gap
    Karen Miner-Romanoff
    SN Social Sciences.2023;[Epub]     CrossRef
  • Student evaluations of teaching and the development of a comprehensive measure of teaching effectiveness for medical schools
    Constantina Constantinou, Marjo Wijnen-Meijer
    BMC Medical Education.2022;[Epub]     CrossRef
  • National Security Law Education in Hong Kong: Qualitative Evaluation Based on the Perspective of the Students
    Daniel T. L. Shek, Xiaoqin Zhu, Diya Dou, Xiang Li
    International Journal of Environmental Research and Public Health.2022; 20(1): 553.     CrossRef
  • Mentoring as a transformative experience
    Wendy A. Hall, Sarah Liva
    Mentoring & Tutoring: Partnership in Learning.2021; 29(1): 6.     CrossRef
Research article
Evaluation of an undergraduate occupational health program in Iran based on alumni perceptions: a structural equation model  
Semira Mehralizadeh, Alireza Dehdashti, Masoud Motalebi Kashani
J Educ Eval Health Prof. 2017;14:16.   Published online July 26, 2017
DOI: https://doi.org/10.3352/jeehp.2017.14.16
  • 34,146 View
  • 318 Download
  • 3 Web of Science
  • 1 Crossref
AbstractAbstract PDF
Purpose
Evaluating educational programs can improve the quality of education. The present study evaluated the undergraduate occupational health program at the Semnan University of Medical Sciences in Semnan, Iran, with a focus on the associations between alumni perceptions of the learning environment and the outcomes of the occupational health program.
Methods
A cross-sectional questionnaire survey was conducted among alumni of the undergraduate occupational health program. We asked alumni to rate their perceptions of the items using a 4-point Likert scale. The associations between alumni perceptions of the educational program and curriculum, faculty, institutional resources, and learning outcomes were modeled and described using structural equation modeling procedures.
Results
A descriptive analysis of alumni perceptions indicated low evaluations for the administrative system, practical and research-based courses, and the number of faculty members. We found that a structural model of the evaluation variables of curriculum, faculty qualifications, and institutional resources significantly predicted undergraduate educational outcomes. The curriculum had direct and indirect effects on learning outcomes, mediated by faculty.
Conclusion
The findings of our study highlight the usefulness of the structural equation modeling approach for examining links between variables related to the learning process and learning outcomes. Surveys of alumni can provide data for reassessing the learning environment in the light of the professional competencies needed for occupational health graduates.

Citations

Citations to this article as recorded by  
  • Integrated-Based Curriculum of Pharmaceutical Dosage Forms (ICPDF): What Factors Affect the Learning Outcome Attainment?
    Anis Yohana Chaerunisaa, Akhmad Habibi, Muhaimin Muhaimin, Mailizar Mailizar, Tommy Tanu Wijaya, Ahmad Samed Al-Adwan
    International Journal of Environmental Research and Public Health.2023; 20(5): 4272.     CrossRef
Research Articles
Evaluation of a continuing professional development training program for physicians and physician assistants in hospitals in Laos based on the Kirkpatrick model  
Hyun Bae Yoon, Jwa-Seop Shin, Ketsomsouk Bouphavanh, Yu Min Kang
J Educ Eval Health Prof. 2016;13:21.   Published online May 31, 2016
DOI: https://doi.org/10.3352/jeehp.2016.13.21
  • 33,948 View
  • 360 Download
  • 25 Web of Science
  • 22 Crossref
AbstractAbstract PDF
Purpose
Medical professionals from Korea and Laos have been working together to develop a continuing professional development training program covering the major clinical fields of primary care. This study aimed to evaluate the effectiveness of the program from 2013 to 2014 using the Kirkpatrick model.
Methods
A questionnaire was used to evaluate the reaction of the trainees, and the trainers assessed the level of trainees’ performance at the beginning and the end of each clinical section. The transfer (behavioral change) of the trainees was evaluated through the review of medical records written by the trainees before and after the training program.
Results
The trainees were satisfied with the training program, for which the average score was 4.48 out of 5.0. The average score of the trainees’ performance at the beginning was 2.39 out of 5.0, and rose to 3.88 at the end of each section. The average score of the medical records written before the training was 2.92 out of 5.0, and it rose to 3.34 after the training. The number of patient visits to the district hospitals increased.
Conclusion
The continuing professional development training program, which was planned and implemented with the full engagement and responsibility of Lao health professionals, proved to be effective.
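The figures reported above map onto the first three Kirkpatrick levels (reaction, learning, behavior). As a small illustration, the pre/post changes can be tabulated as follows; the numbers are taken from the abstract, while the dictionary layout itself is only a sketch.

```python
# Sketch organizing the reported Kirkpatrick results (levels 1-3) and
# computing the pre/post change at each level where both measurements
# were reported. The scores come from the abstract; the data structure
# is an illustration only.

results = {
    "reaction": {"post": 4.48},               # trainee satisfaction / 5.0
    "learning": {"pre": 2.39, "post": 3.88},  # trainer-rated performance / 5.0
    "behavior": {"pre": 2.92, "post": 3.34},  # medical-record review / 5.0
}

changes = {
    level: round(scores["post"] - scores["pre"], 2)
    for level, scores in results.items()
    if "pre" in scores
}
print(changes)
```

The computed deltas (+1.49 for learning, +0.42 for behavior) make explicit why the program was judged effective at both levels.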

Citations

Citations to this article as recorded by  
  • An evaluation of pre-post antimicrobial stewardship healthcare educational intervention studies utilizing the Kirkpatrick model: a scoping review
    Ziad G. Nasr, Hanin M. Said, Kaoutar R. Barakat, Raghad M. Elwan, Aya Maklad, Zachariah J. Nazar
    International Journal of Clinical Pharmacy.2025; 47(6): 1635.     CrossRef
  • Practicalities and dichotomies of education policy and practice of higher education in the Golden Triangle Area (Southeast Asia): Implications for international development
    Shine Wanna Aung, Than Than Aye
    Policy Futures in Education.2024; 22(7): 1421.     CrossRef
  • Evaluation of cost-effectiveness of single-credit traffic safety course based on Kirkpatrick model: a case study of Iran
    Mina Golestani, Homayoun Sadeghi-bazargani, Sepideh Harzand-Jadidi, Hamid Soori
    BMC Medical Education.2024;[Epub]     CrossRef
  • Effectiveness of E-learning on “Sexual Health” among students of Shahid Beheshti University of Medical Sciences based on the Kirkpatrick model
    Zohreh Sadat Mirmoghtadaie, Zahra Mahbadi, Zinat Mahbadi
    Journal of Education and Health Promotion.2024;[Epub]     CrossRef
  • The benefits and limitations of establishing the PA profession globally
    Arden R. Turkewitz, Jane P. Sallen, Rachel M. Smith, Kandi Pitchford, Kimberly Lay, Scott Smalley
    JAAPA.2024; 37(11): 1.     CrossRef
  • Transforming the “SEAD”: Evaluation of a Virtual Surgical Exploration and Discovery Program and its Effects on Career Decision-Making
    Kameela Miriam Alibhai, Patricia Burhunduli, Christopher Tarzi, Kush Patel, Christine Seabrook, Tim Brandys
    Journal of Surgical Education.2023; 80(2): 256.     CrossRef
  • Evaluation of the effectiveness of a training programme for nurses regarding augmentative and alternative communication with intubated patients using Kirkpatrick's model: A pilot study
    Marzieh Momennasab, Fatemeh Mohammadi, Fereshteh DehghanRad, Azita Jaberi
    Nursing Open.2023; 10(5): 2895.     CrossRef
  • Outcome Evaluation of a Transnational Postgraduate Capacity-Building Program Using the Objective Structured Clinical Examination
    Kye-Yeung Park, Hoon-Ki Park, Jwa-Seop Shin, Taejong Kim, Youngjoo Jung, Min Young Seo, Ketsomsouk Bouphavanh, Sourideth Sengchanh, Ketmany Inthachack
    Evaluation Review.2023; 47(4): 680.     CrossRef
  • Developing a capacity building training model for public health managers of low and middle income countries
    Kritika Upadhyay, Sonu Goel, Preethi John, Sara Rubinelli
    PLOS ONE.2023; 18(4): e0272793.     CrossRef
  • Portfolios with Evidence of Reflective Practice Required by Regulatory Bodies: An Integrative Review
    Marco Zaccagnini, Patricia A. Miller
    Physiotherapy Canada.2022; 74(4): 330.     CrossRef
  • Implementation and evaluation of crowdsourcing in global health education
    Huanle Cai, Huiqiong Zheng, Jinghua Li, Chun Hao, Jing Gu, Jing Liao, Yuantao Hao
    Global Health Research and Policy.2022;[Epub]     CrossRef
  • An Evaluation of the Surgical Foundations Curriculum: A National Study
    Ekaterina Kouzmina, Stephen Mann, Timothy Chaplin, Boris Zevin
    Journal of Surgical Education.2021; 78(3): 914.     CrossRef
  • Surgical data strengthening in Ethiopia: results of a Kirkpatrick framework evaluation of a data quality intervention
    Sehrish Bari, Joseph Incorvia, Katherine R. Iverson, Abebe Bekele, Kaya Garringer, Olivia Ahearn, Laura Drown, Amanu Aragaw Emiru, Daniel Burssa, Samson Workineh, Ephrem Daniel Sheferaw, John G. Meara, Andualem Beyene
    Global Health Action.2021;[Epub]     CrossRef
  • Evaluation of a Neonatal Resuscitation Training Programme for Healthcare Professionals in Zanzibar, Tanzania: A Pre-post Intervention Study
    Xiang Ding, Li Wang, Mwinyi I. Msellem, Yaojia Hu, Jun Qiu, Shiying Liu, Mi Zhang, Lihui Zhu, Jos M. Latour
    Frontiers in Pediatrics.2021;[Epub]     CrossRef
  • Building Capacity of Programme Managers under National Health Mission on Public Health Management and Leadership: Experiences and Way Forward
    Sonu Goel, Kritika Upadhyay, BK Padhi
    Health and Population: Perspectives and Issues.2021; 44(3): 140.     CrossRef
  • Evaluation of a training program on primary eye care for an Accredited Social Health Activist (ASHA) in an urban district
    Pallavi Shukla, Praveen Vashist, SurajSingh Senjam, Vivek Gupta
    Indian Journal of Ophthalmology.2020; 68(2): 356.     CrossRef
  • Micro-feedback skills workshop impacts perceptions and practices of doctoral faculty
    Najma Baseer, James Degnan, Mandy Moffat, Usman Mahboob
    BMC Medical Education.2020;[Epub]     CrossRef
  • Residents working with Médecins Sans Frontières: training and pilot evaluation
    Alba Ripoll-Gallardo, Luca Ragazzoni, Ettore Mazzanti, Grazia Meneghetti, Jeffrey Michael Franc, Alessandro Costa, Francesco della Corte
    Scandinavian Journal of Trauma, Resuscitation and Emergency Medicine.2020;[Epub]     CrossRef
  • Medical education in Laos
    Timothy Alan Wittick, Ketsomsouk Bouphavanh, Vannyda Namvongsa, Amphay Khounthep, Amy Gray
    Medical Teacher.2019; 41(8): 877.     CrossRef
  • Evaluation of the effectiveness of a first aid health volunteers’ training programme using Kirkpatrick’s model: A pilot study
    Fatemeh Vizeshfar, Marzieh Momennasab, Shahrzad Yektatalab, Mohamad Taghi Iman
    Health Education Journal.2018; 77(2): 190.     CrossRef
  • Evaluation of a consulting training course for international development assistance for health
    Pan Gao, Hao Xiang, Suyang Liu, Yisi Liu, Shengjie Dong, Feifei Liu, Wenyuan Yu, Xiangyu Li, Li Guan, Yuanyuan Chu, Zongfu Mao, Shu Chen, Shenglan Tang
    BMC Medical Education.2018;[Epub]     CrossRef
  • Empowering the Filipino Physician through Continuing Professional Development in the Philippines: Gearing towards ASEAN Harmonization and Globalization
    Maria Minerva P Calimag
    Journal of Medicine, University of Santo Tomas.2018; 2(1): 121.     CrossRef
Smartphone-based evaluations of clinical placements—a useful complement to web-based evaluation tools  
Jesper Hessius, Jakob Johansson
J Educ Eval Health Prof. 2015;12:55.   Published online November 30, 2015
DOI: https://doi.org/10.3352/jeehp.2015.12.55
  • 30,006 View
  • 146 Download
  • 2 Web of Science
  • 5 Crossref
AbstractAbstract PDF
Purpose
Web-based questionnaires are currently the standard method for course evaluations. The high rate of smartphone adoption in Sweden makes possible a range of new uses, including course evaluation. This study examines the potential advantages and disadvantages of using a smartphone app as a complement to web-based course evaluation systems.
Methods
An iPhone app for course evaluations was developed and interfaced to an existing web-based tool. Evaluations submitted using the app were compared with those submitted using the web between August 2012 and June 2013, at the Faculty of Medicine at Uppsala University, Sweden.
Results
At the time of the study, 49% of the students were judged to own iPhones. Over the course of the study, 3,340 evaluations were submitted, of which 22.8% were submitted using the app. The median of mean scores in the submitted evaluations was 4.50 for the app (with an interquartile range of 3.70-5.20) and 4.60 (3.70-5.20) for the web (P=0.24). The proportion of evaluations that included a free-text comment was 50.5% for the app and 49.9% for the web (P=0.80).
Conclusion
An app introduced as a complement to a web-based course evaluation system met with rapid adoption. We found no difference in the frequency of free-text comments or in the evaluation scores. Apps appear to be promising tools for course evaluations.
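For readers who want to sanity-check the reported comparison of free-text comment rates, a two-proportion z-test can be run from the published summary figures alone. The group sizes below are approximated from the abstract (22.8% of 3,340 evaluations via the app), so the resulting P-value only roughly matches the reported P=0.80.

```python
# Hedged sketch of a two-proportion z-test for the free-text comment
# rates (50.5% app vs 49.9% web). Counts are reconstructed from the
# abstract's summary percentages, not taken from the raw data, so the
# paper's own exact test statistic may differ.

import math

n_total = 3340
n_app = round(0.228 * n_total)   # ~762 app submissions
n_web = n_total - n_app          # ~2578 web submissions
x_app = round(0.505 * n_app)     # app evaluations with a comment
x_web = round(0.499 * n_web)     # web evaluations with a comment

p_pool = (x_app + x_web) / (n_app + n_web)  # pooled proportion
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_app + 1 / n_web))
z = (x_app / n_app - x_web / n_web) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

print(f"z = {z:.2f}, p = {p_value:.2f}")
```

With these approximated counts the test gives z around 0.31 and a two-sided P around 0.76 — consistent with the paper's conclusion of no difference, even though it does not reproduce the exact reported figure.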

Citations

Citations to this article as recorded by  
  • Practical tips for starting a successful national postgraduate course
    Magnus Sundbom
    MedEdPublish.2024; 13: 26.     CrossRef
  • Enhancing emergency care in low-income countries using mobile technology-based training tools
    Hilary Edgcombe, Chris Paton, Mike English
    Archives of Disease in Childhood.2016; 101(12): 1149.     CrossRef
Small group effectiveness in a Caribbean medical school’s problem-based learning sessions  
P Ravi Shankar, Atanu Nandy, Ramanan Balasubramanium, Soumitra Chakravarty
J Educ Eval Health Prof. 2014;11:5.   Published online March 24, 2014
DOI: https://doi.org/10.3352/jeehp.2014.11.5
  • 66,905 View
  • 195 Download
  • 5 Web of Science
  • 5 Crossref
AbstractAbstract PDF
Purpose
The Tutorial Group Effectiveness Instrument was developed to provide objective information on the effectiveness of small groups. Student perception of small group effectiveness during the problem-based learning (PBL) process had not previously been studied at Xavier University School of Medicine; hence, the present study was carried out.
Methods
The study was conducted among second- and third-semester undergraduate medical students during the last week of September 2013 at Xavier University School of Medicine, Aruba, Kingdom of the Netherlands. Students were informed about the objectives of the study and invited to participate after written informed consent was obtained. Demographic information such as gender, age, nationality, and whether the respondent had been exposed to PBL before joining the institution was noted. Student perceptions of small group effectiveness were studied by noting their degree of agreement with a set of 19 statements on a Likert-type scale.
Results
Thirty-four of the 37 (91.9%) second- and third-semester medical students participated in the study. The mean cognitive score was 3.76, while the mean motivational and demotivational scores were 3.65 and 2.51, respectively. The median cognitive category score was 27 (maximum 35), the motivational score was 26 (maximum 35), and the demotivational score was 12 (maximum 25). There were no significant differences in scores according to respondents’ demographic characteristics.
Conclusion
Student perception of small group effectiveness was positive. Since most medical schools worldwide already use or are introducing PBL as a learning modality, the Tutorial Group Effectiveness Instrument can provide valuable information about small group functioning during PBL sessions.

Citations

Citations to this article as recorded by  
  • Relationship of Prior Knowledge and Scenario Quality With the Effectiveness of Problem-based Learning Discussion among Medical Students of Universitas Malikussaleh, Aceh, Indonesia
    Mulyati Sri Rahayu, Sri Wahyuni, Yuziani Yuziani
    Malaysian Journal of Medicine and Health Sciences.2023; 19(4): 15.     CrossRef
  • Should the PBL tutor be present? A cross-sectional study of group effectiveness in synchronous and asynchronous settings
    Samuel Edelbring, Siw Alehagen, Evalotte Mörelius, AnnaKarin Johansson, Patrik Rytterström
    BMC Medical Education.2020;[Epub]     CrossRef
  • Initiating small group learning in a Caribbean medical school
    P. Ravi Shankar
    Journal of Educational Evaluation for Health Professions.2015; 12: 10.     CrossRef
  • Aprendizagem Baseada em Problemas na Graduação Médica – Uma Revisão da Literatura Atual
    Luciana Brosina de Leon, Fernanda de Quadros Onófrio
    Revista Brasileira de Educação Médica.2015; 39(4): 614.     CrossRef
  • Assessing the Effectiveness of Problem-Based Learning of Preventive Medicine Education in China
    Xiaojie Ding, Liping Zhao, Haiyan Chu, Na Tong, Chunhui Ni, Zhibin Hu, Zhengdong Zhang, Meilin Wang
    Scientific Reports.2014;[Epub]     CrossRef
Indian medical students’ perspectives on problem-based learning experiences in the undergraduate curriculum: One size does not fit all  
Nanda Bijli, Manjunatha Shankarappa
J Educ Eval Health Prof. 2013;10:11.   Published online December 31, 2012
DOI: https://doi.org/10.3352/jeehp.2013.10.11
  • 53,797 View
  • 217 Download
  • 13 Crossref
PDF

Citations

Citations to this article as recorded by  
  • Adoption of Problem-Based Learning in Medical Schools in Non-Western Countries: A Systematic Review
    See Chai Carol Chan, Anjali Rajendra Gondhalekar, George Choa, Mohammed Ahmed Rashid
    Teaching and Learning in Medicine.2024; 36(2): 111.     CrossRef
  • Barriers and Solutions to Successful Problem-Based Learning Delivery in Developing Countries – A Literature Review
    Jhiamluka Solano, Melba Zuniga Gutierrez, Esther Pinel-Guzmán, Génesis Henriquez
    Cureus.2023;[Epub]     CrossRef
  • PERCEPTION OF PHASE 1 MBBS STUDENTS ON E- LEARNING AND ONLINE ASSESSMENT DURING COVID-19 AT GOVT. MEDICAL COLLEGES, WEST BENGAL, INDIA
    Sanhita Mukherjee, Sagarika Sarkar, Hrishikesh Bagchi, Diptakanti Mukhopadhyay
    PARIPEX INDIAN JOURNAL OF RESEARCH.2022; : 42.     CrossRef
  • Ensuring Problem-Based Learning for Medical Students and Looking Forward to Developing Competencies
    Manish Taywade, Kumbha Gopi, Debkumar Pal, Bimal Kumar Sahoo
    Amrita Journal of Medicine.2022; 18(1): 1.     CrossRef
  • Practice and effectiveness of web-based problem-based learning approach in a large class-size system: A comparative study
    Yongxia Ding, Peili Zhang
    Nurse Education in Practice.2018; 31: 161.     CrossRef
  • LEARNING BIOLOGY THROUGH PROBLEM BASED LEARNING – PERCEPTION OF STUDENTS
    THAKUR PREETI, DUTT SUNIL, CHAUHAN ABHISHEK
    i-manager's Journal of Educational Technology.2018; 15(2): 44.     CrossRef
  • Development of an international comorbidity education framework
    C. Lawson, S. Pati, J. Green, G. Messina, A. Strömberg, N. Nante, D. Golinelli, A. Verzuri, S. White, T. Jaarsma, P. Walsh, P. Lonsdale, U.T. Kadam
    Nurse Education Today.2017; 55: 82.     CrossRef
  • A Rapid Review of the Factors Affecting Healthcare Students' Satisfaction with Small-Group, Active Learning Methods
    James M. Kilgour, Lisa Grundy, Lynn V. Monrouxe
    Teaching and Learning in Medicine.2016; 28(1): 15.     CrossRef
  • Students’ perceptions and satisfaction level of hybrid problem-based learning for 16 years in Kyungpook National University School of Medicine, Korea
    Sanghee Yeo, Bong Hyun Chang
    Korean Journal of Medical Education.2016; 28(1): 9.     CrossRef
  • Problem-based learning: Dental student's perception of their education environments at Qassim University
    Shahad S. Alkhuwaiter, Roqayah I. Aljuailan, Saeed M. Banabilh
    Journal of International Society of Preventive and Community Dentistry.2016; 6(6): 575.     CrossRef
  • PERCEPTION OF PHARMACOLOGY TEACHING METHODS AMONG SECOND YEAR UNDERGRADUATE STUDENTS TOWARDS BETTER LEARNING
    Padmanabha Thiruganahalli Shivaraju, Manu Gangadhar, Chandrakantha Thippeswamy, Neha Krishnegowda
    Journal of Evidence Based Medicine and Healthcare.2016; 3(30): 1352.     CrossRef
  • Aprendizagem Baseada em Problemas na Graduação Médica – Uma Revisão da Literatura Atual
    Luciana Brosina de Leon, Fernanda de Quadros Onófrio
    Revista Brasileira de Educação Médica.2015; 39(4): 614.     CrossRef
  • EFFECT OF PROBLEM BASED LEARNING IN COMPARISION WITH LECTURE BASED LEARNING IN FORENSIC MEDICINE
    Padma kumar K
    Journal of Evidence Based Medicine and Healthcare.2015; 2(37): 5932.     CrossRef
Is it time for integration of surgical skills simulation into the United Kingdom undergraduate medical curriculum? A perspective from King’s College London School of Medicine  
Hamaoui Karim, Sadideen Hazim, Saadeddin Munir, Onida Sarah, Hoey Andrew W, Rees John
J Educ Eval Health Prof. 2013;10:10.   Published online December 31, 2012
DOI: https://doi.org/10.3352/jeehp.2013.10.10
  • 37,421 View
  • 185 Download
  • 19 Crossref
PDF

Citations

Citations to this article as recorded by  
  • Understanding the relevance of surgical specialties in undergraduate medical education: Insights of graduates
    Fernando Girón‐Luque, Luis‐Jaime Téllez‐Rodríguez, Jorge Rueda‐Gutiérrez, John Vergel
    The Clinical Teacher.2024;[Epub]     CrossRef
  • A simple solution to improve surgical teaching among medical students
    Maisie de Wolf, Elizabeth Birch
    Clinical Anatomy.2021; 34(8): 1129.     CrossRef
  • A Commentary on "Core content of the medical school surgical curriculum: Consensus report from the association of surgeons in training (ASIT)" (Int J Surg 2020; Epub ahead of print)
    Preeti Sandhu, Karanjeet Sagoo, Gurnoor Nagi
    International Journal of Surgery.2020;[Epub]     CrossRef
  • An expert-led and artificial intelligence system-assisted tutoring course to improve the confidence of Chinese medical interns in suturing and ligature skills: a prospective pilot study
    Ying-Ying Yang, Boaz Shulruf
    Journal of Educational Evaluation for Health Professions.2019; 16: 7.     CrossRef
  • Introduction of suturing skills acquisition into undergraduate surgical education: Early experience from Ile-Ife, Nigeria
    AdewaleAbdulwasiu Aderounmu, FunmilolaOlanike Wuraola, Olalekan Olasehinde, OludayoA Sowande, AdewaleOluseye Adisa
    Nigerian Journal of Surgery.2019; 25(2): 188.     CrossRef
  • Introducing In Vivo Dissection Modules for Undergraduate Level Trainees: What Is the Actual Benefit and How Could We Make It More Efficient?
    Michail Sideris, Apostolos Papalois, Korina Theodoraki, Georgios Paparoidamis, Nikolaos Staikoglou, Ismini Tsagkaraki, Efstratios Koletsis, Panagiotis Dedeilias, Nikolaos Lymperopoulos, Konstantinos Imprialos, Savvas Papagrigoriadis, Vassilios Papalois, G
    Indian Journal of Surgery.2018; 80(1): 68.     CrossRef
  • A Novel Clinical-Simulated Suture Education for Basic Surgical Skill: Suture on the Biological Tissue Fixed on Standardized Patient Evaluated with Objective Structured Assessment of Technical Skill (OSATS) Tools
    Zhanlong Shen, Fan Yang, Pengji Gao, Li Zeng, Guanchao Jiang, Shan Wang, Yingjiang Ye, Fengxue Zhu
    Journal of Investigative Surgery.2018; 31(4): 333.     CrossRef
  • Simulation-Based Learning Strategies to Teach Undergraduate Students Basic Surgical Skills: A Systematic Review
    Iakovos Theodoulou, Marios Nicolaides, Thanos Athanasiou, Apostolos Papalois, Michail Sideris
    Journal of Surgical Education.2018; 75(5): 1374.     CrossRef
  • Early and prolonged opportunities to practice suturing increases medical student comfort with suturing during clerkships: Suturing during cadaver dissection
    Edward P. Manning, Priti L. Mishall, Maxwell D. Weidmann, Herschel Flax, Sam Lan, Mark Erlich, William B. Burton, Todd R. Olson, Sherry A. Downie
    Anatomical Sciences Education.2018; 11(6): 605.     CrossRef
  • Hands train the brain—what is the role of hand tremor and anxiety in undergraduate microsurgical skills?
    John Hanrahan, Michail Sideris, Terouz Pasha, Parmenion P. Tsitsopoulos, Iakovos Theodoulou, Marios Nicolaides, Efstratia-Maria Georgopoulou, Dimitris Kombogiorgas, Alexios Bimpis, Apostolos Papalois
    Acta Neurochirurgica.2018; 160(9): 1673.     CrossRef
  • Promoting Undergraduate Surgical Education: Current Evidence and Students’ Views on ESMSC International Wet Lab Course
    Michail Sideris, Apostolos Papalois, Korina Theodoraki, Ioannis Dimitropoulos, Elizabeth O. Johnson, Efstratia-Maria Georgopoulou, Nikolaos Staikoglou, Georgios Paparoidamis, Panteleimon Pantelidis, Ismini Tsagkaraki, Stefanos Karamaroudis, Michael E. Pot
    Journal of Investigative Surgery.2017; 30(2): 71.     CrossRef
  • Op.-Simulation in der Chirurgie
    A. Nabavi, J. Schipper
    HNO.2017; 65(1): 7.     CrossRef
  • Has the Bachelor of Surgery Left Medical School?—A National Undergraduate Assessment
    Matthew J. Lee, Thomas M. Drake, Tom A.M. Malik, Timothy O’Connor, Ryad Chebbout, Ahmed Daoub, Jonathan R.L. Wild
    Journal of Surgical Education.2016; 73(4): 655.     CrossRef
  • Medical students’ satisfaction with the Applied Basic Clinical Seminar with Scenarios for Students, a novel simulation-based learning method in Greece
    Panteleimon Pantelidis, Nikolaos Staikoglou, Georgios Paparoidamis, Christos Drosos, Stefanos Karamaroudis, Athina Samara, Christodoulos Keskinis, Michail Sideris, George Giannakoulas, Georgios Tsoulfas, Asterios Karagiannis
    Journal of Educational Evaluation for Health Professions.2016; 13: 13.     CrossRef
  • Evaluating the educational environment of an international animal model-based wet lab course for undergraduate students
    Michail Ch. Sideris, Apostolos E. Papalois, Thanos Athanasiou, Ioannis Dimitropoulos, Korina Theodoraki, Francois Sousa Dos Santos, Georgios Paparoidamis, Nikolaos Staikoglou, Dimitrios Pissas, Peter C. Whitfield, Alexandros Rampotas, Savvas Papagrigoriad
    Annals of Medicine & Surgery.2016; 12: 8.     CrossRef
  • Prepared for Practice? Interns’ Experiences of Undergraduate Clinical Skills Training in Ireland
    M. Morris, A. O'Neill, A. Gillis, S. Charania, J. Fitzpatrick, A. Redmond, S. Rosli, P.F. Ridgway
    Journal of Medical Education and Curricular Development.2016; 3: JMECD.S39381.     CrossRef
  • Poor use of clinical skills centres by trainee doctors
    K Bedi, SJ Chapman, G Marangoni, KR Prasad, AR Hakeem
    The Bulletin of the Royal College of Surgeons of England.2016; 98(6): 258.     CrossRef
  • The role of student surgical interest groups and surgical Olympiads in anatomical and surgical undergraduate training in Russia
    Sergey Dydykin, Marina Kapitonova
    Anatomical Sciences Education.2015; 8(5): 471.     CrossRef
  • Comparison of Veterinary Student Ability to Learn 1‐Handed and 2‐Handed Techniques for Surgical Knot Tying
    Angharad C.J. Thomas, Graham M. Hayes, Jackie L. Demetriou
    Veterinary Surgery.2015; 44(6): 798.     CrossRef
Improved quality and quantity of written feedback is associated with a structured feedback proforma
Philip M. Newton, Melisa J. Wallace, Judy McKimm
J Educ Eval Health Prof. 2012;9:10.   Published online August 13, 2012
DOI: https://doi.org/10.3352/jeehp.2012.9.10
  • 57,141 View
  • 216 Download
  • 19 Crossref
Abstract PDF
Facilitating the provision of detailed, deep, and useful feedback is an important design feature of any educational programme. Here we evaluate feedback provided to medical students completing short transferable skills projects. Feedback quantity and depth were evaluated before and after a simple intervention that changed the feedback-provision form from a blank free-text form to a structured proforma asking a pair of short questions for each of the six domains being assessed. Each pair of questions asked the marker "What was done well?" and "What changes would improve the assignment?" Changing the form was associated with a significant increase in the quantity and quality of the feedback provided to students. We also observed that, for these double-marked projects, the marker designated 'Marker 1' consistently wrote more feedback than the marker designated 'Marker 2'.

Citations

Citations to this article as recorded by  
  • Animated process-transparency in student evaluation of teaching: effects on the quality and quantity of student feedback
    Marloes Nederhand, Bas Giesbers, Judith Auer, Ad Scheepers
    Assessment & Evaluation in Higher Education.2024; 49(3): 288.     CrossRef
  • Development and evaluation of two interventions to improve students’ reflection on feedback
    Richard Harris, Pam Blundell-Birtill, Madeleine Pownall
    Assessment & Evaluation in Higher Education.2023; 48(5): 672.     CrossRef
  • How an EPA-based curriculum supports professional identity formation
    Anne E. Bremer, Marjolein H. J. van de Pol, Roland F. J. M. Laan, Cornelia R. M. G. Fluit
    BMC Medical Education.2022;[Epub]     CrossRef
  • Narrative Assessments in Higher Education: A Scoping Review to Identify Evidence-Based Quality Indicators
    Molk Chakroun, Vincent R Dion, Kathleen Ouellet, Ann Graillon, Valérie Désilets, Marianne Xhignesse, Christina St-Onge
    Academic Medicine.2022; 97(11): 1699.     CrossRef
  • Teaching in Geriatrics: The Potential of a Structured Written Feedback for the Improvement of Lectures
    Theresa Pohlmann, Volker Paulmann, Sandra Steffens, Klaus Hager
    European Journal of Geriatrics and Gerontology.2022; 4(3): 123.     CrossRef
  • Enhancing written feedback: The use of a cover sheet influences feedback quality
    J.G. Arts, M. Jaspers, D. Joosten-ten Brinke, Sammy King Fai Hui
    Cogent Education.2021;[Epub]     CrossRef
  • Implementation of written structured feedback into a surgical OSCE
    J. Sterz, S. Linßen, M. C. Stefanescu, T. Schreckenbach, L. B. Seifert, M. Ruesseler
    BMC Medical Education.2021;[Epub]     CrossRef
  • Eliciting student feedback for course development: the application of a qualitative course evaluation tool among business research students
    Carly Steyn, Clint Davies, Adeel Sambo
    Assessment & Evaluation in Higher Education.2019; 44(1): 11.     CrossRef
  • Diş Hekimliği Eğitiminde Beceri ve Yeterliğin Değerlendirilmesi II: Değerlendirme Yöntemleri
    Kadriye Funda AKALTAN
    Selcuk Dental Journal.2019; 6(5): 72.     CrossRef
  • Effect of individual structured and qualified feedback on improving clinical performance of dental students in clinical courses‐randomised controlled study
    I. M. Schüler, R. Heinrich‐Weltzien, M. Eiselt
    European Journal of Dental Education.2018;[Epub]     CrossRef
  • The Effect of High-Frequency, Structured Expert Feedback on the Learning Curves of Basic Interventional Ultrasound Skills Applied to Regional Anesthesia
    Getúlio Rodrigues de Oliveira Filho, Francisco de Assis Caire Mettrau
    Anesthesia & Analgesia.2018; 126(3): 1028.     CrossRef
  • A case study on written comments as a form of feedback in teacher education: so much to gain
    Jorik Gerardus Arts, Mieke Jaspers, Desiree Joosten-ten Brinke
    European Journal of Teacher Education.2016; 39(2): 159.     CrossRef
  • Medical students’ satisfaction with the Applied Basic Clinical Seminar with Scenarios for Students, a novel simulation-based learning method in Greece
    Panteleimon Pantelidis, Nikolaos Staikoglou, Georgios Paparoidamis, Christos Drosos, Stefanos Karamaroudis, Athina Samara, Christodoulos Keskinis, Michail Sideris, George Giannakoulas, Georgios Tsoulfas, Asterios Karagiannis
    Journal of Educational Evaluation for Health Professions.2016; 13: 13.     CrossRef
  • The effect of written standardized feedback on the structure and quality of surgical lectures: A prospective cohort study
    Jasmina Sterz, Sebastian H. Höfer, Bernd Bender, Maren Janko, Farzin Adili, Miriam Ruesseler
    BMC Medical Education.2016;[Epub]     CrossRef
  • Group Peer Teaching: A Strategy for Building Confidence in Communication and Teamwork Skills in Physical Therapy Students
    Christopher Seenan, Sivaramkumar Shanmugam, Jennie Stewart
    Journal of Physical Therapy Education.2016; 30(3): 40.     CrossRef
  • Does Reflective Learning with Feedback Improve Dental Students’ Self‐Perceived Competence in Clinical Preparedness?
    Jung-Joon Ihm, Deog-Gyu Seo
    Journal of Dental Education.2016; 80(2): 173.     CrossRef
  • Encouraging formative assessments of leadership for foundation doctors
    Lindsay Hadley, David Black, Jan Welch, Peter Reynolds, Clare Penlington
    The Clinical Teacher.2015; 12(4): 231.     CrossRef
  • Use of the ‘Stop, Start, Continue’ method is associated with the production of constructive qualitative feedback by students in higher education
    Alice Hoon, Emily Oliver, Kasia Szpakowska, Philip Newton
    Assessment & Evaluation in Higher Education.2015; 40(5): 755.     CrossRef
  • The New Era of : What Should Be Prepared to Be a Top Journal in the Category of Gastroenterology and Hepatology
    Sun Huh
    Journal of Neurogastroenterology and Motility.2013; 19(4): 419.     CrossRef
Brief Report
Potential advantage of student-run clinics for diversifying a medical school class
Chris N. Gu, Jane A. McElroy, Blake C. Corcoran
J Educ Eval Health Prof. 2012;9:8.   Published online May 25, 2012
DOI: https://doi.org/10.3352/jeehp.2012.9.8
  • 34,440 View
  • 149 Download
  • 6 Crossref
Abstract PDF
The purpose of this study was to evaluate the influence of a student-run clinic on the diversification of a medical student class. We distributed a two-page, 20-item paper survey to students of the University of Missouri School of Medicine (MU SOM) class of 2015 in July 2011. The survey gathered information on general demographics, opinions on the importance of medical education opportunities, and opinions on the importance of medical school characteristics in applying to and attending MU SOM. A total of 104 students responded to the survey. A majority of the students identified the MedZou Community Health Clinic, a student-run, free health clinic affiliated with MU SOM, and simulated-patient encounters as important educational experiences (81% and 94%, respectively). More than half of the self-identified 'non-white' students reported MedZou as an important factor in their choice to apply to (60%; 95% confidence interval [CI], 32 to 88) and attend (71%; 95% CI, 44 to 98) MU SOM; over half of the female students reported MedZou as important in their choice to apply (59%; 95% CI, 43 to 76) and attend (57%; 95% CI, 40 to 74); and over half of non-Missouri residents reported MedZou as important in their choice to apply (64%; 95% CI, 36 to 93) and attend (71%; 95% CI, 44 to 98). According to these results, students clearly value both MedZou and simulated-patient encounters as important educational experiences. Women, minorities, and non-Missouri residents valued MedZou more highly than their first-year peers who are Missouri residents, suggesting that MedZou may provide a promising opportunity to advance diversity within MU SOM. These results highlight the need for additional research to further explore MedZou's potential to enhance the recruitment of a diverse medical student class.

Citations

Citations to this article as recorded by  
  • Student‐led activities of daily living group program in a hospital inpatient rehabilitation setting
    Dione Miller, Sarah Mugridge, Meagan Elder, Megan Holt, Karen P. Y. Liu
    Australian Occupational Therapy Journal.2024; 71(4): 486.     CrossRef
  • College competitiveness and medical school exam performance
    Joshua Levy, Hiba Kausar, Deepal Patel, Shaun Andersen, Edward Simanton
    BMC Medical Education.2022;[Epub]     CrossRef
  • Learning in student‐run clinics: a systematic review
    Tim Schutte, Jelle Tichelaar, Ramon S Dekker, Michiel A van Agtmael, Theo P G M de Vries, Milan C Richir
    Medical Education.2015; 49(3): 249.     CrossRef
  • The role of prehealth student volunteers at a student-run free clinic in New York, United States
    Syed H. Shabbir, Maria Teresa M. Santos
    Journal of Educational Evaluation for Health Professions.2015; 12: 49.     CrossRef
  • Peer‐supported learning during ‘Health Mela’
    Lucy Cornthwaite, James Humphreys, Romesh Gupta, Satyan Rajbhandari
    The Clinical Teacher.2013; 10(5): 296.     CrossRef
  • Innovative Approaches to Promote a Culturally Competent, Diverse Health Care Workforce in an Institution Serving Hispanic Students
    Suad Ghaddar, John Ronnau, Shawn P. Saladin, Glenn Martínez
    Academic Medicine.2013; 88(12): 1870.     CrossRef
Research Article
Effect of portfolio assessment on student learning in prenatal training for midwives
Nourossadat Kariman, Farnoosh Moafi
J Educ Eval Health Prof. 2011;8:2.   Published online March 25, 2011
DOI: https://doi.org/10.3352/jeehp.2011.8.2
  • 30,448 View
  • 159 Download
  • 4 Crossref
Abstract PDF
The tendency to use portfolios for evaluation has developed with the aim of optimizing the culture of assessment. The present study was carried out to determine the effect of using portfolios as an evaluation method on midwifery students' learning and satisfaction in prenatal practical training. In this prospective cohort study, all fourth-semester midwifery students (n=40) were randomly allocated to portfolio and routine evaluation groups. Based on their educational goals, the portfolio group prepared packages consisting of a complete report of the history, physical examinations, and methods of patient management (evaluated by a checklist) for women who visited a prenatal clinic. On the last day of the course, a posttest, a clinical exam, and a student satisfaction form were completed. The two groups' mean age, mean pretest scores, and prerequisite courses taken in the previous semester were similar. The mean difference in pre- and posttest scores for the two groups' knowledge and comprehension levels did not differ significantly (P>0.05). The portfolio group's average scores on questions at levels 2 and 3 of Bloom's taxonomy were significantly greater than those of the routine evaluation group (P=0.002 and P=0.03, respectively). The two groups' mean clinical exam scores also differed significantly: the portfolio group's mean scores on generating diagnostic and therapeutic solutions and on the ability to apply theory in practice were higher than those of the routine group. Overall, student satisfaction scores with the two evaluation methods were relatively similar. Portfolio evaluation provides the opportunity for more learning by increasing students' participation in the learning process and helping them apply theory in practice.

Citations

Citations to this article as recorded by  
  • The Effect of Integrating Service-Learning and Learning Portfolio Construction into the Curriculum of Gerontological Nursing
    Pei-Ti Hsu, Ya-Fang Ho, Jeu-Jung Chen
    Healthcare.2022; 10(4): 652.     CrossRef
  • Portfolios and Assessment of Personal Protective Devices Course Learning
    Seyedeh Negar Assadi
    Research and Development in Medical Education.2016; 4(2): 123.     CrossRef
  • Development and Evaluation of an e-portfolio for Use in a Dietetic Internship Program
    Ann Gaba
    Procedia - Social and Behavioral Sciences.2015; 174: 1151.     CrossRef
  • Opiniones de alumnos y docentes en cuanto a la evaluación de competencias mediante el uso del portafolio en medicina
    Marcela Agostini, Laura París, Francisco Heit, Alejandro Sartorio, Roberto Cherjovsky
    Debate Universitario.2015; 4(7): 39.     CrossRef
Brief Reports
Evaluation of a Team-Based Learning Tutor Training Workshop on Research and Publication Ethics by Faculty and Staff Participants
Young-Su Ju
J Educ Eval Health Prof. 2009;6:5.   Published online December 20, 2010
DOI: https://doi.org/10.3352/jeehp.2009.6.5
  • 46,117 View
  • 164 Download
  • 4 Crossref
Abstract PDF
A team-based learning (TBL) tutor training workshop on research and publication ethics was offered to 8 faculty members and 3 staff at Hallym University in 2009. To investigate the effect of the workshop and any attitude changes, a questionnaire survey was administered after the 8-hour course. The questionnaire covered four categories: general course content, changes in attitude toward research and publication ethics, the TBL format, and an open-ended question about the course. Participants responded positively to all items on general course content. There was a positive change in attitude toward research and publication ethics. Participants also responded positively to six items on team-based learning. The overall positive response to the workshop suggests the effectiveness of this kind of TBL tutor training course for university faculty and staff.

Citations

Citations to this article as recorded by  
  • Pedagogic Strategies and Contents in Medical Writing/Publishing Education: A Comprehensive Systematic Survey
    Behrooz Astaneh, Ream Abdullah, Vala Astaneh, Sana Gupta, Romina Brignardello-Petersen, Mitchell A. H. Levine, Gordon Guaytt
    European Journal of Investigation in Health, Psychology and Education.2024; 14(9): 2491.     CrossRef
  • Nursing Faculties’ Knowledge of and Attitudes Toward Research Ethics According to Demographic Characteristics and Institutional Environment in Korea
    Sukhee Ahn, Geum Hee Jeong, Hye Sook Shin, Jeung-Im Kim, Yunmi Kim, Ju-Eun Song, Sun-Hee Kim, Ju Hee Kim, Yun Jung Lee, Young A. Song, Eun Hee Lee, Myoung-Hee Kim
    Sage Open.2020;[Epub]     CrossRef
  • Faculty Development Effectiveness: Insights from a Program Evaluation
    Anupma Wadhwa, Lopamudra Das, Savithiri Ratnapalan
    Journal of Biomedical Education.2014; 2014: 1.     CrossRef
  • Perspective
    Paul Haidet, Ruth E. Levine, Dean X. Parmelee, Sheila Crow, Frances Kennedy, P. Adam Kelly, Linda Perkowski, Larry Michaelsen, Boyd F. Richards
    Academic Medicine.2012; 87(3): 292.     CrossRef
Program Evaluation in Medical Education: An Overview of the Utilization-focused Approach
Matt Vassar, Denna L. Wheeler, Machelle Davison, Johnathan Franklin
J Educ Eval Health Prof. 2010;7:1.   Published online June 15, 2010
DOI: https://doi.org/10.3352/jeehp.2010.7.1
  • 33,095 View
  • 324 Download
  • 17 Crossref
Abstract PDF
Medical school administrators, educators, and other key personnel must often make difficult choices regarding the creation, retention, modification, or termination of the various programs that take place at their institutions. Program evaluation is a data-driven strategy to aid decision-makers in determining the most appropriate outcome for programs within their purview. The purpose of this brief article is to describe one program evaluation model, the utilization-focused approach. In particular, we address the focus of this model, the personal factor, the role of the evaluator, and the evaluation process. Based on the flexibility of this model as well as its focus on stakeholder involvement, we encourage readers to consider the utilization-focused approach when evaluating programs.

Citations

Citations to this article as recorded by  
  • Program Evaluation in Competence by Design: A Mixed-Methods Study
    Jenna Milosek, Kaylee Eady, Katherine A. Moreau
    Journal of Medical Education and Curricular Development.2025;[Epub]     CrossRef
  • Development of a Program Evaluation Framework for Improving the Quality of Undergraduate Medical Education
    Yulim Kang, Hae Won Kim, Jun Yong Choi
    Korean Medical Education Review.2025; 27(1): 60.     CrossRef
  • Implementation and evaluation of a communication coaching program: a CFIR-Informed qualitative analysis mapped onto a logic model
    Rachel M. Jensen, Marzena Sasnal, Uyen T. Mai, James R. Korndorffer, Rebecca K. Miller-Kuhlmann, Arden M. Morris, Aussama K. Nassar, Carl A. Gold
    BMC Medical Education.2025;[Epub]     CrossRef
  • Opportunities for Pedagogical Change in Turkish Medical Education Revealed in the Wake of the COVID-19 Pandemic
    Umit Kartoglu, Sevgi Turan, Alp Ergör, Dilek Aslan, Gülriz Erişgen, Duygu Fındık, Özlem Kayım Yıldız, Thomas C. Reeves
    Teaching and Learning in Medicine.2024; 36(4): 488.     CrossRef
  • Teaching and Facilitation Course for Family as Faculty: Preparing Families to be Faculty Partners in Healthcare Education
    Clara Ho, Ami Goulden, Darlene Hubley, Keith Adamson, Jean Hammond, Adrienne Zarem
    Clinical Social Work Journal.2024; 52(1): 23.     CrossRef
  • Evaluating competency-based medical education: a systematized review of current practices
    Nouf Sulaiman Alharbi
    BMC Medical Education.2024;[Epub]     CrossRef
  • Gathering Trainee Feedback to Improve Programs With Low Annual ACGME Survey Content Area Compliance: A Pilot Study
    Mara M Hoffert, Leslie Pfeiffer, Molly Hepke, Wendy Brink, Jennifer Newman, Karla D Passalacqua, Kimberly Baker-Genaw
    Academic Medicine.2024; 99(4): 419.     CrossRef
  • Strategies to foster stakeholder engagement in residency coaching: a CFIR-Informed qualitative study across diverse stakeholder groups
    Marzena Sasnal, Rachel M. Jensen, Uyen T. Mai, Carl A. Gold, Aussama K. Nassar, James R. Korndorffer, Arden M. Morris, Rebecca K. Miller-Kuhlmann
    Medical Education Online.2024;[Epub]     CrossRef
  • Teste de Progresso: a percepção do discente de Medicina
    Marlene Moraes Rosa Chinelato, Jose Eduardo Martinez, Gisele Regina de Azevedo
    Revista Brasileira de Educação Médica.2022;[Epub]     CrossRef
  • Pandemi Döneminde Tıp Eğitimini Sürdürmek: Giresun Üniversitesi Tıp Fakültesi Deneyimi
    Hülya AKAN, Berkan ŞAHİN, Murat USTA, Özkan ÖZAY, Hakan YÜZÜAK, Ural OĞUZ
    Tıp Eğitimi Dünyası.2021; 20(60-1): 54.     CrossRef
  • Ongoing Value and Practice Improvement Outcomes from Pediatric Palliative Care Education: The Quality of Care Collaborative Australia
    Penelope J Slater, Caroline J Osborne, Anthony R Herbert
    Advances in Medical Education and Practice.2021; Volume 12: 1189.     CrossRef
  • Faculty Feedback Program Evaluation in CIMS Multan, Pakistan
    Ambreen Shabbir, Hina Raja, Anjum A Qadri, Muhammad Hisaan Anjum Qadri
    Cureus.2020;[Epub]     CrossRef
  • A Guide to Evaluation of Quality Improvement and Patient Safety Educational Programs: Lessons From the VA Chief Resident in Quality and Safety Program
    Rebecca L. Butcher, Kathleen L. Carluzzo, Bradley V. Watts, Karen E. Schifferdecker
    American Journal of Medical Quality.2019; 34(3): 251.     CrossRef
  • Design and Content Validation of Three Setting-Specific Assessment Tools for Advanced Pharmacy Practice Experiences
    Eric H. Gilliam, Jason M. Brunner, Wesley Nuffer, Toral C. Patel, Megan E. Thompson
    American Journal of Pharmaceutical Education.2019; 83(9): 7067.     CrossRef
  • Is it a match? a novel method of evaluating medical school success
    Leslie L. Chang, Alisa Nagler, Mariah Rudd, Colleen O’Connor Grochowski, Edward G. Buckley, Saumil M. Chudgar, Deborah L. Engle
    Medical Education Online.2018; 23(1): 1432231.     CrossRef
  • Evaluation of medical ethics doctoral program; a utilization-focused approach
    Leila Afshar, Seyed Ziaedin Tabei, Mohammad Hosseinzade
    International Journal of Ethics Education.2018; 3(1): 89.     CrossRef
  • How we conduct ongoing programmatic evaluation of our medical education curriculum
    Kelly Karpa, Catherine S. Abendroth
    Medical Teacher.2012; 34(10): 783.     CrossRef
Research Articles
Students' Evaluation of a Team-based Course on Research and Publication Ethics: Attitude Change in Medical School Graduate Students
Soo Young Kim
J Educ Eval Health Prof. 2008;5:3.   Published online December 22, 2008
DOI: https://doi.org/10.3352/jeehp.2008.5.3
  • 33,034 View
  • 130 Download
  • 6 Crossref
Abstract PDF
In response to a growing need for students to appreciate ethical issues in medical research and publication, a brief team-based learning (TBL) course was presented to graduate students in the medical school of Hallym University in October and November 2007. To gather information as a basis for improving the course, questionnaires were distributed to 19 students and the feedback was evaluated. The questionnaire consisted of four categories: general course content (7 items), changes in attitudes toward research and publication ethics (6 items), the TBL format (6 items), and an open-ended question about the class (1 item). The most positive response had to do with the importance of the material. Students reported that their knowledge about ethical issues increased, and they expressed satisfaction regarding the communication with their tutors within the TBL format. Most students showed positive responses to the subject as well as to TBL. Since this was the first trial offering of this material in the graduate program at this medical school, it may have been novel to the students. The attitude change and the knowledge acquisition reported by students reflect a very positive outcome of this class. After adjustments to improve weaknesses, such as the short time allocation and students' lack of prior background, the outcomes of this TBL course on research and publication ethics provide a good basis for its continuation.

Citations

Citations to this article as recorded by  
  • Pedagogic Strategies and Contents in Medical Writing/Publishing Education: A Comprehensive Systematic Survey
    Behrooz Astaneh, Ream Abdullah, Vala Astaneh, Sana Gupta, Romina Brignardello-Petersen, Mitchell A. H. Levine, Gordon Guyatt
    European Journal of Investigation in Health, Psychology and Education.2024; 14(9): 2491.     CrossRef
  • Team-based Learning: Enhancing Academic Performance of Psychology Students
    Nadia Rania, Stefania Rebora, Laura Migliorini
    Procedia - Social and Behavioral Sciences.2015; 174: 946.     CrossRef
  • Team-Based Learning Instruction for Responsible Conduct of Research Positively Impacts Ethical Decision-Making
    Wayne T. McCormack, Cynthia W. Garvan
    Accountability in Research.2014; 21(1): 34.     CrossRef
  • Team-Based Learning in Pharmacy Education
    William Ofstad, Lane J. Brunner
    American Journal of Pharmaceutical Education.2013; 77(4): 70.     CrossRef
  • Perspective
    Paul Haidet, Ruth E. Levine, Dean X. Parmelee, Sheila Crow, Frances Kennedy, P. Adam Kelly, Linda Perkowski, Larry Michaelsen, Boyd F. Richards
    Academic Medicine.2012; 87(3): 292.     CrossRef
  • Evaluation of a Team-Based Learning Tutor Training Workshop on Research and Publication Ethics by Faculty and Staff Participants
    Young-Su Ju
    Journal of Educational Evaluation for Health Professions.2010; 6: 5.     CrossRef
Proposal of the Implementation of an International Pharmacy Graduate Preliminary Examination
Kyenghee Kwon, Jeoung Hill Park, Jinwoong Kim, Seung Ki Lee
J Educ Eval Health Prof. 2008;5:2.   Published online December 22, 2008
DOI: https://doi.org/10.3352/jeehp.2008.5.2
  • 31,609 View
  • 132 Download
  • 3 Crossref
Abstract PDF
At present, graduates of international pharmacy schools can apply to take the Korean Pharmacist Licensing Examination after passing a review by the Accreditation Board of the Pharmacy Schools and Licenses. However, since the educational content of different schools and the roles of pharmacists differ from country to country, a preliminary examination might be necessary before the Pharmacist Licensing Examination. To prepare to implement a preliminary examination for foreign pharmacy graduates in Korea, we summarized the preliminary examinations used in four other countries and presented a proposal for a preliminary examination. Data were collected via the internet and through telephone interviews with appropriate persons. The proposal was revised after a public forum. There are preliminary examinations in the USA, Canada, Australia, and the United Kingdom, and these involve written, oral, practice, and English proficiency tests. We proposed that the Korean preliminary examination consist of a written test on basic pharmacy, a test in the Korean language, and an interview. The preliminary examination should include suitable items that effectively evaluate international graduates. Graduates of international pharmacy schools who have an ability equivalent to graduates of Korean pharmacy schools should be eligible to write the Korean Licensing Examination.

Citations

Citations to this article as recorded by  
  • Palestinian pharmacists’ knowledge of issues related to using psychotropic medications in older people: a cross-sectional study
    Ramzi Shawahna, Mais Khaskiyyi, Hadeel Abdo, Yasmen Msarwe, Rania Odeh, Souad Salame
    Journal of Educational Evaluation for Health Professions.2017; 14: 8.     CrossRef
  • Can a medical regulatory system be implemented in Korea?
    Sun Huh, Myung-Hyun Chung
    Journal of the Korean Medical Association.2013; 56(3): 158.     CrossRef
  • Career Perspectives of Future Graduates of the Newly Implemented 6-year Pharmacy Educational System in South Korea
    Eunyoung Kim, Saurav Ghimire
    American Journal of Pharmaceutical Education.2013; 77(2): 37.     CrossRef
Effects of Rating Training on Inter-Rater Consistency for Developing a Dental Hygiene Clinical Rater Qualification System
Jeong Ran Park, Jung Sook Oh, Moungae Chae, Jae Yeon Jung, Sung Suk Bae
J Educ Eval Health Prof. 2007;4:5.   Published online December 20, 2007
DOI: https://doi.org/10.3352/jeehp.2007.4.5
  • 28,278 View
  • 144 Download
Abstract PDF
We sought to develop itemized evaluation criteria and a clinical rater qualification system through rater training aimed at inter-rater consistency for experienced clinical dental hygienists and dental hygiene clinical educators. A total of 15 clinical dental hygienists with 1 year of clinical experience participated as clinical examination candidates, while 5 dental hygienists with at least 3 years of education and clinical experience participated as clinical raters. They all took the clinical examination as examinees. The results were compared, and the consistency of competence was measured. The comparison of clinical competence between candidates and clinical raters showed that the candidate group's mean clinical competence ranged from 2.96 to 3.55 on a 5-point scale across a total of 3 instruments (Probe, Explorer, Curet), while the clinical rater group's mean clinical competence ranged from 4.05 to 4.29. Inter-rater consistency was higher after rater training in the following 4 items: Probe, Explorer, Curet, and insertion on the distal surface. The mean score distribution of clinical raters ranged from 75% to 100%, which was more uniform in the competence to detect an artificial calculus than that of candidates (25% to 100%). These results indicate the need for an operating clinical rater qualification system for comprehensive dental hygiene clinicians. Furthermore, to implement the clinical rater qualification system, it will be necessary to continue conducting studies on educational content, time, frequency, and educator level.
Brief Report
Is the Pass/Fail System Applicable to a Medical School in Korea?
Mee Young Kim
J Educ Eval Health Prof. 2007;4:3.   Published online December 20, 2007
DOI: https://doi.org/10.3352/jeehp.2007.4.3
  • 40,619 View
  • 154 Download
Abstract PDF
To determine whether a pass/fail system would be more appropriate for medical education than a grade-based system, a survey of medical students and faculty members of Hallym University, Korea, was conducted. A questionnaire was delivered to 54 junior students and 36 faculty members of a medical school in Korea and analyzed. Of these participants, 37.7% of students and 36.1% of faculty agreed with the pass/fail system, while 28.3% of students and 52.8% of faculty objected to it. The most frequent reason for objection was a potential decrease in learning achievement. A pass/fail system should be considered only after students and faculty have been persuaded of its merits.
Review Article
The New Horizon for Evaluations in Medical Education in Korea
Sang-Ho Baik
J Educ Eval Health Prof. 2005;2(1):7-22.   Published online June 30, 2005
DOI: https://doi.org/10.3352/jeehp.2005.2.1.7
  • 36,401 View
  • 223 Download
  • 7 Crossref
Abstract PDF
Over the last two decades, there have been a number of significant changes in the evaluation system in medical education in Korea. One major improvement in this respect has been the listing of learning objectives at medical schools and the construction of a content outline for the Korean Medical Licensing Examination that can be used as a basis of evaluation. Item analysis has become a routine method for obtaining information that often provides valuable feedback concerning test items after the completion of a written test. The use of item response theory in analyzing test items has been spreading in medical schools as a way to evaluate performance tests and computerized adaptive testing. A series of recent studies have documented an upward trend in the adoption of the objective structured clinical examination (OSCE) and clinical practice examination (CPX) for measuring skill and attitude domains, in addition to tests of the knowledge domain. There has been an obvious increase in regional consortiums involving neighboring medical schools that share the planning and administration of the OSCE and CPX; this includes recruiting and training standardized patients. Such consortiums share common activities, such as case development and program evaluation. A short history and the pivotal roles of four organizations that have brought about significant changes in the examination system are discussed briefly.

Citations

Citations to this article as recorded by  
  • Presidential address: Adoption of a clinical skills examination for dental licensing, implementation of computer-based testing for the medical licensing examination, and the 30th anniversary of the Korea Health Personnel Licensing Examination Institute
    Yoon-Seong Lee
    Journal of Educational Evaluation for Health Professions.2022; 19: 1.     CrossRef
  • Effectiveness of Medical Education Assessment Consortium Clinical Knowledge Mock Examination (2011‐2016)
    Sang Yeoup Lee, Yeli Lee, Mi Kyung Kim
    Korean Medical Education Review.2018; 20(1): 20.     CrossRef
  • Long for wonderful leadership in a new era of the Korean Association of Medical Colleges
    Young Hwan Lee
    Korean Journal of Medical Education.2014; 26(3): 163.     CrossRef
  • Major Reforms and Issues of the Medical Licensing Examination Systems in Korea
    Sang-Ho Baik
    Korean Medical Education Review.2013; 15(3): 125.     CrossRef
  • A Study on the Feasibility of a National Practical Examination in the Radiologic Technologist
    Soon-Yong Son, Tae-Hyung Kim, Jung-Whan Min, Dong-Kyoon Han, Sung-Min Ahn
    Journal of the Korea Academia-Industrial cooperation Society.2011; 12(5): 2149.     CrossRef
  • The Relationship between Senior Year Examinations at a Medical School and the Korean Medical Licensing Examination
    Ki Hoon Jung, Ho Keun Jung, Kwan Lee
    Korean Journal of Medical Education.2009; 21(1): 17.     CrossRef
  • What Qualities Do Medical School Applicants Need to Have? - Secondary Publication
    Yera Hur, Sun Kim
    Yonsei Medical Journal.2009; 50(3): 427.     CrossRef
Original Articles
Correlations between the scores of computerized adaptive testing, paper and pencil tests, and the Korean Medical Licensing Examination
Mee Young Kim, Yoon Hwan Lee, Sun Huh
J Educ Eval Health Prof. 2005;2(1):113-118.   Published online June 30, 2005
DOI: https://doi.org/10.3352/jeehp.2005.2.1.113
  • 44,464 View
  • 167 Download
  • 3 Crossref
Abstract PDF
To evaluate the usefulness of computerized adaptive testing (CAT) in medical school, the General Examination for senior medical students was administered both as a paper and pencil test (P&P) and using CAT. The General Examination is a graduation examination that also serves as a preliminary examination for the Korean Medical Licensing Examination (KMLE). The correlations among the results of the CAT, the P&P, and the KMLE were analyzed. The correlation between the CAT and P&P was 0.8013 (p=0.000); that between the P&P and KMLE was 0.7861 (p=0.000); and that between the CAT and KMLE was 0.6436 (p=0.000). Six out of 12 students with an ability estimate below -0.52 failed the KMLE. The results showed that CAT could replace the P&P in medical school. The ability of the CAT to predict whether students would pass the KMLE was 0.5 when the criterion theta value was set at -0.52, a cutoff chosen arbitrarily for predicting pass or failure.
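The correlations reported in this abstract are plain Pearson coefficients between paired score lists. As an illustration only (not code from the study), a minimal sketch of the computation:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Applied to each pair of score vectors (CAT vs. P&P, P&P vs. KMLE, CAT vs. KMLE), this yields the coefficients quoted above; the function assumes neither list has zero variance.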

Citations

Citations to this article as recorded by  
  • Analysis on Validity and Academic Competency of Mock Test for Korean Medicine National Licensing Examination Using Item Response Theory
    Han Chae, Eunbyul Cho, SeonKyoung Kim, DaHye Choi, Seul Lee
    Keimyung Medical Journal.2023; 42(1): 7.     CrossRef
  • Application of Computerized Adaptive Testing in Medical Education
    Sun Huh
    Korean Journal of Medical Education.2009; 21(2): 97.     CrossRef
  • Estimation of an Examinee's Ability in the Web-Based Computerized Adaptive Testing Program IRT-CAT
    Yoon-Hwan Lee, Jung-Ho Park, In-Yong Park
    Journal of Educational Evaluation for Health Professions.2006; 3: 4.     CrossRef
Students' Attitude toward and Acceptability of Computerized Adaptive Testing in Medical School and their Effect on the Examinees' Ability
Mee Young Kim, Sun Huh
J Educ Eval Health Prof. 2005;2(1):105-111.   Published online June 30, 2005
DOI: https://doi.org/10.3352/jeehp.2005.2.1.105
  • 33,659 View
  • 178 Download
  • 3 Crossref
Abstract PDF
An examinee's ability can be evaluated precisely using computerized adaptive testing (CAT), which requires fewer items than written tests and is therefore more efficient in terms of examination duration. We used CAT for the second General Examination of 98 senior students in medical college on November 27, 2004. We prepared 1,050 test items pre-calibrated according to item response theory, which had been used for the General Examination administered to senior students in 2003. The computer was programmed to pose questions until the standard error of the ability estimate was smaller than 0.01. To determine the students' attitude toward and evaluation of CAT, we conducted surveys before and after the examination via the Web. The mean of the students' ability estimates was 0.3513 and its standard deviation was 0.9097 (range, -2.4680 to +2.5310). There was no significant difference in the ability estimates according to the students' responses to items concerning their experience with CAT, their ability to use a computer, or their anxiety before and after the examination (p>0.05). Many students were unhappy that they could not recheck their responses (49%), and some stated that there were too few examination items (24%). Of the students, 79% had no complaints concerning using a computer and 63% wanted to expand the use of CAT. These results indicate that CAT can be implemented in medical schools without causing difficulties for users.

Citations

Citations to this article as recorded by  
  • Computer‐Based Testing and Construction of an Item Bank Database for Medical Education in Korea
    Sun Huh
    Korean Medical Education Review.2014; 16(1): 11.     CrossRef
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association.2012; 55(2): 124.     CrossRef
  • Application of Computerized Adaptive Testing in Medical Education
    Sun Huh
    Korean Journal of Medical Education.2009; 21(2): 97.     CrossRef
Comparison of item analysis results of Korean Medical Licensing Examination according to classical test theory and item response theory
Eun Young Lim, Jang Hee Park, Il Kwon, Gue Lim Song, Sun Huh
J Educ Eval Health Prof. 2004;1(1):67-76.   Published online January 31, 2004
DOI: https://doi.org/10.3352/jeehp.2004.1.1.67
  • 33,279 View
  • 225 Download
  • 4 Crossref
Abstract PDF
The results of the 64th and 65th Korean Medical Licensing Examination were analyzed according to both classical test theory and item response theory in order to assess the feasibility of applying item response theory to item analysis and to gauge its applicability to computerized adaptive testing. Correlation coefficients for the difficulty index, the discrimination index, and the ability parameter between the two kinds of analysis were obtained using the computer programs Analyst 4.0, Bilog, and Xcalibre. Correlation coefficients for the difficulty index were 0.75 or greater; those for the discrimination index ranged from -0.023 to 0.753; and those for the ability parameter were 0.90 or greater. These results suggested that item analysis according to item response theory yields results comparable to those of classical test theory, except for the discrimination index. Since the ability parameter is the measure most widely used in criterion-referenced testing, the high correlation between the ability parameter and the total score supports the validity of a computerized adaptive test based on item response theory.
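For readers unfamiliar with the classical indices compared above, a minimal sketch (not from the article, which used Analyst 4.0, Bilog, and Xcalibre) of how a difficulty index and a point-biserial discrimination index are computed from a 0/1-scored response matrix:

```python
import numpy as np

def classical_item_analysis(responses):
    """Classical test theory indices for a 0/1 scored response matrix.

    responses: 2-D array-like, rows = examinees, columns = items.
    Returns (difficulty, discrimination) arrays, one value per item.
    Items answered identically by everyone have zero variance and
    yield nan discrimination.
    """
    responses = np.asarray(responses, dtype=float)
    total = responses.sum(axis=1)                 # each examinee's raw score
    difficulty = responses.mean(axis=0)           # proportion correct per item

    # Point-biserial discrimination: correlation between an item score
    # and the rest-of-test score (item excluded to avoid inflation).
    n_items = responses.shape[1]
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination
```

Under item response theory the analogous quantities are model parameters estimated jointly with examinee ability, which is why the two approaches can correlate strongly on difficulty and ability while diverging on discrimination, as the abstract reports.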

Citations

Citations to this article as recorded by  
  • Journal of Educational Evaluation for Health Professions received the top-ranking Journal Impact Factor―9.3—in the category of Education, Scientific Disciplines in the 2023 Journal Citation Ranking by Clarivate
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2024; 21: 16.     CrossRef
  • Analysis on Validity and Academic Competency of Mock Test for Korean Medicine National Licensing Examination Using Item Response Theory
    Han Chae, Eunbyul Cho, SeonKyoung Kim, DaHye Choi, Seul Lee
    Keimyung Medical Journal.2023; 42(1): 7.     CrossRef
  • Item difficulty index, discrimination index, and reliability of the 26 health professions licensing examinations in 2022, Korea: a psychometric study
    Yoon Hee Kim, Bo Hyun Kim, Joonki Kim, Bokyoung Jung, Sangyoung Bae
    Journal of Educational Evaluation for Health Professions.2023; 20: 31.     CrossRef
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association.2012; 55(2): 124.     CrossRef
