JEEHP : Journal of Educational Evaluation for Health Professions


Author index

Melanie Regan 1 Article
Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates
Debra Nestel, Melanie Regan, Priyanga Vijayakumar, Irum Sunderji, Cathy Haigh, Cathy Smith, Alistair Wright
J Educ Eval Health Prof. 2011;8:13.   Published online December 17, 2011
DOI: https://doi.org/10.3352/jeehp.2011.8.13
  • 28,347 View
  • 164 Download
  • 5 Citations
Abstract
Evaluation of educational interventions is often focused on immediate and/or short-term metrics associated with knowledge and/or skills acquisition. We developed an educational intervention to support international medical graduates working in rural Victoria. We wanted an evaluation strategy that included participants' reactions and considered transfer of learning to the workplace and retention of learning. However, with participants in distributed locations and limited program resources, this was likely to prove challenging. Elsewhere, we have reported the outcomes of this evaluation. In this educational development report, we describe our evaluation strategy as a case study: its underpinning theoretical framework, the strategy itself, and its benefits and challenges. The strategy sought to address issues of program structure, process, and outcomes. We used a modified version of Kirkpatrick's model as a framework to map our evaluation of participants' experiences, acquisition of knowledge and skills, and their application in the workplace. The predominant benefit was that most of the evaluation instruments allowed for personalization of the program. The baseline instruments provided a broad view of participants' expectations, needs, and current perspective on their role. Immediate evaluation instruments allowed ongoing tailoring of the program to meet learning needs. Intermediate evaluations facilitated insight into the transfer of learning. The principal challenge related to the resource-intensive nature of the evaluation strategy: a dedicated program administrator was required to manage data collection. Although resource-intensive, we recommend baseline, immediate, and intermediate data collection points, with multi-source feedback being especially illuminating. We believe our experiences may be valuable to faculty involved in program evaluations.

Citations

Citations to this article as recorded by CrossRef
  • The evaluation of a home-based paediatric nursing service: concept and design development using the Kirkpatrick model
    Catherine Jones, Jennifer Fraser, Sue Randall
    Journal of Research in Nursing. 2018;23(6):492.
  • Evaluation of a consulting training course for international development assistance for health
    Pan Gao, Hao Xiang, Suyang Liu, Yisi Liu, Shengjie Dong, Feifei Liu, Wenyuan Yu, Xiangyu Li, Li Guan, Yuanyuan Chu, Zongfu Mao, Shu Chen, Shenglan Tang
    BMC Medical Education. 2018;[Epub]
  • Cumulative evaluation data: pediatric airway management simulation courses for pediatric residents
    Sawsan Alyousef, Haifa Marwa, Najd Alnojaidi, Hani Lababidi, Muhammad Salman Bashir
    Advances in Simulation. 2017;[Epub]
  • Supporting international medical graduates’ transition to their host-country: realist synthesis
    Amelia Kehoe, John McLachlan, Jane Metcalf, Simon Forrest, Madeline Carter, Jan Illing
    Medical Education. 2016;50(10):1015.
  • Liaison Officer for International Medical Graduates: Research Findings from Australia
    Pam McGrath, David Henderson, Hamish A. Holewa
    Illness, Crisis & Loss. 2013;21(1):15.