Purpose The literature suggests that the ability to numerate cannot be fully understood without accounting for the social context in which mathematical activity occurs. Team-based learning (TBL) is an andragogical approach with theoretical links to sociocultural and community-of-practice learning. This study aimed to quantitatively explore the impact of TBL instruction on numeracy development in 2 cohorts of pharmacy students, viewed from a social perspective on healthcare education.
Methods Two cohorts of students were administered the Health Science Reasoning Test-Numeracy (HSRT-N) before beginning pharmacy school. Two years after using TBL as the primary method of instruction, both comprehensive and domain data from the HSRT-N were analyzed.
Results In total, 163 pharmacy student scores met the inclusion criteria. The students’ numeracy skills measured by HSRT-N improved after 2 years of TBL instruction.
Conclusion Numeracy was the most significantly improved HSRT-N domain in pharmacy students following 2 years of TBL instruction. Although a closer examination of numeracy development in TBL is warranted, initial data suggest that TBL instruction may be an adequate means of advancing numeracy in a cohort of pharmacy students. TBL may encourage a social practice of mathematics that improves pharmacy students’ ability to numerate critically.
Purpose Simulation training is an increasingly common form of healthcare education and often involves simulated participants (SPs), commonly known as actors. Simulation-based education (SBE) can sometimes endanger SPs, so we created a safety checklist for them to follow. This study describes how we developed the checklist through a quality improvement project and then evaluated feedback responses to assess whether SPs felt safe when using it.
Methods The checklist was provided to SPs working in an acute trust simulation service when delivering multidisciplinary SBE over 4 months. Using multiple plan–do–study–act cycles, the checklist was refined in response to SP feedback to ensure that the standards of safe simulation were met. We collected 21 responses from September to December 2021 after SPs completed an SBE event.
Results The responses showed that 100% of SPs felt safe during SBE when using our checklist. The average “confidence in safety” rating before using the checklist was 6.8/10, which increased significantly to 9.2/10 after using the checklist (P<0.0005). The checklist was refined throughout the 4 months and implemented in adult and pediatric SBE as a standard operating procedure.
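The before/after comparison of "confidence in safety" ratings reported above is a standard paired analysis. A minimal sketch in Python follows, using SciPy's paired t-test; the ratings below are illustrative values chosen to resemble the reported means, not the study's actual data.

```python
from scipy.stats import ttest_rel

# Illustrative before/after "confidence in safety" ratings (0-10 scale)
# for 8 hypothetical simulated participants -- not the study's data.
before = [7, 6, 7, 8, 6, 7, 6, 7]
after = [9, 9, 10, 9, 9, 10, 9, 9]

# Paired t-test on the matched before/after ratings
t_stat, p_value = ttest_rel(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.5f}")
```

With uniformly higher post-checklist ratings, the paired test yields a very small P value, mirroring the pattern (though not the exact statistics) reported in the study.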
Conclusion We recommend adopting our safety checklist as a standard operating procedure to improve the confidence and safety of SPs during SBE.
Purpose Active video gaming (AVG) is used in physical therapy (PT) to treat individuals with a variety of diagnoses across the lifespan. The literature supports improvements in balance, cardiovascular endurance, and motor control; however, evidence is lacking regarding the implementation of AVG in PT education. This study investigated Doctor of Physical Therapy (DPT) students’ confidence following active exploration of AVG systems as a PT intervention in the United States.
Methods This pretest-posttest study included 60 DPT students in 2017 (cohort 1) and 55 students in 2018 (cohort 2) enrolled in a problem-based learning curriculum. AVG systems were embedded into patient cases and 2 interactive laboratory classes across 2 consecutive semesters (April–December 2017 and April–December 2018). Participants completed a 31-question survey before the intervention and 8 months later. Students’ confidence was rated for general use, game selection, plan of care, set-up, documentation, setting, and demographics. Descriptive statistics and the Wilcoxon signed-rank test were used to compare differences in confidence pre- and post-intervention.
Results Both cohorts showed increased confidence at the post-test, with median (interquartile range) scores as follows: cohort 1: pre-test, 57.1 (44.3–63.5); post-test, 79.1 (73.1–85.4); and cohort 2: pre-test, 61.4 (48.0–70.7); post-test, 89.3 (80.0–93.2). Cohort 2 was significantly more confident at baseline than cohort 1 (P<0.05). In cohort 1, students’ data were paired and confidence levels significantly increased in all domains: use, Z=-6.2 (P<0.01); selection, Z=-5.9 (P<0.01); plan of care, Z=-6.0 (P<0.01); set-up, Z=-5.5 (P<0.01); documentation, Z=-6.0 (P<0.01); setting, Z=-6.3 (P<0.01); and total score, Z=-6.4 (P<0.01).
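The domain-level comparisons above use the Wilcoxon signed-rank test on paired pre/post confidence scores. A brief sketch of that analysis in Python is shown below; the scores are hypothetical stand-ins for one domain, not the study's data, and SciPy's `wilcoxon` function is assumed to be available.

```python
from scipy.stats import wilcoxon

# Hypothetical paired confidence scores (0-100) for 10 students in one
# domain, before and after the AVG experiences -- not the study's data.
pre = [57, 44, 63, 50, 61, 48, 70, 55, 59, 46]
post = [79, 73, 85, 80, 89, 76, 93, 81, 84, 75]

# Wilcoxon signed-rank test on the paired differences (nonparametric
# alternative to the paired t-test, appropriate for ordinal survey data)
stat, p_value = wilcoxon(pre, post)
print(f"W = {stat:.1f}, p = {p_value:.4f}")
```

Because every student's post score exceeds the pre score in this sketch, the test statistic is at its minimum and the P value is well below 0.01, consistent with the direction of the significant increases reported for all domains.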
Conclusion Structured, active experiences with AVG resulted in a significant increase in students’ confidence. As technology advances in healthcare delivery, it is essential to expose students to these technologies in the classroom.
Purpose Frontline healthcare professionals are well positioned to improve the systems in which they work. Educational curricula, however, have not always equipped healthcare professionals with the skills or knowledge to implement and evaluate improvements. A robust and standardized framework is needed to evaluate the impact of such education in terms of improvement, both within and across European countries. The results of such evaluations will enhance the further development and delivery of healthcare improvement science (HIS) education. We aimed to describe the development and piloting of a framework for prospectively evaluating the impact of HIS education and learning.
Methods The evaluation framework was designed collaboratively and piloted in 7 European countries using a qualitative methodology. The present study used mixed methods to gather data from students and educators. The framework took the Kirkpatrick model of evaluation as its theoretical reference.
Results According to the pilot study and the participants’ consensus, the framework was feasible and acceptable for use across differing European higher education contexts. It can be used effectively to evaluate and develop HIS education across European higher education institutions.
Conclusion We offer a new evaluation framework to capture the impact of HIS education. The implementation of this tool has the potential to facilitate the continuous development of HIS education.