A Comprehensive Evaluation in Medical Curriculum Using the Kirkpatrick Hierarchical Approach: A Review and Update
Abstract
This paper reviews the application of the Kirkpatrick hierarchical model to the evaluation of medical education curricula, emphasizing its role in enhancing curriculum quality through stakeholder feedback. It outlines the model's four levels of evaluation: Reaction, Learning, Behaviour, and Results, each drawing on diverse stakeholders to assess educational outcomes. The paper highlights the value of multi-source feedback (MSF) methodologies, which provide holistic insight into medical students' competencies and enable targeted improvements. Although comprehensive evaluation faces challenges such as stakeholder resistance and competing interests, the framework supports ongoing quality enhancement in medical education and helps align educational strategies with workforce needs. The review concludes that effective evaluation systems are essential for cultivating skilled, competent healthcare professionals and recommends further integration of complementary theoretical models to strengthen educational practice and policy decisions.
Article Details
The Medical Research Archives grants authors the right to publish and reproduce the unrevised contribution in whole or in part at any time and in any form for any scholarly non-commercial purpose with the condition that all publications of the contribution include a full citation to the journal as published by the Medical Research Archives.