Programmatic Evaluation, the end of “all or nothing” in medical education.

Authors

Joaquin García-Estañ

DOI: https://doi.org/10.6018/edumed.702031
Keywords: competencies, assessment, feedback

Abstract

Programmatic evaluation occupies an increasingly central place in contemporary medical education, especially in institutions that have adopted competency-based training models. While not yet a universal standard, it has established itself as one of the most influential and widely discussed approaches in clinical assessment. Over the past two decades, medical education has shifted from content-centered curricula to integrated professional competency frameworks. International bodies such as the Accreditation Council for Graduate Medical Education (ACGME), the Royal College of Physicians and Surgeons of Canada with its CanMEDS framework, and the General Medical Council (GMC) have promoted frameworks that describe physicians not only as clinical experts but also as communicators, collaborators, professionals, and lifelong learners. This shift has required a profound rethinking of assessment systems. In this context, programmatic evaluation stands out as the most coherent methodological response to competency-based education. Its role is not simply that of "another technique," but of a structural framework that organizes all assessments within a training program. In many medical schools and residency programs, assessment is no longer conceived as a set of isolated exams but as a longitudinal system for collecting and integrating evidence. Its influence is also felt in educational culture: frequent feedback, individualized monitoring, and collegial deliberation on student progress are gaining ground as quality standards. Furthermore, the scientific literature in medical education recognizes programmatic assessment as a model with high conceptual validity for evaluating complex competencies in real-world clinical settings.


Author Biography

Joaquin García-Estañ, Centro de Estudios en Educación Médica

* Professor of Physiology, Universidad de Murcia (2003).

* Dean of the Faculty of Medicine, Universidad de Murcia (2006-2014).

* President of the Conferencia Nacional de Decanos de Facultades de Medicina de España (2008-2012).

* Centro de Estudios en Educación Médica, Universidad de Murcia: Director (2014-2018), Secretary (2018-).

* Sociedad Española de Educación Médica: Board Member (2015-2019), Treasurer (2019-2022), President-elect (2023-2026).

Publications:

CV on PubMed.

CV on Scopus.

CV on Google Scholar.

ORCID record.

Preprints.


Published
13-02-2026
How to Cite
García-Estañ, J. (2026). Programmatic Evaluation, the end of “all or nothing” in medical education. Spanish Journal of Medical Education, 7(1). https://doi.org/10.6018/edumed.702031