

VIEW FROM THE ASSOCIATION OF PEDIATRIC PROGRAM DIRECTORS

Towards Meaningful Outcomes Assessment: Collaborative Efforts in Pediatric Medical Education

Ann E. Burke, MD; Patricia J. Hicks, MD; Carol Carraccio, MD, MA

Association of Pediatric Program Directors, Department of Pediatrics, Wright State University Boonshoft School of Medicine, Dayton, Ohio (Dr Burke); Association of Pediatric Program Directors, The Children’s Hospital of Philadelphia, Department of Pediatrics, The Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pa (Dr Hicks); Director, Competency-Based Assessment Programs, American Board of Pediatrics, Chapel Hill, NC (Dr Carraccio)

Address correspondence to Ann E. Burke, MD, Dayton Children’s Medical Center, Medical Education Department, One Children’s Plaza, Dayton, OH 45419 (e-mail: [email protected]).

Received for publication January 19, 2012; accepted January 19, 2012.

ACADEMIC PEDIATRICS 2012;12:79–80 (Volume 12, Number 2, March–April 2012). Copyright © 2012 by Academic Pediatric Association.

“STANDARDIZATION OF LEARNING outcomes and individualization of the learning process” is the first recommended goal in Cooke, Irby, and O’Brien’s Educating Physicians: A Call for Reform of Medical School and Residency.1 Indeed, the measurement of medical education outcomes is being recognized as increasingly important by all stakeholders. Variation in graduate medical education program structure and content resulting from program innovation2 and Accreditation Council for Graduate Medical Education (ACGME) duty hour and program requirement changes3 has heightened calls for outcome measures that are not related to time-in-seat, postgraduate year level, or other chronologic milestones.

A demonstration of competence in performance before changing roles and responsibilities is now recognized as critical for many training transitions, including moving from the directly supervised role of an early intern, to one who can function with indirect supervision, to one who can supervise a team. The ACGME Outcome Project4 called for assessment data with sufficient validity evidence to allow inferences to be made about individual resident functioning and thus deliberately aligns responsibilities, training privileges, and progression with demonstrated performance. The ACGME has called for each specialty to develop milestones5 that could be used to demonstrate appropriate progression in the development of residents and alignment of patient care responsibilities with the competence required to carry out those responsibilities.

The Pediatrics Milestones Working Group, with input from the Association of Pediatric Program Directors (APPD) membership, has written and published the first draft, The Pediatrics Milestone Project: A Joint Initiative of the ACGME and the American Board of Pediatrics (ABP) (see https://www.abp.org/abpwebsite/publicat/milestones.pdf).

The greatest challenge to providing meaningful outcomes is the lack of assessment instruments that yield data with high validity and reliability, providing a means to inform the learner and the program about the learner’s level of achievement. An additional challenge, reported by respondents to a 2010 APPD member survey (unpublished data), is a gap in program leader knowledge and skills regarding assessment in medical education.

Informed by this survey, the Program Directors Committee of the ABP and the APPD partnered to write Assessment in Graduate Medical Education: A Primer for Pediatric Program Directors.6 This document, offered to all pediatric program directors in printed form and to the public as a PDF, e-book, and Kindle version on the ABP Program Directors webpage,7 was edited by assessment expert Alan Schwartz, PhD. The primer is divided into 2 sections: the first addresses foundational concepts and principles of assessment, and the second applies these theoretical concepts and principles to the 6 general competencies. One goal of the second section is to offer examples and considerations for the design and critique of assessment instruments as they are used in the authentic setting of pediatric residency education. The APPD Task Forces are excited about this new work and plan to bring each of the chapters in the second section to life in the form of educational workshops. A tangible outcome of these workshops will be instructional materials, such as slide sets, teaching cases, and example assessment instruments, that can be disseminated throughout the pediatric medical community through posting on the APPD Share Warehouse.8

The development and implementation of meaningful assessment instruments for medical education will require a combination of assessment expertise and significant faculty training to identify and consistently score learner performance. Implementation of meaningful assessment will also need to consider the complex behavioral factors associated with leading this necessary change.9 Careful consideration is critical in any successful innovative change, particularly if that innovation addresses a challenge or problem.10 Looking carefully at the problem of assessment, symptoms are abundant: a lack of observational assessment instruments proven to yield data with high validity evidence,11 faculty rater bias,12 limited faculty skills in direct observational assessment,13 and gaps in resources and policies to support medical education assessment, including support and incentives towards promotion and advancement for faculty participating in direct observations.

The APPD is engaged in efforts to develop meaningful assessment instruments and is open to assessment methods that have not previously been explored in pediatrics. Clusters of activities, grouped according to their functionality in carrying out specific tasks within a specialty, may provide an understandable and practical framework for assessment. Such clusters, referred to as entrustable professional activities (EPAs),14 can be used to describe core work-related functioning for each specialty. An example EPA would be caring for a healthy newborn, which requires competence in performing a cluster of activities related to caring for the infant, communicating with the mother, and transferring care to the community pediatrician. Within each cluster, behaviors and activities relate to separate competencies and subcompetencies, each of which has relatively limited meaning in isolation; their combination, however, provides a real-world example of meaningful activities important to competence in practice.

Another example of an EPA for Pediatric Primary Care iscare of the uncomplicated general pediatric inpatient.15

Specific activities that contribute to the advancement of assessment in pediatric medical education include the development of the APPD’s Longitudinal Educational Assessment Research Network (APPD LEARN), APPD engagement with the Pediatrics Milestones Project, and the Initiative for Innovation in Pediatric Education.16,17 Through these activities and wide membership involvement, our community has the capability to lead change in pediatric medical education assessment by providing meaningful data about learner outcomes, program effectiveness, and learner readiness to progress towards competence at the next career stage (unsupervised practice) and, ultimately, the improved care of patients.

REFERENCES

1. Cooke M, Irby DM, O’Brien BC. Educating Physicians: A Call for Reform of Medical School and Residency. San Francisco, Calif: Jossey-Bass; 2010.

2. Initiative for Innovation in Pediatric Education (IIPE) website. Available at: http://www.innovatepedsgme.org. Accessed January 19, 2012.

3. ACGME. New Common Requirements. 2010. Available at: http://acgme-2010standards.org/pdf/Common_Program_Requirements_07012011.pdf. Accessed January 27, 2012.

4. ACGME. The ACGME Outcome Project: An Introduction. Available at: http://www.acgme.org/acWebsite/navPages/nav_commonpr.asp. Accessed February 13, 2012.

5. Nasca T. CEO’s Welcoming Address at the ACGME Annual Educational Conference. 2010. Available at: http://www.acgme.org/acWebsite/newsReleases/newsRel_3_23_10_1.asp. Accessed January 19, 2012.

6. Schwartz A, ed. Assessment in Graduate Medical Education: A Primer for Pediatric Program Directors. Chapel Hill, NC: American Board of Pediatrics; 2011. Available at: https://www.abp.org/abpwebsite/publicat/primer.pdf.

7. American Board of Pediatrics Program Directors Page. 2011. Available at: https://www.abp.org/ABPWebStatic/?anticache=0.34619953106819084#murl%3D%2FABPWebStatic%2Fprogdirector.html%26surl%3D%2Fabpwebsite%2Fprogramdirectors%2Fprogramdirectors.htm. Accessed January 19, 2012.

8. APPD. APPD Share Warehouse. 2008. Available at: http://appd.org/ed_res/share_warehouse.cfm. Accessed January 19, 2012.

9. Michie S, van Stralen MM, West R. The behavior change wheel: a new method for characterizing and designing behavior change interventions. Implementation Sci. 2011;6:1–11.

10. Kanter SL. Towards better descriptions of innovations. Acad Med. 2008;83:703–704.

11. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees. JAMA. 2009;302:1316–1326.

12. Kogan JR, Conforti L, Bernabeo E, et al. Opening the black box of clinical skills assessment via observation: a conceptual model. Med Educ. 2011;45:1048–1060.

13. Holmboe E. Faculty and the observation of trainees’ clinical skills: problems and opportunities. Acad Med. 2004;79:16–22.

14. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82:542–547.

15. Carraccio C, Burke AE. Beyond competencies and milestones: adding meaning through context. J Grad Med Educ. 2010;2:419–422.

16. Jones MD, McGuinness GA, First LR, et al. Linking process to outcome: are we training pediatricians to meet evolving health care needs? Pediatrics. 2009;123:S1–S7.

17. Jones MD. Innovation in residency education and evolving pediatric health needs. Pediatrics. 2010;125:1–3.