CBME and Assessment
Competency-Based Medical Education is an outcomes-based approach to the design, implementation, assessment, and evaluation of a medical education program, using an organizing framework of competencies.
The International CBME Collaborators, 2009
Traditional versus CBME: Start with System Needs
Frenk J. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010
Structure/Process (traditional): fixed length, variable outcome
• Knowledge acquisition
• Single subjective measure
• Norm-referenced evaluation
• Evaluation setting removed
• Emphasis on summative

Competency-Based Education: variable length, defined outcome
• Knowledge application
• Multiple objective measures
• Criterion-referenced evaluation
• Evaluation setting: direct observation
• Emphasis on formative

Carraccio et al. 2002
The Transition to Competency
Miller’s Assessment Pyramid
• Knows: MCQ exam
• Knows how: extended matching / CRQ
• Shows how: standardized patients
• Does: faculty observation, audits, surveys; impact on patient
Training and Safe Patient Care
Trainee performance* × appropriate level of supervision** must = safe, effective, patient-centered care
*A function of level of competence in context
**A function of attending competence in context
Educational Program

Variable | Structure/Process | Competency-Based
Driving force: curriculum | Content (knowledge acquisition) | Outcome (knowledge application)
Driving force: process | Teacher | Learner
Path of learning | Hierarchical (teacher → student) | Non-hierarchical (teacher ↔ student)
Responsibility for content | Teacher | Student and teacher
Goal of educational encounter | Knowledge acquisition | Knowledge application
Typical assessment tool | Single subjective measure | Multiple objective measures
Assessment tool | Proxy | Authentic (mimics real tasks of the profession)
Setting for evaluation | Removed (gestalt) | Direct observation
Evaluation | Norm-referenced | Criterion-referenced
Timing of assessment | Emphasis on summative | Emphasis on formative
Program completion | Fixed time | Variable time

Carraccio et al. 2002
Assessment “Building Blocks”
• Choice of the right outcomes, tied to an effective curriculum (step 1!)
• The right combination of assessment methods and tools: mini-CEX, DOPS, chart-stimulated recall (CSR), medical record audit
• Effective application of the methods and tools
• Effective processes to produce good judgments
Measurement Tools: Criteria
Cees van der Vleuten’s utility index: Utility = (V × R × A × EI × CE) / Context*
– where:
V = validity
R = reliability
A = acceptability
EI = educational impact
CE = cost-effectiveness
*Context = ∑ microsystems
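The same index written out as a single expression (a rendering of the slide’s notation; treating each criterion as a value scaled between 0 and 1 is an assumption added here, not something the slide specifies):

\[
\mathrm{Utility} \;=\; \frac{V \times R \times A \times EI \times CE}{\mathrm{Context}},
\qquad \mathrm{Context} \;=\; \sum_{i} \mathrm{microsystem}_i
\]

Because the criteria multiply rather than add, a tool that scores near zero on any one criterion (for example, acceptability) has near-zero overall utility no matter how strong the others are.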
Criteria for “Good” Assessment¹
– Validity or coherence
– Reproducibility or consistency
– Equivalence
– Feasibility
– Educational effect
– Catalytic effect
  • The “new” addition: relates to feedback that “drives future learning forward”
– Acceptability
¹Ottawa Conference Working Group 2010
Measurement Model
Donabedian Model (adapted)
• Structure: the way a training program is set up and the conditions under which it is administered
  – Organization, people, equipment, and technology
• Process: the activities that result from the training program
• Outcomes: the changes (desired or undesired) in individuals or institutions that can be attributed to the training program
Structured Portfolio
• ITE (formative only)
• Monthly evaluations
• Mini-CEX
• Medical record audit / QI project
• Clinical question log
• Multisource feedback
• Trainee contributions (personal portfolio)
  o Research project

Trainee
• Review portfolio
• Reflect on contents
• Contribute to portfolio

Program Leaders
• Review portfolio periodically and systematically
• Develop an early warning system
• Encourage reflection and self-assessment

Clinical Competency Committee
• Periodic review: professional growth opportunities for all
• Early warning systems

Program Summative Assessment Process

Licensing and Certification
• Licensure and certification in Qatar
Assessment During Training: Components

Model for Programmatic Assessment (with permission from CPM van der Vleuten)
[Figure: training activities, assessment activities, and supporting activities laid out along a timeline, overseen by an advisor and a committee. Legend: learning task; learning artifact; single assessment data point; single certification data point for mastery tasks; learner reflection and planning; social interaction around reflection (supervision); learning task that is also an assessment task.]
Assessment Subsystem
An assessment subsystem is a group of people who work together on a regular basis to perform evaluation and provide feedback to a population of trainees over a defined period of time
This system has a structure to carry out evaluation processes that produce an outcome
The assessment subsystem must ultimately produce a valid entrustment judgment
Assessment Subsystem
This group shares:
– Educational goals and outcomes
– Linked assessment and evaluation processes
– Information about trainee performance
– A desire to produce a trainee truly competent (at a minimum) to enter practice or fellowship at the end of training
Assessment Subsystem
The subsystem must:
– Involve the trainees in the evaluation structure and processes
– Provide both formative and summative evaluation to the trainees
– Be embedded within, not outside, the overall educational system (assessment is not an “add-on”)
– Provide a summative judgment for the profession and the public
• Effective evaluation = professionalism
Subsystem Components
• Effective leadership
• Clear communication of goals
  – To both trainees and faculty
• Evaluation of competencies is multi-faceted
• Data and transparency
  – Involvement of trainees
  – Self-directed assessment and reflection by trainees
  – Trainees must have access to their “file”
Subsystem Components
• “Competency” committees
  – Need the wisdom and perspectives of the group
• Continuous quality improvement
  – The evaluation program must provide data as part of the CQI cycle of the program and institution
  – Faculty development
• Supportive institutional culture
Multi-faceted Evaluation
[Figure: a structured portfolio maps assessment tools to the six competencies. Tools: medical record audit and QI project; MSF (directed per protocol, twice/year); mini-CEX (10/year); ITE (1/year); faculty evaluations; EBM/question log. Competencies: patient care; medical knowledge; practice-based learning and improvement; systems-based practice; interpersonal skills and communication; professionalism. Legend: ■ trainee-directed, ■ direct observation.]
Structured Portfolio (as above), with trainee, program leader, and clinical competency committee review feeding the program summative assessment process.

Licensing and Certification
• USMLE
• American Board of Medical Specialties
Performance Data
A training program cannot reach its full potential without robust and ongoing performance data:
– Aggregation of individual trainee performance
– Performance measurement of the quality and safety of the clinical care provided by the training institution and the program
Competency Committees
Committees and Information
Evaluation (“competency”) committees can be invaluable:
• Develop group goals
• “Real-time” faculty development
• Key for dealing with difficult trainees
• Key “receptor site” for frameworks/milestones
• Synthesis and integration of multiple assessments
“Wisdom of the Crowd”
• Hemmer (2001): group conversations were more likely to uncover deficiencies in professionalism among students
• Schwind, Acad. Med. (2004):
  – 18% of resident deficiencies requiring active remediation became apparent only through group discussion
  – Average discussion: 5 minutes/resident (range 1 to 30 minutes)
“Wisdom of the Crowd”
• Williams, Teach. Learn. Med. (2005):
  – No evidence that individuals in groups dominate discussions
  – No evidence of ganging up or piling on
• Thomas (2011): group assessment improved inter-rater reliability and reduced range restriction in multiple domains in an internal medicine residency
Narratives and Judgments
• Pangaro (1999): matching students to a “synthetic” descriptive framework (RIME) was reliable and valid across multiple clerkships
• Regehr (2007): matching students to a standardized set of holistic, realistic vignettes improved discrimination of student performance
• Regehr (2012): faculty-created narrative “profiles” (16 in all) produced consistent rankings of excellent, competent, and problematic performance
The “System”
[Figure: assessments within the program (direct observations, audit and performance data, multi-source feedback, simulation, in-training exam) feed a committee of residents, faculty, PDs, and others for judgment and synthesis, with milestones and EPAs as the guiding framework and blueprint. Outputs flow from the institution and program to accreditation (ACGME/RRC, NAS Milestones, with program aggregation) and to certification (ABIM, via ABIM Fastrak, with no aggregation).]
Questions