
The Use of Simulation in the Development of Individual Cognitive Expertise in Emergency Medicine


William Bond, MD, Gloria Kuhn, DO, PhD, Emily Binstadt, MD, MPH, Mark Quirk, EdD, Teresa Wu, MD, Matthew Tews, DO, Parvati Dev, PhD, K. Anders Ericsson, PhD

Abstract

This consensus group from the 2008 Academic Emergency Medicine Consensus Conference, "The Science of Simulation in Healthcare: Defining and Developing Clinical Expertise," held in Washington, DC, May 28, 2008, focused on the use of simulation for the development of individual expertise in emergency medicine (EM). Methodologically sound qualitative and quantitative research will be needed to illuminate, refine, and test hypotheses in this area. The discussion focused on six primary topics: the use of simulation to study the behavior of experts, improving the overall competence of clinicians in the shortest time possible, optimizing teaching strategies within the simulation environment, using simulation to diagnose and remediate performance problems, and transferring learning to the real-world environment. Continued collaboration between academic communities that include medicine, cognitive psychology, and education will be required to answer these questions.

ACADEMIC EMERGENCY MEDICINE 2008; 15:1037–1045. © 2008 by the Society for Academic Emergency Medicine. doi: 10.1111/j.1553-2712.2008.00229.x

Keywords: simulation, cognitive, expertise, deliberate practice, competence, remediation

Education in the specialty of emergency medicine (EM) covers the spectrum of learning, from simple knowledge acquisition to making complex decisions about the value of information.1 As such, it involves many teaching and learning strategies, of which simulation is just one possibility. Questions remain as to the best means of using simulation to achieve greater expertise. The discussion at the consensus conference was structured around a series of questions for future research pertaining to the use of simulation to further acquisition of expertise. This article aims to frame research questions surrounding this topic that are of highest priority and to discuss research techniques that may be applied to their solution.

This discussion, in terms of both the literature cited and the breakout group members, draws from leaders in EM, cognitive psychology, and education. It creates a plan for translational work from principles that have been tested in the psychological laboratory and others developed in educational theory to the simulation laboratory environment, including virtual or computer-based simulation. The underlying psychological and educational principles and assumptions will be explained within each section to the extent needed for clarity.

We begin with the presumption that the relative youth of simulation training techniques in medicine leaves room for high-quality descriptive studies. Such studies should use good observational tools and be aimed at developing future hypotheses when such hypothesis testing is needed.2 We also acknowledge the potentially bidirectional nature of translational research. That is, successful individuals or teams of EM clinicians may need to be brought into the simulation laboratory to study them, thus yielding new observations about good clinical practice and how those abilities are achieved.3

From the Department of Emergency Medicine, Lehigh Valley Hospital and Health Network (WB), Allentown, PA; the Department of Emergency Medicine, Wayne State University (GK), Detroit, MI; the Department of Emergency Medicine, Regions Hospital (EB), St. Paul, MN; the University of Massachusetts Medical School (MQ), Worcester, MA; Graduate Medical Education, Orlando Regional Medical Center, Orlando, the Department of Emergency Medicine, Florida State University, Tallahassee, and the University of Florida School of Medicine (TW), Gainesville, FL; the Department of Emergency Medicine, Medical College of Wisconsin (MT), Milwaukee, WI; Innovation in Learning, Inc. (PD), Los Angeles, CA; and the Department of Psychology, Florida State University (KAE), Tallahassee, FL.

Received July 7, 2008; revision received July 9, 2008; accepted July 9, 2008.

This is a proceeding from a workshop session of the 2008 Academic Emergency Medicine Consensus Conference, "The Science of Simulation in Healthcare: Defining and Developing Clinical Expertise," Washington, DC, May 28, 2008.

Address for correspondence and reprints: William Bond, MD; e-mail: [email protected].

The task before the breakout group was to define areas of research that would be helpful to graduate medical educators in EM. The nature of expertise is such that to define it you must often enter a discourse on its assessment. We endeavor to discuss assessment only when necessary, so that our work will not be redundant relative to the group focused on the assessment of expertise. Assessment at the specialty level is often concerned with meeting a minimum standard of performance. One of our questions specifically addresses the issue of finding and bolstering weak areas of performance to reach such a minimum standard. However, the primary concern of the consensus group is to have our learners achieve the highest level of expertise possible and continually maintain and improve that expertise.

1. How can simulation be combined with other strategies, such as "think aloud" or interview (reflection and self-assessment), to effectively identify expert behavior (including problem-solving and decision-making), thus creating training targets (benchmarks) for junior learners?

Social learning theory, developed by Bandura,4 would suggest that EM residents learn some of their most powerfully imprinted behaviors from attending physicians who model behavior in the emergency department (ED). The impact of such modeling through observation has been sparsely studied in medical professionalism.5 In other areas of resident education, the impact of modeling remains understudied. Clearly, the opportunity to create vicarious experiences that are focused on any aspect of medical care is one of the great strengths of simulation. Simulation may be staged to meet specific goals6 or can be conducted by experts and interrupted by learners so that they can understand the thought processes of the expert clinician.

In an effort to understand the underlying clinical judgment or decision-making process of an experienced physician, the apprenticeship model often falls short. Evidence exists that critical care experience leads doctors to more proactive interventions rather than reactive responses.7 But how are these proactive behaviors learned? The expert may not have time to explain his or her thought process in the clinical setting, may leave out many of the factors considered, and may simply be unable to explain that process at the appropriate level of simplicity for the given learner.8

Several authors have now suggested that part of expertise is the decision to move from "System One," or heuristic-guided and intuitive thinking, to "System Two," or analytical and hypothesis-driven thinking (forward-thinking).9–12 This concept is framed by Moulton et al.13 as "slowing down when you should." The expert uses pattern recognition to be efficient at the mundane, typical patient presentations, but is able to recognize when the pieces do not fit in the clinical picture and changes into a more attentive analytical mode of thought. Experts move fluidly between these two abilities. It should be noted that this concept of shifting between modes of thought is questioned by some who believe that experts with superior performance develop more refined representations of the current situation, which give them increased ability to control, evaluate, and anticipate outcomes with prolonged practice.14,15 These representations provide working memory support for metacognitive strategies used by expert clinicians to successfully navigate complex cases. The question remains as to the expert's ability to recognize (at a conscious level) and explain his or her thinking to the learner or, alternatively, how the expert can facilitate the development of more refined representations in the learner.16

One option for providing clearer targets for learners is to better analyze the thought processes of experts within their domain of expertise. This might be done within the care environment with the proper training of the expert. However, given the nature of EM practice, with a high volume and acuity of patients, this might prove quite challenging. Alternatively, we can create a simulation laboratory environment that provides cues similar to those of the clinical environment and train our experts in verbal protocol techniques whereby they "think aloud" and reveal their thought processes. Here we must make the distinction between reflective techniques that allow time for metacognitive processes to be called upon and true spontaneous verbal protocol analysis, where thoughts are verbalized immediately. If time is allowed during the "think aloud" to reflect, recall and other biases may alter or filter the information that is obtained.14,17 Our hope is that experts will reveal, through verbal protocol analysis, the steps they are using to solve the clinical problem so that the issues can be reconstructed at a level of detail that would be useful to a learner. During the "debriefing" session with learners, experts can reflect upon their own verbalizations and add reflection to the learning process.16
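To make the verbal protocol idea concrete, the sketch below tallies coded think-aloud utterances by category for each expert. Everything here is a hypothetical illustration: the category labels, the transcript format, and the data are invented, and real verbal protocol analysis requires trained raters and the methodology described by Ericsson and Simon.17

```python
from collections import Counter

# Hypothetical coded think-aloud transcript: each utterance has already
# been assigned a reasoning category by a trained rater (labels invented).
coded_utterances = [
    ("expert_01", "cue_recognition"),
    ("expert_01", "hypothesis_generation"),
    ("expert_01", "hypothesis_testing"),
    ("expert_01", "cue_recognition"),
    ("expert_02", "hypothesis_generation"),
    ("expert_02", "self_monitoring"),
    ("expert_02", "cue_recognition"),
]

def category_profile(utterances):
    """Count coded categories per participant, giving a crude picture
    of which reasoning steps each expert verbalized most often."""
    profiles = {}
    for participant, category in utterances:
        profiles.setdefault(participant, Counter())[category] += 1
    return profiles

for participant, counts in category_profile(coded_utterances).items():
    total = sum(counts.values())
    print(participant, {c: round(n / total, 2) for c, n in counts.items()})
```

Comparing such profiles across experts solving the same simulated case is one way the "steps they are using" might be summarized for reconstruction into teaching material.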

Military simulators have shown that one can increase overall expertise with a complex task by breaking it down into smaller parts and practicing them individually.18 This same methodology may be helpful once the details can be specified and potentially separated in complex medical decisions. Multiple EM expert clinicians might be studied solving the same problem in the simulation laboratory setting to find those strategies that appear most reliable or diagnostically successful. The reconstructed solution may effectively become a "worked example" in our specific domain of expertise.

2. Can simulation-based training (SBT) produce more competent physicians (not just in procedures), or can we reach a given level of competency in a shorter time? What is the influence of simulation on the learning curve?

Competency can be defined as the ability to do something well, especially if acquired through experience or training, measured against a standard. Over the past few years, there has been a renewed focus on competency-based assessment and the use of simulation to train and assess complex skills. The Accreditation Council for Graduate Medical Education (ACGME)'s Outcome Project19 has highlighted the need for graduate medical education (GME) programs to concentrate on the actual learning outcomes of residents and not just the educational structure and process of the training programs. This is a shift from evaluating a program's potential to educate to an evaluation of how much that program's residents are actually learning.20

One key point is that experience is a necessary, but not sufficient, condition for expertise. That is, one may have years of experience and may even be regarded as an expert, yet not function very expertly when tested in objective settings.21,22 Experience, in and of itself, can be a very valuable teacher when it leads to immediate feedback and generates knowledge within a meaningful context.23 Research in the aviation, military, and workforce literature has shown that deliberate practice (DP) and guided experiences promote accurate and efficient mental models that facilitate the learning of complex tasks.24 Experience provides trainees with higher confidence levels when responding to previously experienced situations.25 Experience must be coupled with assessment, debriefing, reflection, and further experience to be most effective. Also, as has been demonstrated across multiple domains of expertise,26,27 it is difficult if not impossible to achieve expertise in a domain without years of practice within that domain.

Is the experience of practicing EM the same as DP? To draw upon the sports analogy, is playing golf by playing the whole course at once like performing EM in the ED environment? EM bedside practice does create some opportunities for immediate feedback (for example, an immediate procedural complication), and patient follow-up after ED visits is one excellent example of delayed feedback whose utility has only recently been studied and appreciated.28 However, we would argue that bedside clinical practice is usually not DP with feedback. DP with feedback in golf would be hitting from the same spot on the course multiple times, while videotaping, and then reviewing the tape with an expert and returning to the same spot to try corrections. The real ED apprenticeship allows no such replay opportunities, but the simulation setting may. Time constraints in the ED setting dictate that thought processes often stop after the disposition decision has been made, rather than searching for the true cause of the medical problem. Primarily due to time constraints, even in the educational setting, simulations are not often repeated, but this may be a missed golden opportunity.

While it is unlikely we could shorten the time frame to expertise, we could ensure through simulation that a certain variety of cases are seen within the allowed training time, leading to better overall competence. SBT exercises enable educators to provide these structured and dynamic experiences within the time constraints of any training program.29,30 With SBT, trainees can encounter both the high-severity/high-frequency patient cases and the high-severity/low-frequency scenarios to develop the competencies required before they are certified to practice independently.
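The learning-curve question above also lends itself to simple quantitative treatment. As a minimal sketch, assuming (hypothetically) that repeated simulation sessions yield checklist error counts that follow the classic power law of practice, the fitted exponent gives a learning rate that could be compared between a simulation-trained group and a control group; the data below are synthetic.

```python
import numpy as np

# Hypothetical checklist error counts for one trainee across
# repeated simulation sessions (synthetic data for illustration).
sessions = np.arange(1, 9)
errors = np.array([12, 9, 7, 6, 5, 5, 4, 4], dtype=float)

# Power law of practice: errors ~ a * sessions**(-b).
# Taking logs turns this into a line we can fit by least squares.
slope, intercept = np.polyfit(np.log(sessions), np.log(errors), 1)
a, b = np.exp(intercept), -slope

print(f"fitted curve: errors = {a:.1f} * session^(-{b:.2f})")
# The exponent b summarizes how fast errors fall with practice;
# comparing b across training conditions is one way to quantify
# the influence of simulation on the learning curve.
```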

To date, there is very little literature available highlighting the development and acceleration of global and specific competencies using SBT in EM. Although some of the following issues have been touched upon briefly in small-scale studies in the literature, very little data are available from large-scale validated trials. Future research projects on SBT and competency should address the following questions:

• What defines an expert versus one who is merely competent enough to practice?
• Which competencies are best targeted and taught via SBT exercises?
• Can SBT accelerate acquisition of expertise in the clinical arena by providing learners with experiences that are normally limited by chance of exposure?
• Do we need to change the paradigm of GME and frontload residents' education with simulation-based learning, so they are ready to be clinically competent later in training?
• How can understanding natural biases/cognitive errors influence the design of simulation training?

3. What is the optimal teaching strategy for simulation cases? When should they be brief so that they are more reproducible and get to a very clear set of objectives? When should they be longer and more complex so as to be realistic? What are the preferred debriefing techniques for EM training?

Curriculum developers can create simulations to achieve specific objectives in a comprehensive and developmental manner.31 They can be organized and considered in relation to learner levels, defined as benchmarks tied to specific competencies. The level of complexity of the objective will help determine the nature of the required simulation exercise. Educators can combine simulations to developmentally advance the learner through the progression from knowledge to behavior change in a spiral curriculum, in which certain themes are repeated and broadened with more knowledge, skills, and appropriate attitudes being established as the learner develops.32 Repetition, self-assessment, and the opportunity for feedback together provide the cornerstone for DP as defined by Ericsson.33
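As a small illustration of the spiral idea, the sketch below generates a schedule in which every theme recurs once per training year at an increasing level of complexity. The themes, levels, and yearly mapping are hypothetical placeholders, not a published EM curriculum.

```python
# Hypothetical themes and complexity levels for a spiral curriculum:
# each theme is revisited at every level, so knowledge and skills are
# repeated and broadened as the learner develops.
themes = ["airway management", "undifferentiated shock", "altered mental status"]
levels = ["knowledge review", "partial task practice", "full scenario with debriefing"]

schedule = [
    (year, level, theme)
    for year, level in enumerate(levels, start=1)
    for theme in themes
]

for year, level, theme in schedule:
    print(f"Year {year}: {theme} -- {level}")
```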

Expected behaviors should be broken down into meaningful parts, specific objectives defined, and the most appropriate simulation method applied. Consider the example of intubation. The desired "set of behaviors" that comprise the performance could be demonstrated, and learners could practice the application of new knowledge in a rudimentary way, using screen-based simulations. Studies have demonstrated that participation in screen-based simulations with debriefing improves performance in high-fidelity simulations.34 Partial task simulations could be used to facilitate practice with relatively straightforward "micro" skills, such as induction of anesthesia. These exercises could be brief, stand-alone (even self-learning) partial task simulation exercises that utilize practice without debriefing. Self-assessment and reflection may be built into the exercise to reinforce behavior and help ensure replication in a more complex high-fidelity or team training situation.

Simulation exercises can be effectively debriefed with or without the use of video. Simulations that are videotaped allow learners to self-critique performance in a more "objective" manner by seeing what actions they have taken.35,36 Checklists can be used with note-taking during observation, and specific behaviors identified and discussed. Debriefing can focus on the behaviors and thought processes at the time of the encounter and/or the participants' subsequent reflections (thoughts about thoughts, or "metacognition," and feelings). The think-aloud technique can be used during the encounter to observe cognitive processes as they occur in real time.37 This technique has been used for some time in cognitive therapy and research, and it has been referred to as "articulated thoughts during simulated situations."38 Rudolph et al.39 broaden the scope of debriefing to include "not only the trainees' actions, but also the meaning-making systems of the trainees such as their frames, assumptions, and knowledge." Simulations may also be debriefed in writing by individuals using narrative.16,40 The written technique is flexible in that it can be "asynchronous" and allows more time for metacognitive reflection.

In sum, short simulations without debriefing may be useful in teaching (or self-learning) a specific skill or subskill, but they are certainly ineffective in teaching self-assessment of complex procedures that require advanced clinical problem-solving. For the latter, metacognitive debriefing should focus not only on the thought processes as they occur, but also on thoughts about one's knowledge of options, as well as how, when, and why to use those options. Using the example from Rudolph, the anesthesiologist who is focused exclusively on using a bag-mask apparatus to resuscitate an unresponsive patient would need to assess whether or not he or she knows 1) that passive oxygenation or delivering mouth-to-mask rescue breaths are options (declarative knowledge), 2) how to do it (procedural knowledge), and 3) why he or she did not choose to use alternatives in this situation (contextual knowledge).16 It is clear that as task complexity increases, for example, along the dimensions outlined by Xiao et al.,36 debriefing time and complexity must increase also. Using Xiao's dimensions, metacognitive analysis could effectively be introduced to reflect on the adequacy of planning and adapting to the process of managing uncertainty. Debriefing could be complemented with "prebriefing" to help learners develop a sense of reflecting before action: anticipation.

Self-assessment during a simulation and after a simulation in a debriefing is the primary antidote for inappropriate use of heuristics and resulting poor performance or error. Developing this capability and its use is the most effective and efficient method of shortening the time frame to expertise. This can enhance DP by providing the awareness and opportunity to establish new goals in action and "avoid the arrested development of automaticity."33 From a learning perspective, the real purpose of reflection during simulation debriefing is to enable the learner to engage in self-assessment. Reflection alone will not necessarily help the learner improve the accuracy of "confidence judgments" that are integral to performance improvement.41 To improve performance through self-assessment, reflections should compare performance, as viewed by self and others, to accepted criteria. Developing the ability to elicit and understand the perspectives of others will not only enhance learning and self-assessment, but also improve the patients' perception of the care they receive.42

Self-assessment can be fostered during debriefing. Many current models of debriefing begin with a descriptive recollection of what happened.43 This should be expanded to include the learners' perception of how they believe it went (positives/negatives) and how it might have been improved. In groups, this could be done quickly through contemplation or in writing (a quick plus/delta [+/Δ] card). Then, everyone involved can share their perspectives. Ultimately, the other perspectives should be used as a "test of validity" of self-assessments and progress toward achieving goals. A summary of what went well (reinforcement) and what did not should be included, as well as an action plan for change. This process of "shared guidance" could be required for long periods to promote the acquisition of self-assessment skills that can be activated during experience.33,44 The literature demonstrates that physicians in training and practice perform poorly on self-assessment activities.45,46 There is evidence to suggest that metacognitive understanding that includes reflection on behavior and thoughts of self and others helps validate self-assessment and improve associated performance.47–49
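Since the cited literature finds self-assessment poorly calibrated, one simple check during "shared guidance" is to compare learners' self-ratings against instructor ratings of the same performance. The sketch below computes a mean bias and a correlation on hypothetical paired ratings; the 1–5 scale and the numbers are invented for illustration.

```python
import statistics

# Hypothetical paired ratings (1-5 scale) for one simulation case:
# each tuple is (learner self-rating, instructor rating).
paired = [(4, 3), (5, 3), (3, 3), (4, 2), (2, 3), (4, 4), (5, 4)]

self_ratings = [s for s, _ in paired]
other_ratings = [o for _, o in paired]

# A positive bias means learners rate themselves higher than the
# external observer does; the correlation indicates whether self and
# other ratings at least rank performances the same way.
bias = statistics.mean(s - o for s, o in paired)
r = statistics.correlation(self_ratings, other_ratings)  # Python 3.10+

print(f"mean self-minus-other bias: {bias:+.2f}")
print(f"self/other correlation:     {r:.2f}")
```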

• How can we structure prebriefing to teach anticipation?
• How can we use reflection to improve teaching about how to handle clinical uncertainty and complexity?
• When and how is video review most effective for debriefing?
• What are the benefits of written debriefing, and when can it be used to foster metacognitive development?
• By deeply analyzing complex simulation cases experienced, can learners begin to anticipate the cues they need to diagnose and selectively look for the cues?
• How can "shared guidance" with instructors or fellow residents be used with simulation to foster self-assessment?
• How do we teach residents how to learn in the simulation environment?
• What debriefing techniques lead to effective reflection?
• What methods would train the learner to properly self-assess their simulation efforts so that instructor-led feedback can be reduced?
• Is there a way to make individualized learning through simulation feasible? What level of individualization is needed for effectiveness?
• What methods are needed to turn simulation into true DP with feedback? Is high-fidelity simulation more effective in delivering DP, or can less expensive models provide similar outcomes?
• Is there an ideal time to run certain SBT scenarios? Is it early on in training or after some clinical experience and knowledge acquisition?
• Interruptions may reduce realism, but the EM environment requires doctors to suddenly focus on a new patient or a new problem. What are the optimal strategies to allow the learner to confront an emergency, pause, and redirect, yet maintain realism?


• What are some examples of truly "integrated" simulation curricula that developmentally address learning objectives related to knowledge, use of knowledge, skills, and metacognitive capabilities?

4. Can simulation be used to diagnose learning deficits and performance problems? Can simulation be used to diagnose/pinpoint the cause of poor performance? What assessment tools would adequately identify the performance deficits?

The current model used in graduate medical education training programs is based on time-in-grade rather than proof of competency. Educators are faced with residents who show varying abilities, knowledge, and competency, despite being in the same year of training. A partial explanation for this is that each resident has a unique background of experiences based on the patients they have seen and treated.50 Additionally, people vary in their learning speed in dynamic environments,51,52 and some residents require additional time for learning. Other learners are overwhelmed by the amount and variability of stimuli and find it impossible to cope with the rapidly changing work environment.53 A proportion of residents will require additional training to reach a mastery level of learning.54,55 This being the case, a group of residents in the same year of training will show a bell-shaped pattern of performance.

Owing to this variability, it is difficult to find a sharp cutoff between residents who demonstrate satisfactory performance and those who lag in demonstrating improvement. Adding to the problem of judging adequacy of performance is the fact that a "very good" resident may perform poorly when seeing an unusual patient problem or even a commonplace problem not previously encountered. In contrast, the "problem" resident may perform in a satisfactory or even exemplary manner on that very same case if the resident has encountered it in the past. Therefore, determining that a resident needs closer supervision, additional training time, or a formal remediation program is very difficult and may prove impossible to defend if challenged by the resident. Prior studies also note that self-assessment of performance is poor in both medical students56 and physicians,45 and there is no reason to believe that residents are any more accurate.
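The difficulty of drawing a sharp cutoff can be shown with a toy calculation. If simulation checklist scores within a training year are treated (hypothetically) as roughly bell-shaped, a rule that flags residents more than some number of standard deviations below the cohort mean changes its verdict with small changes in threshold; the scores below are invented.

```python
import statistics

# Hypothetical simulation checklist scores (percentage of items
# completed) for residents in the same year of training.
scores = {"res_a": 82, "res_b": 74, "res_c": 91, "res_d": 60,
          "res_e": 78, "res_f": 85, "res_g": 64}

mean = statistics.mean(scores.values())
sd = statistics.stdev(scores.values())

for k in (1.0, 1.5, 2.0):  # cutoffs in standard deviations below the mean
    flagged = sorted(r for r, s in scores.items() if s < mean - k * sd)
    print(f"cutoff mean - {k} SD: flag {flagged}")
# With these data, a 1.0 SD rule flags two residents and a 1.5 SD rule
# flags none: small shifts in the threshold change who is flagged,
# which is one reason a single cutoff is hard to defend if challenged.
```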

This challenging task of the faculty is compounded by the fact that performance standards vary across residencies. To agree that there is a "problem with performance," the acceptable level of performance expected from the novice, intermediate, or experienced resident needs to be standardized. The Model of the Clinical Practice of Emergency Medicine has gained acceptance as the standardized content area of EM,57 and model curricula for simulation31,58 have also been published in the EM literature, but specific educational objectives and their implementation are defined by each residency program. Specific competency requirements for yearly promotion during training have not been described or agreed upon at a national level. Further research will be required to standardize SBT for use in promotion or graduation competencies, as well as to diagnose residents falling behind with performance problems.

Determining why the resident is having problems may be very difficult. First, a performance problem must be differentiated from an underlying learning difficulty that impairs the resident's ability to learn. Such a difficulty may go undetected until neuropsychological testing is performed, although this accounts for a small number of physicians.59,60 In residents without actual learning difficulties, a plethora of causes for poor performance exist, ranging from inadequate acquisition of knowledge during medical school, to inability to transfer learning from the classroom to the workplace, to lack of self-confidence in decision-making, to name just a few possibilities. For all of the above reasons, faculty often have difficulty in 1) agreeing that a resident has performance problems, 2) pinpointing the exact cause of the learning failure, and 3) justifying their decision that a problem exists if challenged.

Can high-fidelity simulation:

• Aid in determining that there is a performance problem?
• Diagnose the cause of the performance problem as:
— Lack of knowledge?
— Need for additional time to reach mastery?
— Lack of self-confidence?

Can high-fidelity simulation aid in correcting performance problems by:

• Varying the time allowed for performance and gradually decreasing time for decision-making?
• Allowing residents to see the results of actions taken in a safe environment?
• Assisting residents to improve self-confidence in decision-making?

Many researchers have noted the value of simulation to determine if learners have difficulty with situation awareness because of incorrect perception of stimuli, failure to understand the meaning of stimuli, or inability to predict future events based on the actions taken during the situation.53,61–63
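These three failure types map naturally onto query-based situation awareness probes administered while a scenario is paused. The sketch below scores such probes by level; the probe questions, answers, and scoring are hypothetical and simplified relative to validated situation awareness measurement techniques.62

```python
from collections import defaultdict

# Hypothetical situation awareness probes asked while the scenario is
# paused. Each probe targets one level: perception of stimuli,
# comprehension of their meaning, or projection of future events.
probes = [
    ("perception",    "What was the last blood pressure?",         True),
    ("perception",    "What rhythm is on the monitor?",            False),
    ("comprehension", "Why is the patient hypotensive?",           True),
    ("comprehension", "Which current finding is most concerning?", False),
    ("projection",    "What will happen if no fluids are given?",  False),
]

results_by_level = defaultdict(list)
for level, _question, answered_correctly in probes:
    results_by_level[level].append(answered_correctly)

for level, results in results_by_level.items():
    pct = 100 * sum(results) / len(results)
    print(f"{level:13s}: {pct:5.1f}% correct")
# A learner who handles perception probes but misses projection probes
# shows a different deficit than one who misperceives the stimuli.
```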

Can high-fidelity simulation:

• Demonstrate that a learner has difficulty with situation awareness?
• Aid in defining the exact type of problem with situation awareness?
• Demonstrate that a learner only has difficulty with situation awareness in new, infrequently encountered, or extremely complex cases, rather than all cases?

5. Can simulation be effectively used as a remediation tool? Can simulation provide an opportunity for practice to overcome the diagnosed problem with performance? Can simulation be used to assess the efficacy of prescribed remediation programs to overcome performance deficits?

Remediation is defined as "an organized effort to assist physicians less than competent in one or more areas of practice to recognize their deficiency and return to competence."64 The use of simulation in the remediation of residents has not been described, and only a few articles outline its use for students65–69 and practicing physicians.70,71 Without clearly defined competencies, the need for remediation becomes difficult to explain and defend. The simulation literature describes the trend toward development of reliable and validated objective assessment tools for certain resident competencies.72–75 These assessments not only provide a framework for determining competency, but could also assist in identifying areas of resident deficiencies and provide a framework in which a remediation program could be developed, focusing on competency expectations for different levels of training.76,77

After competencies and educational objectives have been agreed on, a stepwise approach is needed to develop a remediation program.78 The steps include: 1) diagnosing the problem, 2) developing a program tailored to the learner's needs, and 3) determining if remediation has been successful.

• What is the role of simulation in diagnosis of the problem?
— Can simulation identify appropriate competencies for level of training?
— Which performance problems are best diagnosed by simulation?
— Does the type of fidelity (high or low) affect the accuracy and validity of diagnosis?

Repeated practice, without feedback and reflection, is insufficient for residents having performance problems.27 Feedback on both correct and incorrect actions must be given as part of the remediation program. In fact, if repeated performance of the same error occurs, it may solidify the performance error.79 Simulation gives the instructor or expert the chance to directly observe behaviors. Verbalizing reasoning during performance allows instructors to examine the resident's thought processes. Think-aloud verbal protocol analysis, while labor-intensive, might aid in diagnosing the source of difficulty. This methodology has been used to examine expertise among physicians.80,81 Simulation and practice under the supervision of an expert in a controlled setting, which is extremely costly in terms of faculty resources, has not been compared to bedside teaching; however, it has been shown that few residents are directly observed by faculty when they see patients, and direct observation would be needed in both settings for comparison.82

• Development of a remediation program:
— Can simulation be used alone or as part of a program to remediate a performance problem?
— What type of simulation is most effective for remediation?
— How can independent practice and simulation be successfully combined?
• Determining successful remediation:
— Can scenarios be developed that reliably lead to learner improvement?
— Does successful remediation in the simulation lab transfer to clinical practice?
— Can simulation demonstrate learner mastery of the required material?

6. Can we think of an effective methodology and measurable outcomes in EM that prove that transfer of learning to the real environment has occurred?

Research has shown that residents trained in performance of procedures by use of simulation models can transfer the taught skills to the workplace.83–86 More complex tasks create a greater challenge for the demonstration of improved performance. Funding organizations are concerned that while we may be able to document improved performance in the simulation laboratory, there may be factors that prevent this improvement in the clinical setting. While certainly other questions in the field of EM simulation need to be answered, during the next decade we will need to address whether learning via simulation translates to enhanced clinical effectiveness.

Outcome measures can be generated from review of videos from actual clinical scenarios before and after the learning intervention. However, it can be difficult to devise a method to capture specific types of patient presentations considering the real-time variability in the ED. Certain types of patient scenarios (e.g., trauma activations, acute stroke patients, obstetric patients, pediatrics) may be triaged to certain rooms in the ED, which might make video recordings of these actual clinical situations more manageable. Review of in situ simulations could also prove some elements of transfer. Alternatively, standardized patients (SPs) could appear in the clinical environment, with providers (the study participants) blinded to the SP's true identity. SPs could use checklists,87 tallies, timing intervals,88 global assessments, and affective responses as outcomes. Performance assessments of physicians by SPs have found high concordance with other measures of performance.89–92 If SPs are used, it is important to validate their ratings with a criterion standard (either experts in the field or actual patients). Correlations with direct observations of performance in some fashion, despite their labor intensity, seem warranted.
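Validating SP ratings against a criterion standard is, at bottom, an agreement analysis. A minimal sketch, assuming hypothetical binary checklist data (1 = item done, 0 = not done) from an SP and an expert reviewing the same encounter, is Cohen's kappa, which corrects raw agreement for chance:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same binary items:
    observed agreement corrected for agreement expected by chance."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_a = sum(rater_a) / n  # proportion of items rater A marked done
    p_b = sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Hypothetical checklist scored by a standardized patient and by an
# expert criterion rater reviewing the same encounter.
sp_ratings     = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
expert_ratings = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]

print(f"kappa = {cohens_kappa(sp_ratings, expert_ratings):.2f}")  # 0.52 here
```

Low agreement on such a check would argue for retraining the SP raters before their scores are used as transfer outcomes.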

CONCLUDING REMARKS

Clearly, there is substantial room for hypothesis development and qualitative work to refine hypotheses. Hypothesis testing will evolve as important questions come to light that are worthy of the resources required. The form of hypothesis testing might be the traditional randomized trial, the crossover trial with staggered training intervals, or the cluster-randomized design for large-scale efforts. Cross-disciplinary collaboration, such as the type demonstrated during this consensus conference, will lead to more credible hypotheses that are worthy of funding. It is our hope that these proceedings will help guide researchers to fruitful endeavors in medical education research.
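For the cluster-randomized design mentioned above, the unit of randomization is the training program rather than the individual resident. A minimal sketch of a balanced, seeded allocation (the site names and seed are hypothetical):

```python
import random

# Hypothetical residency programs serving as clusters; whole programs,
# not individual residents, are assigned to an arm.
sites = ["site_a", "site_b", "site_c", "site_d", "site_e", "site_f"]

rng = random.Random(2008)  # fixed seed so the allocation is reproducible
shuffled = sites[:]
rng.shuffle(shuffled)

half = len(shuffled) // 2
allocation = {site: "simulation arm" for site in shuffled[:half]}
allocation.update({site: "usual training" for site in shuffled[half:]})

for site in sites:
    print(f"{site}: {allocation[site]}")
# Analysis would then need to account for clustering (residents within
# a program are more alike than residents across programs).
```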

References

1. Bloom B. Taxonomy of Educational Objectives: Handbook I: The Cognitive Domain. New York, NY: David McKay Co. Inc., 1956.


2. Marincola F. In support of descriptive studies; relevance to translational research. J Transl Med. 2007; 5:1–3.
3. Ericsson KA, Williams AM. Capturing naturally occurring superior performance in the laboratory: translational research on expert performance. J Exp Psychol Appl. 2007; 13:115–23.
4. Bandura A. Social Learning Theory. Upper Saddle River, NJ: Prentice Hall, 1976.
5. Kumar AS, Shibru D, Bullard MK, Liu T, Harken AH. Case-based multimedia program enhances the maturation of surgical residents regarding the concepts of professionalism. J Surg Educ. 2007; 64:194–8.
6. Gisondi MA, Smith-Coggins R, Harter PM, Soltysik RC, Yarnold PR. Assessment of resident professionalism using high-fidelity simulation of ethical dilemmas. Acad Emerg Med. 2004; 11:931–7.
7. Young JS, Stokes JB, Denlinger CE, Dubose JE. Proactive versus reactive: the effect of experience on performance in a critical care simulator. Am J Surg. 2007; 193:100–4.
8. Salas E, Klein G (eds). Linking Expertise and Naturalistic Decision Making. Mahwah, NJ: Lawrence Erlbaum Associates, 2001.
9. Croskerry P. Critical thinking and reasoning in emergency medicine. In: Croskerry P, Crosby KS, Schenkel SM, Wears RL (eds). Patient Safety in Emergency Medicine. Philadelphia, PA: Lippincott Williams & Wilkins, 2008 (in press).
10. Young JS, Smith RL, Guerlain S, Nolley B. How residents think and make medical decisions: implications for education and patient safety. Am Surg. 2007; 73:548–53.
11. Patel VL, Evans DA, Kaufman DR. Reasoning strategies and the use of biomedical knowledge by medical students. Med Educ. 1990; 24:129–36.
12. Crespo KE, Torres JE, Recio ME. Reasoning process characteristics in the diagnostic skills of beginner, competent, and expert dentists. J Dent Educ. 2004; 68:1235–44.
13. Moulton CA, Regehr G, Mylopoulos M, MacRae HM. Slowing down when you should: a new model of expert judgment. Acad Med. 2007; 82(10 Suppl):S109–16.
14. Ericsson K. Protocol analysis and expert thought: concurrent verbalizations of thinking during experts' performance on representative tasks. In: Ericsson K, Charness N, Feltovich P, Hoffman R (eds). Cambridge Handbook of Expertise and Expert Performance. Cambridge, UK: Cambridge University Press, 2006, pp 223–42.
15. Ericsson K, Kintsch W. Long-term working memory. Psychol Rev. 1995; 102:211–45.
16. Quirk M. Intuition and Metacognition in Medical Education. New York, NY: Springer, 2006.
17. Ericsson KA, Simon HA. Protocol Analysis: Verbal Reports as Data. Cambridge, MA: Bradford Books/MIT Press, 1993.
18. Sohn MH, Douglass SA, Chen MC, Anderson JR. Characteristics of fluent skills in a complex, dynamic problem-solving task. Hum Factors. 2005; 47:742–52.
19. The ACGME Outcome Project. Accreditation Council for Graduate Medical Education. Available at: http://www.acgme.org/outcome/project/proHome.asp. Accessed June 2008.
20. Scalese R, Obeso V, Issenberg B. Simulation technology for skills training and competency assessment in medical education. J Gen Intern Med. 2007; 23(Suppl 1):46–59.
21. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med. 2005; 142:260–73.
22. Ericsson KA, Whyte JT, Ward P. Expert performance in nursing: reviewing research on expertise in nursing within the framework of the expert-performance approach. Adv Nurs Sci. 2007; 30:E58–71.
23. Satish U, Streufert S. Value of a cognitive simulation in medicine: towards optimizing decision making performance of healthcare personnel. Qual Saf Health Care. 2002; 11:163–7.
24. Tannenbaum SI, Yukl G. Training and development in work organizations. Annu Rev Psychol. 1992; 43:399–441.
25. Chase WG, Simon HA. Perception in chess. Cogn Psychol. 1973; 4:55–81.
26. Ericsson K. The influence of experience and deliberate practice on the development of superior expert performance. In: Ericsson K, Charness N, Feltovich P, Hoffman R (eds). Cambridge Handbook of Expertise and Expert Performance. Cambridge, UK: Cambridge University Press, 2006, pp 685–706.
27. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004; 79(10 Suppl):S70–81.
28. Sadosty AT, Stead LG, Boie ET, Goyal DG, Weaver AL, Decker WW. Evaluation of the educational utility of patient follow-up. Acad Emerg Med. 2004; 11:715–9.
29. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005; 27:10–28.
30. Salas E, Priest HA, Wilson KA. Scenario-based training: improving military mission performance and adaptability. In: Britt T, Adler A, Castro C (eds). Military Life: The Psychology of Serving in Peace and Combat. Westport, CT: Praeger Security International, 2006, pp 32–53.
31. McLaughlin SA, Doezema D, Sklar DP. Human simulation in emergency medicine training: a model curriculum. Acad Emerg Med. 2002; 9:1310–8.
32. Harden R, Stamper N. What is a spiral curriculum? Med Teach. 1999; 21:141–3.
33. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008; 15:988–94.

34. Schwid HA, Rooke GA, Michalowski P, Ross BK. Screen-based anesthesia simulation with debriefing improves performance in a mannequin-based anesthesia simulator. Teach Learn Med. 2001; 13:92–6.

35. Mackenzie C, Xiao Y, Horst R. Video task analysis in high performance teams. Cogn Technol Work. 2004; 6:139–47.
36. Xiao Y, Hunter WA, Mackenzie CF, Jefferies NJ, Horst RL. Task complexity in emergency medical care and its implications for team coordination. LOTAS Group. Level One Trauma Anesthesia Simulation. Hum Factors. 1996; 38:636–45.
37. Davison G, Navarre S, Vogel R. The articulated thoughts in simulated situations paradigm: a think-aloud approach to cognitive assessment. Curr Dir Psychol Sci. 1995; 4:29–33.
38. Davison G, Robins C, Johnson M. Articulated thoughts during simulated situations: a paradigm for studying cognition in emotion and behavior. Cognit Ther Res. 1983; 7:17–39.
39. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006; 1:49–55.
40. Petranek C. Written debriefing: the next vital step in learning with simulations. Simul Gam. 2000; 31:109–18.
41. Son LK, Schwartz BL. Relation between metacognitive monitoring and control. In: Perfect TJ, Schwartz BL (eds). Applied Metacognition. Cambridge, UK: Cambridge University Press, 2002.
42. Quirk M, Mazor K, Haley HL, et al. How patients perceive a caring attitude. Patient Educ Couns. 2008; in press.
43. Fanning R, Gaba D. The role of debriefing in simulation-based learning. Simul Healthc. 2007; 2:115–25.
44. ten Cate O, Snell L, Mann K, Vermunt J. Orienting teaching toward the learning process. Acad Med. 2004; 79:219–28.
45. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006; 296:1094–102.
46. Gruppen LD, Garcia J, Grum CM, et al. Medical students' self-assessment accuracy in communication skills. Acad Med. 1997; 72(Suppl 10):S57–9.
47. Greer M. When intuition misfires. APA Monitor. 2005; 36:58–60.
48. Perkins DN, Grotzer TA. Teaching intelligence. Am Psychol. 1997; 52:1125–33.
49. Zimmerman B. Self-regulation involves more than metacognition: a social cognitive perspective. Educ Psychol. 1995; 30:217–21.
50. Norman G. Building on experience: the development of clinical reasoning. N Engl J Med. 2006; 355:2251–2.
51. Gonzalez C. Learning to make decisions in dynamic environments: effects of time constraints and cognitive abilities. Hum Factors. 2004; 46:449–60.
52. Gonzalez C. Task workload and cognitive abilities in dynamic decision making. Hum Factors. 2005; 47:92–101.
53. Endsley M. Toward a theory of situation awareness in dynamic systems. Hum Factors. 1995; 37:32–64.
54. Wayne D. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008; 3:48–54.
55. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006; 21:251–6.
56. Eva KW, Cunnington JP, Reiter HI, Keane DR, Norman GR. How can I know what I don't know? Poor self-assessment in a well-defined domain. Adv Health Sci Educ Theory Pract. 2004; 9:211–24.
57. Hockberger RS, Binder LS, Graber MA, et al. The model of the clinical practice of emergency medicine. Ann Emerg Med. 2001; 37:745–70.
58. Binstadt ES, Walls RM, White BA, et al. A comprehensive medical simulation education curriculum for emergency medicine residents. Ann Emerg Med. 2007; 49:495–504.
59. Turnbull J, Carbotte R, Hanna E, et al. Cognitive difficulty in physicians. Acad Med. 2000; 75:177–81.
60. Turnbull J, Cunnington J, Unsal A, Norman G, Ferguson B. Competence and cognitive difficulty in physicians: a follow-up study. Acad Med. 2006; 81:915–8.
61. Adams M, Tenney Y, Pew R. Situation awareness and the cognitive management of complex systems. Hum Factors. 1995; 37:85–104.
62. Endsley MR. Measurement of situation awareness in dynamic systems. Hum Factors. 1995; 37:65–84.
63. Gaba DM, Howard SK, Small SD. Situation awareness in anesthesiology. Hum Factors. 1995; 37:20–31.
64. Rosner F, Balint JA, Stein RM. Remedial medical education. Arch Intern Med. 1994; 154:274–9.
65. Goulet F, Gagnon R, Gingras ME. Influence of remedial professional development programs for poorly performing physicians. J Contin Educ Health Prof. 2007; 27:42–8.
66. Goulet F, Jacques A, Gagnon R. An innovative approach to remedial continuing medical education, 1992–2002. Acad Med. 2005; 80:533–40.
67. Hanna E, Premi J, Turnbull J. Results of remedial continuing medical education in dyscompetent physicians. Acad Med. 2000; 75:174–6.
68. Lin CT, Barley GE, Cifuentes M. Personalized remedial intensive training of one medical student in communication and interview skills. Teach Learn Med. 2001; 13:232–9.
69. Reamy BV, Harman JH. Residents in trouble: an in-depth assessment of the 25-year experience of a single family medicine residency. Fam Med. 2006; 38:252–7.
70. Haskvitz LM, Koop EC. Students struggling in clinical? A new role for the patient simulator. J Nurs Educ. 2004; 43:181–4.
71. Rosenblatt MA, Abrams KJ. The use of a human patient simulator in the evaluation of and development of a remedial prescription for an anesthesiologist with lapsed medical skills. Anesth Analg. 2002; 94:149–53.

72. Brett-Fleegler MB, Vinci RJ, Weiner DL, Harris SK, Shih MC, Kleinman ME. A simulator-based tool that assesses pediatric resident resuscitation competency. Pediatrics. 2008; 121:e597–603.

73. Murray DJ, Boulet JR, Avidan M, et al. Performance of residents and anesthesiologists in a simulation-based skill assessment. Anesthesiology. 2007; 107:705–13.
74. Schwid HA, Rooke GA, Carline J, et al. Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study. Anesthesiology. 2002; 97:1434–44.
75. Boulet JR, Murray D, Kras J, Woodhouse J, McAllister J, Ziv A. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003; 99:1270–80.
76. Girzadas DV Jr, Clay L, Caris J, Rzechula K, Harwood R. High fidelity simulation can discriminate between novice and experienced residents when assessing competency in patient care. Med Teach. 2007; 29:452–6.
77. Murray DJ, Boulet JR, Kras JF, McAllister JD, Cox TE. A simulation-based acute skills performance assessment for anesthesia training. Anesth Analg. 2005; 101:1127–34.
78. Hauer KE, Teherani A, Irby DM, Kerr KM, O'Sullivan PS. Approaches to medical student remediation after a comprehensive clinical skills examination. Med Educ. 2008; 42:104–12.
79. Couture M, Lafond D, Tremblay S. Learning correct responses and errors in the Hebb repetition effect: two faces of the same coin. J Exp Psychol Learn Mem Cogn. 2008; 34:524–32.
80. Norman GR, Brooks LR, Allen SW. Recall by expert medical practitioners and novices as a record of processing attention. J Exp Psychol Learn Mem Cogn. 1989; 15:1166–74.
81. Patel V, Arocha J, Zhang J. Thinking and reasoning in medicine. In: Holyoak K, Morrison R (eds). The Cambridge Handbook of Thinking and Reasoning. Cambridge, UK: Cambridge University Press, 2005, pp 727–50.
82. Burdick WP, Schoffstall J. Observation of emergency medicine residents at the bedside: how often does it happen? Acad Emerg Med. 1995; 2:909–13.
83. Berg DA, Milner RE, Fisher CA, Goldberg AJ, Dempsey DT, Grewal H. A cost-effective approach to establishing a surgical skills laboratory. Surgery. 2007; 142:712–21.
84. Scott DJ, Bergen PC, Rege RV, et al. Laparoscopic training on bench models: better and more cost effective than operating room experience? J Am Coll Surg. 2000; 191:272–83.
85. Taitz J, Wyeth B, Lennon R, et al. Effect of the introduction of a lumbar-puncture sticker and teaching manikin on junior staff documentation and performance of paediatric lumbar punctures. Qual Saf Health Care. 2006; 15:325–8.
86. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002; 236:458–63.
87. Knudson MM, Khaw L, Bullard MK, et al. Trauma training in simulation: translating skills from SIM time to real time. J Trauma. 2008; 64:255–63.
88. Cioffi J, Purcal N, Arundell F. A pilot study to investigate the effect of a simulation strategy on the clinical decision making of midwifery students. J Nurs Educ. 2005; 44:131–4.
89. Beaulieu MD, Rivard M, Hudon E, Saucier D, Remondin M, Favreau R. Using standardized patients to measure professional performance of physicians. Int J Qual Health Care. 2003; 15:251–9.
90. Boulet JR, Ben-David MF, Ziv A, et al. Using standardized patients to assess the interpersonal skills of physicians. Acad Med. 1998; 73(10 Suppl):S94–6.
91. Ram P, van der Vleuten C, Rethans JJ, Grol R, Aretz K. Assessment of practicing family physicians: comparison of observation in a multiple-station examination using standardized patients with observation of consultations in daily practice. Acad Med. 1999; 74:62–9.
92. van Zanten M, Boulet JR, McKinley D. Using standardized patients to assess the interpersonal skills of physicians: six years' experience with a high-stakes certification examination. Health Comm. 2007; 22:195–205.
