
Filling in the gaps of clerkship with a comprehensive clinical skills curriculum

Pamela Veale · Julie Carson · Sylvain Coderre · Wayne Woloschuk · Bruce Wright · Kevin McLaughlin

Received: 2 August 2013 / Accepted: 3 February 2014
© Springer Science+Business Media Dordrecht 2014

Abstract  Although the clinical clerkship model is based upon sound pedagogy, including theories of social learning and situated learning, studies evaluating the clinical performance of residents suggest that this model may not fully meet the learning needs of students. Here our objective was to design a curriculum to bridge the learning gaps of the existing clerkship model and then evaluate its impact on performance on clerkship summative evaluations. We followed Kern's framework to design our curriculum and then compared performance on the clerkship objective structured clinical examination (OSCE), all summative clerkship multiple choice question (MCQ) examinations, and the Medical Council of Canada Qualifying Examination (MCCQE) Part 1 before and after the introduction of our curriculum. In the 2 years following the introduction of our clinical skills curriculum the mean score on the clerkship OSCE was significantly higher than in the 2 years prior to our curriculum [67.12 (5.3) vs. 62.44 (4.93), p < 0.001, d = 0.91]. With the exception of the surgical clerkship MCQ, performance on all clerkship summative MCQ examinations and the MCCQE Part 1 was significantly higher following the introduction of our curriculum. In this study we found a significant improvement in the performance of clerks on summative evaluations of knowledge and clinical skills following the introduction of our clinical skills curriculum. Given the unpredictable nature of clinical rotations, the clerkship will always be at risk of failing to deliver the intended curriculum, so medical schools should continue to explore and evaluate ways of changing the delivery of clerkship training to improve learning outcomes.

Keywords  Clerkship · Clinical skills · Curriculum

P. Veale · J. Carson · S. Coderre · W. Woloschuk · B. Wright · K. McLaughlin (corresponding author)
Office of Undergraduate Medical Education, Health Sciences Centre, University of Calgary, 3330 Hospital Drive NW, Calgary, AB T2N 4N1, Canada
e-mail: [email protected]

Adv in Health Sci Educ, DOI 10.1007/s10459-014-9496-6

Introduction

During their undergraduate training, medical students transition from preclinical learning experiences, primarily didactic and small group learning, to the immersive experience of clerkship, where they transfer their knowledge to the clinical setting and hone their clinical skills. The clinical clerkship model is based upon theories of social learning and situated learning, and should therefore be the ideal preparation for further training in residency and beyond (Brown et al. 1989; Miller and Dollard 1941; Tulving and Thomson 1973). Or at least this is what we had assumed until data began to emerge highlighting significant deficiencies in the clinical skills of residents (Fred 2005; Mangione and Nieman 1997; Mangione and Nieman 1999; Mangione 2001). So why does our current clerkship model fail to provide the type of learning experiences necessary to develop clinical skills?

Finding deficiencies in trainees' clinical skills does not necessarily imply that the objectives of clerkship are misguided and/or our clerkship teachers are unskilled (Heidenreich et al. 2000; Irby 1995; Sutkin et al. 2008). A more likely explanation for this problem is the inescapable truth that training opportunities during clerkship are unpredictable and conditional. First, we need patients with appropriate clinical findings who are willing to be examined for the benefit of students, and then we need preceptors who have the time and motivation to create this learning opportunity. But, when rounding on in-patients we don't usually have a stable patient with a pleural rub or critical aortic stenosis, and even when we do we rarely have the time to allow each of our learners to perform the appropriate physical examination and then provide them with feedback on their performance. Thus, all too often valuable learning experiences are compromised by the need to deliver efficient clinical care (Irby et al. 2004; Neher et al. 1992).

Prompted by an accreditation review of our program by the Liaison Committee on Medical Education (LCME) in 2009, we sought to address some of the deficiencies of our clerkship model at the University of Calgary. Our objectives in this study were first to design a curriculum to bridge the learning gaps of the existing clerkship model and, second, to evaluate the impact of this curriculum on performance on clerkship summative evaluations. We followed Kern's framework to create and implement a clinical skills curriculum that runs alongside the clinical clerkship (Kern et al. 1998). Herein we describe how we completed Kern's six-step process for curricular design: problem identification and general needs assessment; targeted needs assessment; goals and objectives; educational strategies; implementation; and evaluation and feedback. Having designed our curriculum, we then used a pre/post study design to compare performance on knowledge and clinical skills evaluations for cohorts before and after the introduction of our clinical skills curriculum to evaluate the impact of our new curriculum on learning outcomes.

Method

Participants

Our participants included medical students, recent graduates, and clerkship teachers at the University of Calgary. We have a 3-year undergraduate curriculum, in which the first 2 years consist of a pre-clerkship Clinical Presentation Curriculum and the final year is the clinical clerkship (Mandin et al. 1995). We have two clerkship streams: the rotation-based clerkship, where students rotate between seven mandatory rotations (Emergency Medicine, Family Medicine, Internal Medicine, Obstetrics and Gynecology, Pediatrics, Psychiatry, and Surgery), and the Rural Integrated Community Clerkship, where students spend 9 months in a primary care setting covering the clinical presentations from each discipline, and complete their clerkship with rotations in Internal Medicine and Pediatrics (McLaughlin et al. 2011). In addition to clinical experiences, each of the clerkship rotations also has a formal teaching curriculum, which is primarily in the form of didactic teaching. Prior to initiation, our study was approved by the Conjoint Health Research Ethics Board at the University of Calgary.

Participants who contributed to the needs assessment for our curriculum were clerkship program directors, evaluation coordinators, clerks (from the graduating class of 2010), and first year residents who had graduated in the class of 2009. The participants who helped us study the impact of our curriculum on learning outcomes were students from the two graduating classes preceding the introduction of our curriculum (classes of 2009 and 2010, n = 290) and the two classes following the introduction of the curriculum (classes of 2011 and 2012, n = 346).

Materials

The data used to guide our curriculum included the 2009 LCME accreditation documents, the Graduate Questionnaire and student log books for the classes of 2007 and 2008, and questionnaires and focus groups during which we asked participants to suggest changes to the content and delivery of the clinical clerkship that would improve the quality of the learning experience. As a measure of baseline academic performance we compared the mean performance on all summative evaluations prior to clerkship for students from the classes of 2009 and 2010 to that of students from the classes of 2011 and 2012. To assess the impact of our curriculum on learning outcomes we used students' scores on the summative objective structured clinical examination (OSCE) and clerkship summative multiple choice question (MCQ) examinations for the classes of 2009–12, in addition to the Medical Council of Canada Qualifying Examination (MCCQE) Part 1. During the 4 years of observation there were no major changes in the content or minimum performance level of the OSCE or local MCQ examinations.

Procedure

When creating our new curriculum we followed the six-step process for curricular design described by Kern et al. (1998). For step 1 (problem identification and general needs assessment) we used the accreditation report and supplemented this with results of student log books and data from the Graduate Questionnaire. For step 2 (targeted needs assessment) we used the detailed normative and perceived needs assessment from the key stakeholders in the undergraduate medical education program (including students, curriculum planners, and teachers) that formed part of the accreditation report, and supplemented these data by administering questionnaires followed by focus group interviews of clerkship program directors, evaluation coordinators, clerks, and first year residents who had recently graduated from our undergraduate program. The theme for our questionnaire and focus group interviews was ways to improve the learning experience during clerkship. We conducted two focus groups with clerkship directors and evaluation coordinators and two focus groups with clerks and residents. In each case no new themes were identified in the second focus group, from which we inferred saturation of themes.


To evaluate the impact of our new curriculum on learning outcomes we used a pre/post study design to compare performance of cohorts from the 2 years before and the 2 years after the introduction of our curriculum on all summative evaluations of knowledge during clerkship, in addition to the summative clerkship OSCE.

Statistical analyses

We performed thematic analysis of the questionnaire data and the transcripts of our focus groups to identify areas for improvement in clerkship. Two researchers (JC and KM) extracted themes independently before reaching a consensus on the major themes and categories. We used data from the questionnaires for data source triangulation of the focus group data (Thurmond 2001). We used an independent sample t test, with Cohen's d as a measure of effect size, to compare scores for the two cohorts on pre-clerkship evaluations and on clerkship evaluations before and after the introduction of our curriculum (Cohen 1988). We categorized effect sizes according to the thresholds suggested by Cohen (1988): small (d = 0.20), medium (d = 0.5), and large (d = 0.8). We used STATA version 11.0 (StataCorp LP, College Station, TX) for our statistical analyses.
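
The analyses were run in STATA; purely as an illustration of the comparison described above, the following Python sketch reproduces the same calculation (an independent-samples t test with Cohen's d based on the pooled standard deviation). The function names and the use of scipy are our own illustrative choices and are not the authors' code.

# Illustrative sketch only (not the authors' STATA analysis): compare two
# independent cohorts of exam scores with a t test and Cohen's d.
import numpy as np
from scipy import stats

def cohens_d(pre: np.ndarray, post: np.ndarray) -> float:
    """Cohen's d using the pooled standard deviation of the two cohorts."""
    n1, n2 = len(pre), len(post)
    pooled_sd = np.sqrt(((n1 - 1) * pre.std(ddof=1) ** 2 +
                         (n2 - 1) * post.std(ddof=1) ** 2) / (n1 + n2 - 2))
    return (post.mean() - pre.mean()) / pooled_sd

def compare_cohorts(pre_scores, post_scores):
    pre = np.asarray(pre_scores, dtype=float)
    post = np.asarray(post_scores, dtype=float)
    t, p = stats.ttest_ind(pre, post)  # independent-samples t test
    d = cohens_d(pre, post)
    # Cohen's (1988) descriptive thresholds: 0.2 small, 0.5 medium, 0.8 large
    label = "large" if abs(d) >= 0.8 else "medium" if abs(d) >= 0.5 else "small"
    return t, p, d, label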

Results

Steps involved in creating a clinical skills curriculum in clerkship

Step 1: problem identification and general needs assessment

The LCME accreditation identified two clerkship-related concerns: ED-2, which states that "…faculty must monitor student experience and modify it as necessary to ensure that the objectives of the clinical education program will be met"; and ED-27, which requires "…ongoing assessment that assures students have acquired and can demonstrate on direct observation the core clinical skills…" (http://www.lcme.org/functions.pdf, June 2008). Reviewing the available data on our clerkship curriculum, we articulated the problem with our existing model as inconsistent observation and teaching of clinical skills in important clinical presentations.

Step 2: targeted needs assessment of learners

Qualitative analysis of our questionnaire and focus group interviews identified ten problems with our existing clerkship model, which were grouped into the themes of content and delivery. Content-related deficiencies included: inconsistent exposure to some clinical presentations (Accreditation Standard ED-2), clinical skills training, procedural skills training, teaching in diagnostics and therapeutics, teaching in basic science (including microbiology), teaching in chronic disease management, and training in conflict resolution. Delivery-related deficiencies included: limited direct observation of clinical skills (Accreditation Standard ED-27), block learning rather than dispersed learning, and over-reliance on didactic teaching.


Step 3: goals and objectives

For each component of our curriculum we articulated specific learning objectives, including the desired changes in knowledge, skills, and/or attitudes as a result of the planned learning experiences. Consistent with the problem identified in step 1, however, the a priori goal of our curriculum was that, following the introduction of the clinical skills curriculum, ratings of students' knowledge and clinical skills would increase.

Step 4: educational strategies

When we considered ways of addressing the deficiencies of our existing clerkship model, we opted for a curriculum to run alongside clerkship rather than trying to change each clerkship rotation. This was because we considered that some of the problems identified might not be remediable during busy clerkship rotations. For example, some conditions are seasonal, so it is not possible to provide cases of croup for students who complete their Pediatrics rotation during the summer months; in teaching hospitals residents typically have priority in performing the limited number of procedural skills available, thus limiting training opportunities for clerks; and a clerkship model based on rotation blocks is not designed to create dispersed learning (Glenberg and Lehmann 1980; Sisti et al. 2007).

Based upon the needs assessment, we decided that our curriculum must be comprehensive, should be dispersed, and that learning experiences should, where possible, adhere to the principles of "deliberate practice" (Ericsson and Lehmann 1994; Glenberg and Lehmann 1980; Kerfoot et al. 2007; Raman et al. 2010; Sisti et al. 2007). We began by identifying all mandatory clinical presentations for clerkship, after which we created a blueprint for our curriculum to ensure that each of these presentations would be encountered (Coderre et al. 2009). We then selected the most appropriate delivery format for each presentation. For example, to cover emergency presentations, such as acute onset chest pain and/or dyspnea, we selected training on a human patient simulator. For presentations emphasizing either communication skills (e.g., depression, family violence) or physical examination skills (e.g., hypertension, joint pain) we used standardized patients, whereas we used computerized virtual patients for presentations dealing with abnormal laboratory tests (e.g., hyperkalemia, anemia). Training on human patient simulators, task trainers (for procedural skills), standardized patients, and virtual patients involved direct observation of performance with feedback and the opportunity for repeated practice (Ericsson and Lehmann 1994). Learning experiences were primarily in a small group or team-based learning format (Michaelsen et al. 2002). Table 1 shows the type of learning experience for each content area. As clerks are on a variety of rotations at any point in their clerkship year, the timing of our curricular content did not coincide with the clinical experiences of clerkship and was thus dispersed (Kerfoot et al. 2007; Raman et al. 2010). With the exception of conflict resolution, we also introduced formative evaluations for each section to provide feedback and enhance learning (Roediger and Karpicke 2006).
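
As a purely illustrative sketch of the kind of presentation-to-format blueprint described above: the presentation names come from the text, but the data structure, names, and lookup function are our own assumptions, not the authors' actual blueprint.

# Hypothetical blueprint entries mapping mandatory clinical presentations to a
# planned delivery format (illustrative only; not the authors' blueprint).
BLUEPRINT = {
    "acute onset chest pain / dyspnea": "human patient simulator",
    "depression": "standardized patient",            # communication skills
    "family violence": "standardized patient",       # communication skills
    "hypertension": "standardized patient",          # physical examination
    "joint pain": "standardized patient",            # physical examination
    "hyperkalemia": "computerized virtual patient",  # abnormal laboratory test
    "anemia": "computerized virtual patient",        # abnormal laboratory test
}

def delivery_format(presentation: str) -> str:
    """Look up the planned learning format for a clinical presentation."""
    return BLUEPRINT.get(presentation, "unassigned: review against blueprint")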

Step 5: implementation

We introduced our curriculum beginning with the class of 2011. Since then, on alternate Friday afternoons from 12 to 5 pm, clerks leave their clerkship rotations to attend the clinical skills curriculum. The schedule for each session is shown in the "Appendix". With the exception of clerkships giving up 5 % of their scheduled time to our curriculum, the format of the clinical clerkship has not changed.


Step 6: evaluation and feedback

At the end of the first year of our curriculum we gathered feedback in the form of questionnaires for each component of our curriculum, in addition to exit focus groups. Based upon these data we have made minor revisions for future iterations of the curriculum.

The impact of the clinical skills curriculum on learning outcomes

For students from graduating classes of 2009 and 2010 the mean score (SD) on pre-clerkship summative evaluations was 79.62 % (6.89), which was not significantly different from the mean score for students from classes of 2011 and 2012 (80.54 % (5.07), p = 0.187). In the 2 years following the introduction of our curriculum the mean score (SD) on the summative clerkship OSCE was significantly higher than in the 2 years prior to our curriculum (67.12 (5.3) vs. 62.44 (4.93), p < 0.001, d = 0.91). With the exception of the surgical clerkship MCQ, performance on all summative MCQ examinations (including the MCCQE Part 1) was significantly higher following the introduction of our curriculum. These data are shown in Table 2.
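
As a back-of-the-envelope check (ours, not part of the original analysis), the reported OSCE effect size can be reproduced from the published means and standard deviations alone; the cohort sizes are ignored here, which changes the pooled standard deviation only marginally.

# Reproduce the reported OSCE effect size (d = 0.91) from the published
# means and SDs, using an equal-weight pooled SD as an approximation.
from math import sqrt

pre_mean, pre_sd = 62.44, 4.93
post_mean, post_sd = 67.12, 5.30
pooled_sd = sqrt((pre_sd ** 2 + post_sd ** 2) / 2)
d = (post_mean - pre_mean) / pooled_sd
print(round(d, 2))  # prints 0.91, matching the reported value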

Discussion

According to Aristotle, "what we have to learn to do, we learn by doing". For this reason, in addition to being congruent with dominant theories of learning, the clinical clerkship model seems appropriate for training senior medical students to become residents and, ultimately, practicing physicians (Brown et al. 1989; Miller and Dollard 1941; Tulving and Thomson 1973). But, based upon data highlighting deficiencies in the clinical skills of residents, graduating medical students appear to have learning needs that are not being met by the current clerkship model (Fred 2005; Mangione and Nieman 1997; Mangione and Nieman 1999; Mangione 2001). These gaps do not necessarily imply that the objectives of the clinical clerkship are misguided. A more likely explanation is that the unpredictable nature of clinical practice, where the supply of clinical findings is erratic and other demands frequently take priority over clinical skills training for clerks (e.g., the primacy of providing patient care and meeting the learning needs of residents), causes the delivered clerkship curriculum to deviate from the intended curriculum (Cuban 1992).

Table 1  Educational strategies for delivering content

Content area                    Didactic   Small group   Team-based learning
Diagnostics and therapeutics       X                              X
Virtual patients                               X
Simulation                                     X
Standardized patients                          X
Procedural skills                              X
Chronic disease management                                        X
Patient safety                                                    X
Conflict resolution                                               X


Motivated by an accreditation review, we analyzed our clerkship curriculum and identified areas where the intended curriculum was not being delivered predictably or effectively. Based upon this needs assessment, we then designed a comprehensive clinical skills curriculum to try and bridge the learning gaps of the existing clerkship model. Working within the existing timeframe for clerkship, and keeping the same objectives and evaluations of learning outcomes, we changed the delivery of the learning experiences by reducing the clinical experience by 5 % and replacing this with a clinical skills curriculum. When we evaluated learning outcomes after changing our delivery model we found significantly better performance on knowledge and clinical skills evaluations, suggesting that in the revised clerkship model students are more likely to meet the learning objectives of clerkship.

As there were multiple interventions involved in our clinical skills curriculum, it is not possible for us to tease out which of these facilitated improved performance. Based upon the existing education literature, we could speculate on the relative contributions of adding dispersed learning (Kerfoot et al. 2007; Raman et al. 2010), technology-enhanced learning (Cook et al. 2010, 2011), and test-enhanced learning (Roediger and Karpicke 2006) to a large dose of deliberate practice (Ericsson and Lehmann 1994), but in reality the success of our curriculum is more likely to be due to the process of curriculum design that allowed us to identify our specific problems and then devise solutions to target these (Kern et al. 1998). According to performance indicators, the LCME, and our students, our previous curriculum had significant gaps, including failure to meet accreditation standards ED-2 and ED-27, and our revised curriculum is helping to bridge these gaps.

Our study has some limitations that we should highlight. The pre/post study design used to evaluate learning outcomes is more susceptible to biases, such as allocation bias and performance bias, than a randomized controlled trial. We used performance on summative evaluations in clerkship to gauge the impact of our curriculum, but improved performance on these evaluations does not guarantee improved clinical performance in residency and beyond, which, one could argue, should be the goal of an undergraduate curriculum. Each medical school faces different challenges, so our clerkship solution might not address the problems of other schools. For example, we have a 3-year undergraduate curriculum and some of the deficiencies that we identified may not be so obvious in a longer undergraduate program (although the previously published data on deficiencies in clinical skills were not restricted to three-year curricula) (Mangione and Nieman 1997, 1999; Mangione 2001). Similarly, we cannot claim that our curriculum is the optimum way to improve clinical skills training in our medical school, as there are many questions that our study does not address. For example, what is the ideal balance of clinical experience and clinical skills training? What is the best learning experience in which to learn different clinical skills (e.g., is training on a virtual patient with chest pain as effective as a high fidelity simulator or standardized patient)? Clearly there is still a long way to go before we can describe the optimum clinical skills curriculum to complement the clinical training of clerkship.

Table 2  Student performance on knowledge evaluations before and after the introduction of the clinical skills curriculum

Content                      Pre-curriculum (n = 257)   Post-curriculum (n = 332)   p value   Cohen's d
                             Mean       SD              Mean       SD
Emergency medicine           81.96      6.36            84.08      6.08             <0.001    0.34
Family medicine              75.92      6.17            77.22      6.16             0.01      0.21
Internal medicine            73.54      7.8             78.74      7.64             <0.001    0.67
Obstetrics and gynecology    74.87      8.89            76.64      5.43             0.005     0.24
Pediatrics                   72.13      7.04            79.26      6                <0.001    1.09
Psychiatry                   83.85      4.8             86.49      4.64             <0.001    0.56
Surgery                      74.78      6.54            75.13      6.52             0.5       0.05
MCCQE Part 1                 514.85     64.73           536.55     64.74            <0.001    0.34

Implications for medical education

Real clinical experiences are, and should be, the core learning experience in clerkship. Yet the traditional clerkship model appears to fall short of meeting the learning needs of students. It is unlikely that our medical school is alone in having struggled with meeting accreditation standards and providing students with the types of learning experiences during clerkship that allow them to develop their clinical skills. The supply of teaching resources, such as available clinical teachers with willing patients who have good clinical findings, is unpredictable in all clinical rotations, so the clinical clerkship will always be at risk of failing to deliver the intended clerkship curriculum. As such, each medical school should consider and evaluate ways of changing the delivery of clerkship training to improve learning outcomes. In this study we have described how we designed a clinical skills curriculum to run alongside the clinical rotations and how performance on clerkship evaluations improved following the introduction of this curriculum. Our curriculum is clearly not a panacea, but we hope that the description of how we devised it might help others struggling to deliver their clerkship curriculum.

Appendix

See Table 3.

Table 3  Schedule for curricular content

Time           Topic
12.00–12.45    Diagnostics and therapeutics (n = 160)
13.00–14.45    Virtual patients (n = 80)    Simulation (n = 10)
                                            Standardized patients (n = 20)
                                            Procedural skills (n = 10)
                                            Diagnostics and therapeutics OR
                                            Chronic disease management OR
                                            Patient safety OR
                                            Conflict resolution OR
                                            Back to basic science OR
                                            Formative evaluations (n = 40)
15.00–16.45    Groups switch                Groups switch


References

Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18, 32–42.

Coderre, S., Woloschuk, W., & McLaughlin, K. (2009). Twelve tips for blueprinting. Medical Teacher, 31, 322–324.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Cook, D. A., Erwin, P. J., & Triola, M. M. (2010). Computerized virtual patients in health professions education: A systematic review and meta-analysis. Academic Medicine, 85, 1589–1602.

Cook, D. A., Hatala, R., Brydges, R., Zendejas, B., Szostek, J. H., Wang, A. T., et al. (2011). Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. Journal of the American Medical Association, 306, 978–988.

Cuban, L. (1992). Curriculum stability and change. In P. W. Jackson (Ed.), Handbook of research on curriculum. New York: Macmillan.

Ericsson, K. A., & Lehmann, A. C. (1994). Expert performance: Its structure and acquisition. American Psychologist, 49, 725–747.

Fred, H. L. (2005). Hyposkillia: Deficiency of clinical skills. Texas Heart Institute Journal, 32, 255–257.

Glenberg, A. M., & Lehmann, T. S. (1980). Spacing repetitions over 1 week. Memory & Cognition, 8, 528–538.

Heidenreich, C., Lye, P., Simpson, D., & Lourich, M. (2000). The search for effective and efficient ambulatory teaching methods through the literature. Pediatrics, 105, 231–237.

Irby, D. M. (1995). Teaching and learning in ambulatory care settings: A thematic review of the literature. Academic Medicine, 70, 898–931.

Irby, D. M., Aagaard, E., & Teherani, A. (2004). Teaching points identified by preceptors observing one-minute preceptor and traditional preceptor encounters. Academic Medicine, 79, 50–55.

Kerfoot, B., DeWolf, W., Masser, B., Church, P. A., & Federman, D. D. (2007). Spaced education improves the retention of clinical knowledge by medical students: A randomized controlled trial. Medical Education, 41, 23–31.

Kern, D. E., Thomas, P. A., Howard, D. M., & Bass, E. B. (1998). Curriculum development for medical education: A six-step approach. Baltimore, MD: Johns Hopkins University Press.

LCME. http://www.lcme.org/functions.pdf (June 2008).

Mandin, H., Harasym, P., Eagle, C., & Watanabe, M. (1995). Developing a "clinical presentation" curriculum at the University of Calgary. Academic Medicine, 70, 186–193.

Mangione, S. (2001). Cardiac auscultatory skills of physicians-in-training: A comparison of three English-speaking countries. American Journal of Medicine, 110, 210–216.

Mangione, S., & Nieman, L. Z. (1997). Cardiac auscultatory skills of internal medicine and family practice trainees: A comparison of diagnostic proficiency. Journal of the American Medical Association, 278, 717–722.

Mangione, S., & Nieman, L. Z. (1999). Pulmonary auscultatory skills during training in internal medicine and family practice. American Journal of Respiratory and Critical Care Medicine, 159, 1–6.

McLaughlin, K., Bates, J., Konkin, J., Woloschuk, W., Suddards, C. A., & Regehr, G. (2011). A comparison of performance evaluations of students on longitudinal integrated clerkships and rotation-based clerkships. Academic Medicine, 86(10 Suppl), S25–S29.

Michaelsen, L. K., Knight, A. B., & Fink, D. L. (2002). Team-based learning: A transformative use of small groups. Westport, CT: Praeger Publishers.

Miller, N. E., & Dollard, J. C. (1941). Social learning and imitation. New Haven, CT: Yale University Press.

Neher, J. O., Gordon, K. C., Meyer, B., & Stevens, N. (1992). A five-step "microskills" model of clinical teaching. Journal of the American Board of Family Practice, 5, 419–424.

Raman, M., McLaughlin, K., Violato, C., Rostom, A., Allard, J. P., & Coderre, S. (2010). Teaching in small portions dispersed over time enhances long-term knowledge retention. Medical Teacher, 32, 250–255.

Roediger, H. L., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17, 249–255.

Sisti, H. M., Glass, A. L., & Shors, T. J. (2007). Neurogenesis and the spacing effect: Learning over time enhances memory and the survival of new neurons. Learning & Memory, 14, 368–375.

Sutkin, G., Wagner, E., Harris, I., & Schiffer, R. (2008). What makes a good clinical teacher in medicine? A review of the literature. Academic Medicine, 83, 452–466.

Thurmond, V. A. (2001). The point of triangulation. Journal of Nursing Scholarship, 33, 253–258.

Tulving, E., & Thomson, D. M. (1973). Encoding specificity and retrieval processes in episodic memory. Psychological Review, 80, 352–373.
