Art, craft or science?

Regehr G. It’s NOT rocket science: rethinking our metaphors for research in health professions education. Med Educ 2010; 44: 31–39.

It seemed appropriate to feature this article from Medical Education in The Clinical Teacher given that it touches on some of the factors that differentiate the two journals. It also addresses an issue that resonates internationally: the ‘Cinderella’ status of medical education research.

Clinical teachers tend to be clinicians first and foremost, and as such are accustomed to their practice being underpinned by the best available medical evidence. When the time comes to do a bit of teaching, however, some clinicians are dismayed to find that the educational approaches they are expected to follow are not based on published screeds of double-blinded intervention trials. But does this make their teaching methods any less valid?

Dr Glenn Regehr from the University of Toronto in Canada acknowledges that those engaged in health professional education (such as clinical teachers) tend to view the physical sciences as the ideal model for research, and many dismiss educational research as failing to reach a scientific standard of proof. Consider medical school as a pharmaceutical intervention: surely any drug that costs billions of dollars worldwide to deliver, and has such a significant impact on young people’s lives, would need to demonstrate indisputable beneficial outcomes over the long term? How many medical courses would survive a pharmaceutical-standard cost–benefit analysis?

Regehr identifies this sort of thinking as indicative of the clinician’s desire for the ‘imperative of proof’ coupled with the ‘imperative of simplicity’. We’re not that comfortable with concepts we can’t disassemble into their component facts to see exactly how they work. I recall being sent to a management course years ago where the old chestnut about frogs and bicycles was trotted out: education is more like a frog than a bicycle. You can pull a bike apart and put it back together again so that it works fine. Try that with a frog.

The article explains how requiring proof of efficacy as the goal of educational research limits our understanding of what’s actually going on. This is compared with subjecting our teaching to summative assessment: it either passes or fails, never mind the depth of understanding. Regehr deplores our repeated attempts to reduce the complexities of medical education – dependent as they are on wildly chaotic variables in teachers, learners, patients and contexts – to a series of simple and reproducible facts. He draws parallels with early physicists (including such worthies as Newton and Einstein) attempting to explain the universe in a few elegantly simple rules, or even a single equation. Then quantum physics came along and everything became chaotically messy again.

Regehr really hits the mark in the last few pages, when he gives us permission to stop worrying about the answer and concentrate more on understanding; this is an approach we’ve been trying to instil in our students through problem-based learning for years, so it seems fair to take some of our own medicine. He wants us to shift from the imperative of proof to an imperative of understanding, and to read the medical education research literature not to find scientifically proven models to copy but to better understand how something might work in our own context. He’s right: research in health professional education isn’t rocket science. It’s much more complex and organic than that.

© Blackwell Publishing Ltd 2010. THE CLINICAL TEACHER 2010; 7: 69–71