
Saying what you mean


Huntley CD, Salmon P, Fisher PL, Fletcher I, Young B. LUCAS: a theoretically informed instrument to assess clinical communication in OSCEs. Med Educ 2012;46:267–276.

Many clinical teachers who volunteer to act as examiners in OSCEs will have noticed that the tight structure of the marking sheet (where a point is awarded for each carefully demonstrated skill) often gives way to an ill-defined bag of marks at the end of the station for ‘communication and rapport’. Whereas it is a relief to be able to mark a student on a general impression after the stylised constraint of the earlier marking sheet, the looseness of the structure raises questions about the meaning of the score.

This group of British researchers from Liverpool (some of whose work on the creativity of communication has been the subject of a previous Digest)1 report on an instrument that they have developed to better assess communication in their undergraduate medical students. In keeping with their earlier work, however, it is not simply a checklist of discrete skills; rather, it attempts to measure whether the student has crafted their communication in a way that has met the needs of that particular ‘patient’.

After reviewing and dismissing other published examples of similarly intended tools, the authors describe their own 10-item rating scale for use in objective structured clinical examinations (OSCEs). Dubbed LUCAS (Liverpool Undergraduate Communication Assessment Scale), the instrument moves the examiner away from ticking off pre-defined behaviours and allows them to judge whether the student’s communication ‘works’ for the patient. Having a skill is one thing; using it effectively is quite another, and often requires a departure from the script.

If this tool requires examiners to judge whether a student is meeting their patient’s needs, then surely the simulated patient is best placed to make that judgement. The authors have anticipated this response by reasoning that high-stakes examinations require faculty members to take responsibility for scoring, and that the factors being assessed often go beyond satisfying the patient’s expectations. Despite this, the simulated patients’ ratings of the perceived warmth and caring demonstrated by students, and of their ability to instill confidence, were correlated with the examiners’ ratings on LUCAS. Inter-rater reliability was measured through a separate process, and showed a fair to good result.

The 10 items on the instrument include a couple of procedural requirements (introducing oneself and checking identity), six items to identify communication behaviours that might inhibit the doctor–patient relationship, and two more heavily weighted items relating to professional behaviour. Psychometric analysis demonstrated good construct validity, especially for the more creative components of communication.

The authors are refreshingly pragmatic in recognising that an instrument like LUCAS will only be acceptable within a medical school if it doesn’t completely challenge the orthodoxy, and if it can be used efficiently within the tight time frames of the OSCE environment. There is still a need to tick off observable behaviours, but examiners should be spending more of their time evaluating whether the purpose of clinical communication has been achieved.

REFERENCE

1. Meaning what you say. The Clinical Teacher 2011;8:68–69. doi: 10.1111/j.1743-498X.2010.00428_2.x

doi: 10.1111/j.1743-498X.2012.0548.x

