
RADIATION ONCOLOGY—ORIGINAL ARTICLE

Multisource feedback for radiation oncologists

Shalini Kavita Vinod1,2,3 and Denise Margaret Lonergan2,3,4

1Liverpool Cancer Therapy Centre, Liverpool Hospital, 2University of NSW, 3University of Western Sydney, and 4Macarthur Cancer Therapy Centre, Campbelltown Hospital, Sydney, New South Wales, Australia

SK Vinod MBBS, MD, FRANZCR; DM Lonergan MBBS, FRANZCR.

Correspondence: A/Professor Shalini Vinod, Liverpool Cancer Therapy Centre, Liverpool Hospital, Locked Bag 7103, Liverpool BC, Sydney, NSW 1871, Australia. Email: [email protected].

Conflict of interest: The authors of this paper have no actual or potential conflicts of interest.

Oral presentation at RANZCR Annual Scientific Meeting, 2012.

Submitted 24 August 2012; accepted 18 December 2012.

doi:10.1111/1754-9485.12037

Abstract

Introduction: Multisource feedback (MSF) is an assessment of performance through evaluation of an individual's competence from multiple perspectives. It is mandated in many specialist training schemes in medicine. The aim of this study was to test the feasibility of implementing MSF for consultant radiation oncologists.

Methods: A validated tool consisting of a self-assessment questionnaire, medical colleague questionnaire, co-worker questionnaire and patient questionnaire was used for MSF. Statements were rated on a 5-point Likert scale, with 1 being a low rating and 5 a high rating. Seven radiation oncologists volunteered to undergo MSF. They each nominated 10 medical colleagues, 10 co-workers and 10 patients to be surveyed. Clinician feedback was provided as an individual report with a mean score and range for each data item.

Results: Two hundred and ten surveys were mailed out, and seven self-assessments were completed. The response rate was 87% for medical colleagues, 89% for co-workers and 79% for patients. The mean feedback scores averaged across the radiation oncologists ranged from 4.4 to 4.9, significantly higher than the self-assessment scores, which ranged from 3.2 to 3.7. MSF identified areas for potential improvement, including communication and collaboration with co-workers and accessibility to, and adequacy of, clinic space for patients. All radiation oncologists found the MSF a positive experience, and five planned to make changes in their practice in response to it.

Conclusions: The high response rate to the surveys has shown that it is feasible to implement MSF for radiation oncologists. This could potentially be used as a method for ongoing revalidation.

Key words: clinical competence; feedback; physician performance; quality assurance; radiation oncology.

Introduction

Continuing Professional Development (CPD) is an essential component of medical practice. It aims to ensure that medical practitioners are clinically competent and maintain professional standards related to their particular discipline. Currently within the Faculty of Radiation Oncology, Royal Australian and New Zealand College of Radiologists (RANZCR), there is a CPD programme based on the Canadian Medical Education Directions for Specialists (CanMEDS) principles.1 This assesses the roles of Medical Expert, Radiation Oncology Professional, Communicator, Collaborator, Manager, Health Advocate and Scholar.2

The activities counted towards demonstration of these skills are clinician led and self-directed. They are based primarily on clinician participation (peer review audit, multidisciplinary team meetings, clinical trials, teaching), attendance (conferences, journal clubs) and self-directed learning. There is no external assessment of clinicians' performance by observers.

Multisource feedback (MSF) seeks to do this by asking for feedback on performance from multiple perspectives within an individual's sphere of influence. It broadly covers all the CanMEDS attributes.3,4 The sources of feedback include patients, co-workers, medical colleagues and referring clinicians, as well as a self-assessment. It also allows evaluation of humanistic attributes such as integrity, compassion, communication, collegiality and professional responsibility, which are difficult to measure.

MSF is a component of many registrar training programmes, including radiation oncology. Its use to provide feedback to specialists is less common. Certain regions, such as the province of Alberta in Canada, have mandated it for all practicing clinicians, to be performed once every 5 years. There have been no specific reports of the use of MSF for radiation oncologists. The aim of this study was to pilot MSF within a radiation oncology department to assess its feasibility and acceptance.

Methods

Seven radiation oncologists (out of 11 employed staff specialists) at Liverpool and Macarthur Cancer Therapy Centres volunteered to participate in the pilot study. Of the four radiation oncologists who did not participate, one had been at the centre for less than a year and was due to go on maternity leave, one was on maternity leave, one had previously participated in MSF twice in his role as a manager and one declined to participate. Each clinician identified 10 patients, 10 co-workers and 10 referring clinicians to be surveyed, in addition to completing a self-assessment survey. Respondents completed the surveys anonymously.

The method of choosing the survey participants was at the discretion of the radiation oncologist. Some chose a representative sample of patients according to tumour site, age and gender, and co-workers in a variety of different roles, while others deliberately sought out patients who had problems during treatment and staff with whom there had been a previous negative interaction. The method of choosing the participants, either by the clinicians themselves or at random, does not appear to significantly affect MSF feedback.5

The survey instrument used was the Physician Achievement Review (PAR), available at http://www.par-program.org/.6 This instrument has been shown to have consistency, reliability and validity,7,8 and is used for mandatory MSF in Alberta, Canada. It was developed by the Physician Performance Advisory Committee of the College of Physicians and Surgeons of Alberta with input from focus groups including patients, clinicians and health-care workers.3,7 The statements on these questionnaires are rated on a 5-point Likert scale. Patients rate each statement from 1 meaning 'strongly disagree' (or a low rating) to 5 meaning 'strongly agree' (or a high rating). For the other questionnaires, including the self-assessment, 1 means 'among the worst' and 5 means 'among the best' compared with other medical specialists.

The surveys were mailed from the department with a self-addressed stamped return envelope, to be returned to an independent research unit within the hospital for data collation and analysis. A second survey was sent out if a response was not received within 3 weeks of the initial mail-out.

Each participant received an individualised feedback report giving the mean and range of responses to each question. Responses scoring less than 3 were highlighted for reflection. For patients, a score of 1 or 2 meant the patient 'strongly disagreed' or 'disagreed' with the statement, respectively. For medical colleagues and co-workers, a score of 1 meant 'among the worst' and 2 'in the bottom half' compared with other medical specialists the participant knew. All responses were deidentified and combined for an overall report, the results of which are discussed here. The radiation oncologists participating in the study were surveyed after receiving their individual reports to evaluate acceptance of the MSF process.
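As a rough illustration of the collation step (a minimal sketch only; the actual reports were produced by an independent research unit, and the item texts, scores and data structure below are hypothetical), per-item means and ranges can be computed and low scores flagged as follows:

    from statistics import mean

    # Hypothetical input: for one clinician, each survey item maps to the
    # list of Likert scores (1-5) returned by respondents.
    responses = {
        "The doctor talks to me about preventative care.": [5, 4, 2, 5, 5],
        "I can reach the office by phone during the day.": [4, 5, 5, 3, 4],
    }

    FLAG_BELOW = 3  # responses scoring less than 3 were highlighted for reflection

    for item, scores in responses.items():
        low, high = min(scores), max(scores)
        note = "  <- flagged for reflection" if low < FLAG_BELOW else ""
        print(item)
        print(f"  mean {mean(scores):.1f}, range {low}-{high}{note}")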

Analyses were performed using SPSS v18.9 Comparison of mean scores was performed using Student's t-test. This study was approved by the institutional human research and ethics committee.
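Readers without SPSS could reproduce this kind of mean-score comparison in Python with SciPy (an assumed dependency; the scores below are illustrative and are not the study data):

    from scipy import stats

    # Per-clinician mean scores for one shared questionnaire item:
    # observer ratings versus the clinicians' own self-assessments.
    observer_means = [4.4, 4.6, 4.5, 4.7, 4.9, 4.4, 4.5]
    self_ratings = [3.4, 3.2, 3.7, 3.5, 3.6, 3.3, 3.4]

    # Two-sample Student's t-test (equal variances), mirroring the
    # comparison of mean scores described above.
    t_stat, p_value = stats.ttest_ind(observer_means, self_ratings)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")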

Results

Of the 210 surveys sent, the response rate was 79% (55/70) for patients, 89% (62/70) for co-workers and 87% (61/70) for referring clinicians. All seven radiation oncologists completed a self-assessment.
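The quoted percentages follow directly from the counts above; as a quick worked check (illustrative Python):

    sent = {"patients": 70, "co-workers": 70, "referring clinicians": 70}
    returned = {"patients": 55, "co-workers": 62, "referring clinicians": 61}

    # Response rate = surveys returned / surveys sent, per respondent group.
    for group, n_sent in sent.items():
        print(f"{group}: {returned[group]}/{n_sent} = {returned[group] / n_sent:.0%}")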

Ninety-eight per cent of patient surveys were completed by the patients themselves and 2% by a carer. Fifty-seven per cent of patients were male, 43% were aged 46–65 years and 45% were older than 65 years. There were six major domains assessed by patients: patient education, patient interaction, patient information, patient satisfaction, office staff and physical office. The mean scores were all 4.5 or higher (Table 1). Patient satisfaction was particularly high, with a mean score of 4.9. Although the average of the ranges for all radiation oncologists was 3 or higher, there were seven items where a minimum score of 2 was recorded by at least one patient (Table 2).


Table 1. Feedback from patient questionnaire†

Attribute              No. of questions   Mean score‡   SD    Range‡
Patient education      4                  4.6           0.5   3–5
Patient interaction    12                 4.7           0.4   3.6–5
Patient information    10                 4.5           0.4   3.8–5
Patient satisfaction   4                  4.9           0.3   3.8–5
Office staff           5                  4.7           0.5   3.8–5
Physical office        5                  4.5           0.5   3–5

†Statements were scored on a 5-point Likert scale from 1 (low score) to 5 (high score). ‡Average for all radiation oncologists. SD, standard deviation.


The co-workers comprised radiation therapists (31%), administration staff (22%), nursing (13%), registrars (13%), physicists (8%) and other (13%). Eighty-nine per cent of co-workers felt they knew the radiation oncologist well or very well, 9% somewhat and 2% not well. There were three domains assessed: co-worker communication, patient interaction and co-worker collegiality. Mean scores were between 4.4 and 4.6 (Table 3). The average range was below 3 for co-worker communication and 3 and above for the other domains. There were seven items where a minimum score of 2 was recorded by at least one co-worker (Table 4).

The medical colleagues surveyed comprised non-oncology clinicians (54%), radiation oncologists (23%), medical oncologists (15%) and haematologists (8%). Ninety-four per cent knew the radiation oncologist being assessed well or very well, 4% somewhat and 2% not well. There were four domains assessed: clinical competency, coordination of care and resources, professional development and interaction with others. The mean scores averaged among the radiation oncologists were 4.4 and higher, and the ranges were 3 and higher. There were no individual items that scored less than 3 (Table 5).


Table 2. Statements from patient questionnaire with a minimum score <3

Statement                                                                   Mean score†   SD    Range†
The doctor talks to me about preventative care.                             4.6           0.7   2–5
The doctor asks details about my personal life, when appropriate.           4.5           0.8   2–5
I can reach the office by phone during the day.                             4.6           0.6   2–5
I receive an appropriate explanation if my appointment is delayed.          4.4           0.7   2–5
The office is easy to get into (e.g., wheelchair accessible, parking).      4.4           0.8   2–5
The office has appropriate waiting areas.                                   4.5           0.7   2–5
The examining rooms are appropriately sized and have adequate equipment.    4.5           0.7   2–5

†Average for all radiation oncologists. SD, standard deviation.

Table 3. Feedback from co-worker questionnaire†

Attribute                 No. of questions   Mean score‡   SD    Range‡
Co-worker communication   3                  4.4           0.6   2.5–5
Patient interaction       11                 4.6           0.5   3.3–5
Co-worker collegiality    8                  4.6           0.5   3–5

†Statements were scored on a 5-point Likert scale from 1 (low score) to 5 (high score). ‡Average for all radiation oncologists. SD, standard deviation.

Table 4. Statements from co-worker questionnaire with a minimum score <3

Statement                                                                                  Mean score†   SD    Range†
The doctor is accessible for appropriate communication about patients.                     4.4           0.8   2–5
The doctor accepts responsibility for patient care.                                        4.4           0.8   2–5
The doctor is reasonably accessible to patients.                                           4.4           0.8   2–5
The doctor responds appropriately in emergency situations.                                 4.5           0.7   2–5
The doctor is able to verbally communicate effectively with other health professionals.    4.7           0.6   2–5
The doctor respects the professional knowledge and skills of co-workers.                   4.5           0.8   2–5
The doctor collaborates well with co-workers.                                              4.6           0.6   2–5

†Average for all radiation oncologists. SD, standard deviation.

Table 5. Feedback from medical colleague questionnaire†

Attribute                            No. of questions   Mean score‡   SD    Range‡
Clinical competency                  6                  4.5           0.4   3.5–5
Coordination of care and resources   16                 4.4           0.4   3.5–5
Professional development             4                  4.5           0.5   3.3–5
Interaction with others              12                 4.5           0.4   3.6–5

†Statements were scored on a 5-point Likert scale from 1 (low score) to 5 (high score). ‡Average for all radiation oncologists. SD, standard deviation.


The mean self-assessment scores were lower than those of the survey participants (Table 6). The self-assessment questionnaire contained nine questions in common with the patient questionnaire, 14 questions in common with the co-worker questionnaire and 37 questions in common with the medical colleague questionnaire. For each of these items, the mean of the self-assessment scores was lower than the mean of the observers' scores, reaching statistical significance for all items except one. The statement that did not reach statistical significance concerned involvement in professional development, with similar ratings given by medical colleagues and by the radiation oncologists themselves.

Following receipt of the individualised feedback report, five of the radiation oncologists planned to change aspects of their practice. This included being more accessible to co-workers, discussing preventative care and psychosocial support with patients and improving timeliness of letters back to referring doctors. All radiation oncologists felt this was a positive experience and supported repeating MSF in 2–3 years.

The process took approximately 4 months from the initial mail-out of surveys to receipt of the individualised feedback reports. The cost of survey collation, data entry and analysis, and provision of the feedback reports was AUD900 for the whole group. Costs of the survey mail-out were separate.

Discussion

This study has shown that MSF in the setting of radiation oncology is feasible, with excellent survey response rates, and is acceptable to clinicians. It provides information not readily sought in other settings, and the anonymity allows greater freedom for respondents. It can identify areas in which the clinician is performing well as well as areas for improvement. Although the mean scores averaged for all radiation oncologists were high in this pilot study, there were still aspects of performance where at least one respondent had given a low score. This provides a focus for reflection and for considering changes in practice.

There are limitations to MSF. The MSF questionnaires in common use are all in English, so non-English-speaking patients are automatically excluded. This is a major limitation in South West Sydney, the region serviced by Liverpool and Macarthur Cancer Therapy Centres, where approximately half of the resident population was born overseas and a quarter speaks poor or no English.10 Feedback from these patients is important if we are to improve our care of patients from culturally and linguistically diverse backgrounds. One possible solution may be the use of English-speaking carers to aid these patients in filling in the surveys, although this is not ideal. Translation of surveys is time consuming and costly and would require further validation.

Some of the feedback relates to accessibility, office staff and the physical environment, areas that may be out of the immediate control of a clinician in a public health setting. Nevertheless, if a consistent trend is identified, this information can be fed back to hospital management for action.

The instrument chosen for MSF is a general one, not tailored for radiation oncologists. There are tailored PAR instruments for specialties such as anaesthetists,11 pathologists,12 psychiatrists,13 surgeons14 and occupational therapists.15 No specific instruments exist for oncologists. Lockyer and Violato evaluated the use of the general PAR instrument for internal medicine physicians, paediatricians and psychiatrists, and found that the instrument is reliable and valid for each specialty and is able to discriminate between specialties.8 This suggests that the use of the general PAR survey is appropriate for radiation oncologists.

The method of choosing participants to provide feedback is a vexed one. Self-selection of participants may be a potential source of bias. In a study of PAR in rural family physicians, a positive correlation was found between familiarity ratings (between the participant and the clinician being rated) and the mean rating scores.16 Having said that, some familiarity is necessary to rate performance; otherwise many questions will not be assessable. Ramsey et al. conducted an MSF study in which one group of participants was chosen by the physician being rated and another group was chosen at random from a list provided by the physician's supervisor.5 They found no significant difference in the ratings given by the two groups and concluded that peer ratings were not substantially biased by the method of participant selection. However, these results were limited to surveys completed by physicians and nurses and did not include patients. A minimum of 11 participants was suggested for reliable assessment by peers and co-workers. In practical terms, self-selection of participants is the easiest method of conducting MSF and should give reliable feedback, provided that an adequate number of surveys are returned. In the setting of radiation oncology, clinicians are also able to identify patients who would be inappropriate to survey based on their performance status, life expectancy and English-language skills.

An interesting finding of this study is that the self-assessment scores were lower than the observer ratings for all domains. A study of PAR for internal medicine physicians, psychiatrists and paediatricians has shown that clinicians are inaccurate in assessing their own performance.17

Table 6. Self-assessment questionnaire†

Attribute                            No. of questions   Mean score‡   SD    Range‡
Clinical competency                  6                  3.4           0.5   3–4
Coordination of care and resources   16                 3.2           0.6   2.5–4
Professional development             4                  3.7           0.7   2.8–4.3
Interaction with others              11                 3.6           0.7   2.6–4.5

†Statements were scored on a 5-point Likert scale from 1 (low score) to 5 (high score). ‡Average for all radiation oncologists. SD, standard deviation.


They found that clinicians who were in the lowest and highest quartiles as assessed by their colleagues tended to rate themselves 30–40 percentile ranks higher and lower, respectively. Our findings are in keeping with this, as all the radiation oncologists involved were rated highly by medical colleagues but rated themselves lower. However, this represents a select group of clinicians who volunteered to undergo MSF, and the finding would not necessarily apply to a larger sample of radiation oncologists. A longitudinal study of general practitioners (GPs) who underwent PAR twice, 5 years apart, showed that self-assessment ratings improved significantly at the second time point but were not related to the initial feedback provided.18 The authors concluded that clinician self-assessment is driven by fairly stable perceptions that clinicians have of themselves and changes little over time.

Although it is valuable to obtain MSF, the more important question is whether it results in behaviour change in response to any issues identified. Violato et al. evaluated PARs of 250 GPs performed 5 years apart.19 Mean ratings by medical colleagues, co-workers and patients all improved over time, although the improvement only reached statistical significance for the former two. A meta-analysis of MSF has shown that it can lead to improvements over time, although the effect size is small.20 Improvement was more likely when feedback indicated a change was necessary and recipients accepted the need for change, felt change was feasible and reacted positively to feedback.20,21 A recent systematic review of MSF found that while most doctors felt MSF had educational value, the evidence for practice change was conflicting.22

There are challenges to implementing MSF. In order to be completely independent, the data should be collected and analysed by a third party not connected to the clinicians undergoing MSF. This incurred a cost of AUD900 in this study. In Alberta, MSF is administered by an independent research company.18 The cost of administering MSF (CAD200 per clinician in 1999) is covered by the College of Physicians and Surgeons of Alberta, resulting in a slight increase in the annual licensing fee for clinicians.3 Implementation of MSF at a college or institutional level is likely to be cheaper than at a departmental level. We also need to consider the time taken to fill in these surveys. Patients and referring clinicians only filled in one survey each, but some co-workers were surveyed by more than one clinician seeking feedback. The maximum number of survey forms that an individual filled in was three. The high response rate reassures us that this was not too arduous a task.

MSF also requires oversight of the process to ensure that any clinician deficiencies identified can be addressed with the necessary support. In this pilot study, the MSF reports were sent only to the clinician being assessed, and it was up to the individual whether they chose to discuss the findings with their manager. In Alberta, Canada, where the process is run through the College of Physicians and Surgeons, the feedback report is sent to both the clinician and the Physician Performance Committee, which has the power to investigate and remediate any serious clinician deficiencies identified.3 This raises the question of whether MSF is better implemented at the institutional or college level to provide this degree of oversight.

It may be argued that, unlike other CPD activities such as research and education, MSF has no intrinsic value. However, all current RANZCR CPD activities are geared towards measuring clinical competence, and none measures the attributes of communication, collaboration and professionalism, which all form part of the CanMEDS principles. Poor communication in particular has been associated with an increased likelihood of malpractice claims.23 These attributes could be assessed through direct observation; however, this would be time consuming and costly. The alternative is surveys of people who have had some contact with the clinician. In Australia, patient satisfaction surveys can already count towards CPD for RANZCR and the Royal Australasian Colleges of Physicians and Psychiatrists. These would be considered one facet of MSF. RANZCR reviews the performance of all radiation oncology trainees with an MSF tool. The results are collated by the college, and a summary is sent to the trainee and the Director of Training for review, feedback and discussion. If we consider this a useful tool for assessing trainees, then by extrapolation it could also be used for consultant radiation oncologists as a means of assessing ongoing professional competence.

Although this study has shown that MSF is feasible at the departmental level, its feasibility at the college level has not been tested. This could be assessed by recruiting a random sample of radiation oncologists through the college, with an invitation to participate in MSF, and evaluating the participation rate, survey response rate and acceptability of the process. The resources required for data collation, analysis and feedback would also need to be measured to assess the sustainability of such a programme. The impact of MSF on change could only be assessed in a longitudinal study in which participants are followed up to see whether they have implemented practice change in response to MSF. One of the aims of MSF would be to identify underperforming clinicians, or those whose results are significantly below average compared with their peers. However, such clinicians may choose not to take part in MSF; hence, identifying them requires MSF to be a compulsory rather than voluntary recurrent activity. MSF is not a tool to be used in isolation and should be considered alongside all other CPD activities when assessing any clinician. An MSF programme would also need oversight by an appropriate college committee to ensure ongoing quality assurance and to maintain the validity of the programme.


In our department, the radiation oncologists have agreed to undergo MSF every 3 years on a voluntary basis. Individualised reports will be sent to the clinician and discussed at the annual performance review with their manager. Currently, this does not count towards the RANZCR CPD programme, but we hope it will be incorporated in the future. One of the aims of CPD is to 'maintain professional standards with the goal of providing better health care',2 and MSF may be one method to assess this.

Acknowledgements

We acknowledge Dr Grahame Simpson, Dr Lauren Gillett and Ms Samantha Bzishvilli for data collation and analysis. We would like to thank all the radiation oncologists who participated in this study.

References

1. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach 2007; 29: 642–7.

2. RANZCR [homepage on the Internet]. CanMEDS Framework Used to Inform the Radiation Oncology CPD Program. Royal Australian and New Zealand College of Radiologists, Australia, 2011. [Cited 15 July 2012.] Available from URL: http://www.ranzcr.edu.au/cpd/overview/learning-a-development-framework

3. Hall W, Violato C, Lewkonia R et al. Assessment of physician performance in Alberta: the physician achievement review. Can Med Assoc J 1999; 161: 52–7.

4. Ramsey PG, Carline JD, Blank LL, Wenrich MD. Feasibility of hospital-based use of peer ratings to evaluate the performances of practicing physicians. Acad Med 1996; 71: 364–70.

5. Ramsey PG, Wenrich MD, Carline JD, Inui TS, Larson EB, LoGerfo JP. Use of peer ratings to evaluate physician performance. JAMA 1993; 269: 1655–60.

6. PAR [homepage on the Internet]. Pivotal Research Inc.; c2006–2011. Physician Achievement Review, Canada, 2004. [Cited 1 June 2012.] Available from URL: http://www.par-program.org/

7. Violato C, Marini A, Toews J, Lockyer J, Fidler H. Feasibility and psychometric properties of using peers, consulting physicians, co-workers, and patients to assess physicians. Acad Med 1997; 72: S82–4.

8. Lockyer JM, Violato C. An examination of the appropriateness of using a common peer assessment instrument to assess physician skills across specialties. Acad Med 2004; 79: S5–8.

9. SPSS statistical program. Version 18. Chicago, IL: SPSS Inc; 2010.

10. Census data [homepage on the Internet]. 2011 Census Community Profiles. Australian Bureau of Statistics, Australia, 2011. [Cited 4 Aug 2012.] Available from URL: http://www.censusdata.abs.gov.au/census_services/getproduct/census/2011/communityprofile/127?opendocument&navpos=230

11. Lockyer JM, Violato C, Fidler H. A multisource feedback program for anesthesiologists. Can J Anaesth 2006; 53: 33–9.

12. Lockyer JM, Violato C, Fidler H, Alakija P. The assessment of pathologists/laboratory medicine physicians through a multisource feedback tool. Arch Pathol Lab Med 2009; 133: 1301–8.

13. Violato C, Lockyer JM, Fidler H. Assessment of psychiatrists in practice through multisource feedback. Can J Psychiatry 2008; 53: 525–33.

14. Violato C, Lockyer J, Fidler H. Multisource feedback: a method of assessing surgical practice. BMJ 2003; 326: 546–8.

15. Violato C, Worsfold L, Polgar JM. Multisource feedback systems for quality improvement in the health professions: assessing occupational therapists in practice. J Contin Educ Health Prof 2009; 29: 111–8.

16. Sargeant JM, Mann KV, Ferrier SN et al. Responses of rural family physicians and their colleague and coworker raters to a multi-source feedback process: a pilot study. Acad Med 2003; 78: S42–4.

17. Violato C, Lockyer J. Self and peer assessment of pediatricians, psychiatrists and medicine specialists: implications for self-directed learning. Adv Health Sci Educ Theory Pract 2006; 11: 235–44.

18. Lockyer JM, Violato C, Fidler HM. What multisource feedback factors influence physician self-assessments? A five-year longitudinal study. Acad Med 2007; 82: S77–80.

19. Violato C, Lockyer JM, Fidler H. Changes in performance: a 5-year longitudinal study of participants in a multi-source feedback programme. Med Educ 2008; 42: 1007–13.

20. Smither JW, London M, Reilly RR. Does performance improve following multisource feedback? A theoretical model. Pers Psychol 2005; 58: 33–66.

21. Sargeant J, Mann K, Sinclair D, van der Vleuten C, Metsemakers J. Understanding the influence of emotions and reflection upon multi-source feedback acceptance and use. Adv Health Sci Educ Theory Pract 2008; 13: 275–88.

22. Miller A, Archer J. Impact of workplace based assessment on doctors' education and performance: a systematic review. BMJ 2010; 341: c5064.

23. Levinson W, Roter DL, Mullooly JP, Dull VT, Frankel RM. Physician-patient communication: the relationship with malpractice claims among primary care physicians and surgeons. JAMA 1997; 277: 553–9.
