
Radiography (2000) 6, 151–159

doi:10.1053/radi.2000.0255, available online at http://www.idealibrary.com on

EDUCATION

Auditing the clinical placement experience

Richard Price, MSc, FCR

Nicola Hopwood, MSc, PgCert, HDCR

and Vivian Pearce, TDCR, DMU

Department of Radiography, Faculty of Health & Human Sciences, University of Hertfordshire, College Lane, Hatfield, Herts, AL10 9AB, U.K. (Received 9 September 1999; revised 23 March 2000; accepted 14 April 2000)

Purpose: An audit process was developed to enable the evaluation of clinical education in undergraduate radiography. The audit tools were designed to evaluate the delivery of clinical education against identified standards and criteria that evolved from a framework of generic quality measures. This paper will report the findings of the audit process in respect of the perceived performance of clinical education support staff as determined by first and second year students. Additionally it will confirm the effectiveness of the audit process in monitoring and developing standards within radiography education.
Methods: A quality process based on the adult learning model was introduced in 1992. The initial informal quality process was formalized in 1996 as a retrospective organizational audit. Tools were developed to:

i. evaluate and monitor students’ experience and satisfaction with the clinical education component of the course;

ii. evaluate and monitor the clinical staff’s satisfaction with the clinical education component of the course;

iii. update the information held about clinical sites.

Results: All sites have the facilities necessary to ensure appropriate clinical education and training for students. In evaluating the support for students within the clinical education situation, the performance of the key groups of staff was shown to be influential and interrelated. The culture of the clinical department and the effectiveness of the clinical lecturer within the clinical department were pivotal in the students’ perception of quality. Achievement of standards varied within and between clinical sites, demonstrating that the audit tools were able to discriminate objectively between acceptable and unacceptable standards and performances in the clinical education programme.
Conclusions: The involvement of as wide a range of staff as possible in the monitoring process has confirmed that the audit of clinical placements is an effective tool that can be used successfully in radiographic education. © 2000 The College of Radiographers

Key words: education; quality; radiography; standards; students.

Introduction

Radiography education at the University of Hertfordshire dates from 1991, following the award of a contract by the former North West Thames Regional Health Authority (NWTRHA). It was envisaged that the clinical practicum of the scheme would ensure the students’ development of independent learning skills, which are the fundamental elements in producing critical, analytical thinkers [1]. As part of the NWTRHA Working Paper 10 strategy, the contract specified that:

‘The provider will be responsible for organising all aspects of clinical placements and shall ensure that the standards and supervision of clinical placements, as a minimum, will conform to the national standards laid down by approved bodies.’

This was seen as an important clause that recognized the clinical education component as an integral element within the radiography scheme (diagnostic and therapeutic routes). Students, once accepted at the university, have a right to a sound clinical education and training, which is the pivotal element in preparing them for practice as proficient practitioners upon qualification. The introduction of a degree scheme ensured that radiography was placed on a more solid academic base. However, as with nursing [2], the need for high quality clinical placements has not diminished and can be predicted to increase. Previous nursing research emphasizes the important role that clinical staff play in facilitating student learning and the significance of the social context of learning [3–6]. The extent to which clinical placements are used in radiography education also requires clinical staff to contribute to the students’ learning experience.

Monitoring the quality of clinical education was considered essential. The university audits the student experience of the academic environment, and it is reasonable to expect that a similar exercise should take place in the clinical environment.

This paper will report the findings of the audit process in respect of the perceived performance of clinical education support staff as determined by first and second year students. In addition, the paper will demonstrate that the audit tools were able to discriminate objectively between acceptable and unacceptable standards of performance in the clinical education programme.

Audit and quality

Audits have been shown to create conditions for improved learning opportunities and professional development for students and qualified staff [7, 8]. The experiences of other professions in the health care environment recommend that the audit should take place at regular intervals and that this should be a joint exercise between the educational establishment and the service [9]. The decision was taken to carry out a retrospective organizational quality audit based on a framework of quality measures. These measures were developed to consider, holistically, the organization, management and structure of the clinical education experience of the students.

The essential structure of a quality assessment requires goals to be stated, standards to be identified and criteria to be indicated [10]. An organizational quality audit (QA) is an examination of an organization’s arrangements to control and assure the quality of its products or services. Audits use standards against which elements of a service’s organization, systems and performance can be judged. The standards are based upon quality criteria and a framework which highlights areas of an organization believed to be essential to its ability to consistently provide a quality service [11]. There are many recognized definitions of a quality audit. The authors identified with that used in Making Medical Audit Effective [12], where the purpose of quality audit was defined as:

‘a systematic quantified comparison against explicit standards of current clinical education to improve the quality of clinical training for radiography students.’

The stated goals of our audit were to:

• enhance the quality of education and training;
• enable analysis of the organizational aspects supporting the clinical education of students during placements;
• ensure systems exist to enable clinical departments to meet the agreed specifications for clinical education;
• check existing quality systems against an agreed standard, thus highlighting weak points and then correcting them;
• reassure students, lecturers and clinical staff that the best quality had been achieved;
• ensure that existing procedures were being followed and that training was provided for personnel where required;
• comply with the quality standards of the education purchaser, which had demanded a placement audit.

Although all the above points were covered in the audit, this paper focuses on one aspect: the analysis of the organizational aspects supporting clinical education during placements.

The organization of clinical education

Clinical education is organized by clinical lecturers but relies on the commitment of a number of clinical and university-based staff for its success. The scheme also depends on an extensive communications network in order to support the students. The personnel involved in clinical education include:


• link tutor: a university-based lecturer whose role is to maintain and enhance the links between the university and a clinical department used for the education of student radiographers, and to provide support to students, clinical lecturers and departmental staff as required;
• clinical lecturer: a university-employed lecturer who is based predominantly at clinical sites and who is responsible for the organization and delivery of clinical education;
• clinical co-ordinator: a member of the clinical department’s staff who acts as a first point of reference for the students in the absence of the clinical lecturer.

In addition, an overview is maintained by the:

• scheme tutor: a university-based lecturer who is responsible for the overall performance of the scheme.

Figure 1 represents the communication links between individuals and the two establishments.

Development of the tools

In setting up the degree scheme, a memorandum of co-operation between each clinical site and the university was agreed. The memorandum set out the responsibilities of each placement and the university with respect to clinical education and supporting features. This, whilst enabling the structural factors of the clinical education to be monitored, did not provide data on the influence of the department or the social context of the learning, and a working party was established to devise a tool for collecting this information. It drew on a wide range of experience of radiography education and the research skills then available. This included staff from the department (diagnostic and therapeutic) plus a senior colleague in physiotherapy who joined the group to give a related but outsider’s viewpoint.

The introduction of the quality monitoring process was based on the adult learning model [13]. This learning model is essentially a strategy for growth rather than the enforcement of a brand new system. It recognized that the writing of standards and criteria can be an inhibitor to the acceptance and development of the audit process, and that early development of quality monitoring is feasible ahead of rigorous documentation.

A written survey instrument, in the form of a self-administered questionnaire, was chosen to collect descriptive data on the degree of satisfaction of the participants involved in the clinical education programme. Despite the known limitations of the use of questionnaires [14], this method was selected because of its ease of use and because there is a belief that ‘people are more likely to give complete and truthful information on sensitive topics if a self-administered questionnaire rather than an interview is used’ [15]. It can also give a first approach to the problem with acceptable reliability [14–16], giving suggestions for improvement measures and further investigations. The questionnaires continue to be refined with the cooperation of the clinical departments. Analysis of the data was limited and non-quantitative due to the constraints of time and the facilities available. Based upon the knowledge of previous years’ quality of the current practice/systems, a formal audit was developed in 1996. This adopted a factorial approach, with the criteria and standards of clinical education being developed from a framework of generic quality measures within six identified performance indicators, namely:

• clinical experience;
• clinical assessment;
• academic support;
• student welfare;
• departmental influences;
• living and social aspects.

Questionnaires were designed so that the results could be integrated with the established academic audit. After consultation with the staff at the university’s computer centre, optical mark reading (OMR) was adopted to improve the speed and quality of the data produced from the questionnaires, thus enabling a detailed and quantitative analysis. Also, as integral components of the audit, it was decided to review the first and second year clinical folders (records of formative assessment) and to update the data on the imaging/treatment facilities at each site (departmental profiles).

Figure 1. Support for clinical education. [Diagram showing the communication links between the student, university staff (clinical lecturer, scheme tutor, link tutor) and placement staff (clinical staff, departmental clinical co-ordinator, clinical manager).]

Data analysis

The objectives of the formal audit were identified as:

• monitoring of student satisfaction with the clinical education programme;
• monitoring the satisfaction of clinically based staff with the clinical education programme;
• assessment and evaluation of current sites against the requirements of the memorandum of co-operation between clinical placement sites and the educational establishment;
• a review of the information held about clinical sites, for example, treatment units and imaging facilities.

The questionnaires and students’ clinical folders were collected towards the end of the academic year to include the whole year’s clinical education experience. At the same time, the departmental profiles were updated. The analysis is detailed below.

Questionnaires

Questionnaires were distributed to clinical staff and all first and second year students. The clinical staff group included departmental managers, clinical superintendents, clinical co-ordinators, mentors, supervisors and radiographers. Additional comments were encouraged from all participants, with confidentiality assured. The first section of the questionnaire enabled coding of the site, whether it was a diagnostic or therapeutic department, and the student year group or staff type identification. The remaining sections, one for each of the previously identified performance indicators, used attitude scales (scored 1–5, ranging from strongly agree to strongly disagree) in order to measure the respondent’s opinion of positively and negatively phrased statements. An example is shown in Table 1.
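Mixing positively and negatively phrased statements implies that negatively phrased items must be reverse-scored before analysis, so that a low score consistently indicates a favourable response. A minimal sketch of this step, in which the item keys are hypothetical illustrations rather than the paper’s actual questionnaire items:

```python
# Reverse-score negatively phrased items on the 1-5 attitude scale so that,
# after recoding, 1 always represents the most favourable response.
# The item names below are invented for illustration only.
NEGATIVE_ITEMS = {"workload_restricted_hands_on"}

def recode(item: str, raw: int) -> int:
    """Map a raw 1-5 response so that all items share the same direction."""
    if not 1 <= raw <= 5:
        raise ValueError(f"attitude scale is 1-5, got {raw}")
    return 6 - raw if item in NEGATIVE_ITEMS else raw

print(recode("workload_restricted_hands_on", 5))  # prints 1
print(recode("staff_enjoyed_teaching", 2))        # prints 2
```

Recoding of this kind keeps the subsequent agreement counts and correlations interpretable in a single direction.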

The questionnaires were subjected to descriptive statistics and content analysis [17]. Following discussion among the working party, it was agreed that at least 60% (≥60%) of a population must agree/strongly agree with the positive form of a questionnaire statement for the standard to have been met. This gave more than a simple majority and thus provided the standard for the audit.

To evaluate the complexities and to identify key measures associated with the learning experience and the support provided for students, a Pearson’s correlation matrix was calculated for all students’ responses. The questionnaires used a continuous scale of measurement and, as the number of students with all observations recorded exceeded 30, parametric statistics were considered appropriate [14]. The number of responses with no missing values was 42 and, when these cases were considered, values of r=0.304, P=0.05 were considered significant and r=0.393, P=0.01 very significant. The questionnaires were not rated for reliability or controlled for non-response error, but it was recognized that errors occurred due to inherent biases in the question phrasing and any personal factors affecting the subjects.
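The significance thresholds quoted (r=0.304 at P=0.05 and r=0.393 at P=0.01 for 42 complete cases) follow from the standard relationship between Pearson’s r and Student’s t with n − 2 = 40 degrees of freedom. A sketch of the conversion, taking the two-tailed critical t values from a standard table rather than computing them:

```python
import math

def critical_r(t_crit: float, df: int) -> float:
    """Convert a two-tailed critical t value into the equivalent critical
    Pearson correlation coefficient: r = t / sqrt(t^2 + df)."""
    return t_crit / math.sqrt(t_crit ** 2 + df)

DF = 42 - 2                  # 42 complete cases, so df = n - 2 = 40
T_05, T_01 = 2.021, 2.704    # two-tailed critical t for df = 40, from a t-table

print(round(critical_r(T_05, DF), 3))  # 0.304, significant at P = 0.05
print(round(critical_r(T_01, DF), 3))  # 0.393, very significant at P = 0.01
```

This reproduces the thresholds used in the paper and shows how they would shift if the number of complete cases changed.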

Table 1. Example of questionnaire statement and OMR form attitude scale

Questionnaire statement: ‘The students/I had plenty of opportunity to gain enough experience to complete my record of clinical experience.’

OMR form attitude scale and corresponding level of agreement with the questionnaire statement:
1 Strongly agree; 2 Agree; 3 Uncertain; 4 Disagree; 5 Strongly disagree.


Students’ clinical folders

The students’ clinical folders (first and second years) were analysed using descriptive statistics and content analysis [17] for the record of clinical experience and the clinical competency levels achieved by individual students during the year.

Departmental profile

A proforma was used to collect factual information on individual departments. These profiles enabled clinical lecturers and link tutors to comment further on factors that had or had not influenced the quality of clinical education provided by that department. The data were subjected to content analysis [17].

The variation in data collection methods permitted a degree of concurrent validity in order to improve confidence in the results. The third year students were excluded from this audit as their opinions had been reported within the university’s annual academic audit. This paper reports only the results which relate to the performance of those responsible for the learning experience and also identifies the related issues highlighted by the students within the clinical education programme. The data collected from the students’ clinical folders and the departmental proformas are excluded from the results section as these did not relate directly to the performance of those responsible for the learning experience.

Results

The data are reported for the overall performance and the individual performance of the clinical placement sites for each identified population (staff, first and second year students). The Pearson’s correlation matrix was calculated and reported for the students’ data as a single population. Questionnaire returns are detailed in Table 2.

Clinical lecturers

Table 3 shows that the clinical lecturers consistently achieved three of the four standards. Clinical lecturers were shown to have visited their sites on 2 days per week, with students and staff demonstrating overall satisfaction with the support. The level of support provided by the clinical lecturer was shown to directly influence the students’ ability to contact and confer with their link tutors.

The standard which was most frequently not achieved related to the time that the clinical lecturer spent with individual students in the department. Opinion on this issue varied between sites and within sites. Overall, the first year students recorded greater satisfaction than the second year students, whilst the staff were the least content. For the clinical lecturers to have been perceived as spending sufficient time in the department with individual students, the following criteria had to be met:

• the clinical lecturer visited the department on two days per week;
• the students attended tutorials weekly;
• the link tutor visited the sites once per placement;
• the staff made a real effort to understand any difficulties that the students experienced with their clinical practice;
• the clinical objectives were coordinated with the academic programme;
• the students knew what was expected of them;
• the formative assessment was an accurate reflection of their progress;
• the students experienced no problems with undertaking the summative assessments to the agreed timetable.

Clinical staff

The responses shown in Table 4 indicate that opinions varied between clinical sites. No clinical site’s staff achieved the standard that ‘the staff spend sufficient time with individual students in the department’ and, although the first year students were satisfied, the second year students concurred with the staff. The results in Table 4 suggest that the students on some sites did not consider that the staff enjoyed teaching, but the staff were generally perceived as welcoming and understanding. The weekly assignment of mentors was identified as problematic by all groups.

Table 2. Questionnaire returns for 1995–96

Population          Diagnostic   Therapeutic
Staff               111          38
Year 1 students     37           6
Year 2 students     31           5


Pearson’s calculations demonstrated very significant correlations between the staff-oriented statements:

• the amount of time the radiographers spent with me was sufficient;
• I believe the staff enjoyed teaching me;
• I was made to feel welcome by the staff;
• the staff made a real effort to understand any difficulties that I experienced with my clinical practice.

Therefore, if a student agreed with one of these statements, it can be inferred that they will agree with the others. Using the statement ‘I believe that the staff enjoyed teaching me’ as a ‘centre-of-the-net’ measure [10], these statements were shown to be linked directly with the questionnaire statements:

• the clinical objectives were well co-ordinated with the academic programme;
• it is easy to know what is expected of me;
• the postgraduate and departmental facilities were available to me;
• my clinical placement site was reasonably accessible by public transport;
• my residential accommodation was reasonably accessible by public transport;
• the busy workload did not restrict my ‘hands on’ experience;
• I feel the formative assessment was an accurate reflection of my progress;
• I normally attended tutorials weekly.

The problem with the weekly assignment of mentors was inversely linked to the support that the students received from their clinical lecturer and the students’ ability to contact and confer with their link tutor whilst on clinical placement. When the students experienced problems with mentors, they were significantly less likely to know what was expected of them within the department.

Table 3. Achievement of standards by the clinical lecturers (percentage of clinical site groups achieving the standard)

Questionnaire statement                                        Staff (n=13)   1st Years (n=13)   2nd Years (n=10)
The clinical lecturer spent sufficient time with
individual students/me in the department                       31             69                 40
The clinical lecturer normally visited this site
on 2 days per week                                             92             77                 70
The clinical lecturer had an office suitable for
student counselling                                            69             77                 80
The students/I had sufficient support from
their/my clinical lecturer                                     92             92                 80

Table 4. Achievement of standards by the clinical staff (percentage of clinical site groups achieving the standard)

Questionnaire statement                                        Staff (n=13)   1st Years (n=13)   2nd Years (n=10)
The amount of time the radiographers/I spent
with the students was sufficient                               0              62                 40
I/The students experienced no problems in being
assigned a mentor each week                                    15             24                 20
I believe I/the staff did enjoy teaching                       92             54                 60
I was made to feel welcome by the staff                        not asked      70                 70
The staff made a real effort to understand any
difficulties that I experienced with my clinical practice      not asked      77                 50

Link tutors and clinical coordinators

Table 5 suggests an overall consensus of staff and student opinion that neither the link tutor nor the clinical co-ordinator achieved the standards set by the audit. Second year students, however, were satisfied with the performance of their link tutors. This confirmed the statistically significant finding that, when the link tutor visited the site on a regular basis, the link tutor was perceived as more accessible by the students when they were on clinical placement. The performance of the link tutor was shown to be related directly to the students experiencing sufficient support from ‘my’ clinical lecturer and experiencing problems in being assigned a mentor each week.

The support provided by the clinical co-ordinator reflected the students’ perception of there being sufficient special examinations and/or treatments to gain the appropriate clinical experience and of the individual student being welcomed by the staff.

Discussion

Audits from previous years had suggested that ‘the staffing situation can reflect directly on the quality of support and supervision’. It is known that when individuals work in an atmosphere of trust, they will put themselves at risk: only through risk is there growth, reward, self-confidence and leadership [18]. When staff made time for, and enjoyed, teaching, the students were more satisfied. This confirmed the view of Hart [3] that the positive relationship with departmental staff can be vital. In departments where the workload restricted ‘hands on’ experience, the students’ satisfaction with the overall performance of the radiographers was diminished. There were no problems in departments where the staff were student oriented. It may be more important that the individual discussing the weekly learning contract is familiar with the clinical area and is able to identify learning opportunities, rather than the emphasis being placed on training as a mentor.

Clinical lecturers provide an essential link between the university and the hospital sites, their primary role being to ensure the effective organization and delivery of the clinical education programme. The audits of previous years had suggested that ‘the students feel the need for more support from the clinical lecturers’. The students recorded satisfaction (78%) with the support they received from their clinical lecturers. This was enhanced when the clinical lecturer visited the department on 2 days per week and tutorials were delivered weekly. The regularity of visits to the clinical site was shown to improve the links between the university and the students. The lack of satisfaction reported with the length of time the clinical lecturer spent with individual students in the department, when evaluated alongside the satisfaction with student support, suggests that the clinical lecturers concentrate on group-oriented tasks. These include the delivery of tutorials and the facilitation of the clinical education, rather than spending time in the department with individual students. The reasons for this were not considered within this audit; however, within nursing the value of the clinical lecturers is seen as facilitation rather than instruction [3] and this case may be made in radiography.

Table 5. Achievement of standards by the link tutors and clinical co-ordinators (percentage of clinical site groups achieving the standard)

Questionnaire statement                                        Staff (n=13)   1st Years (n=13)   2nd Years (n=10)
My link tutor visited this site once a placement               not asked      46                 60
I was able to contact and confer when necessary
whilst on clinical placement with my link tutor                not asked      23                 60
Additional support was provided for me by the
clinical co-ordinator                                          46             23                 30

In evaluating the support for students within clinical education, the performance of the key groups of staff was shown to be influential and interrelated. The support from the link tutors and clinical co-ordinators is considered essential in ensuring continuity of student welfare. Their performance was seen as being related to the level of support provided by the clinical lecturer and the staff as a whole, respectively. The students’ perception of the staff was enhanced when the clinical lecturer was present in the department more often.

In setting up the quality audit, the university needed a convenient way of proving that clinical education, as with the academic education, was managed, co-ordinated, productive and accountable. The importance of the audit was in producing a one-time summary that would generate ‘messages’ that could be considered and acted upon. Concurrent validation between the student questionnaires, student folders and the departmental proformas reinforced the perception that the quality of the clinical experience of the student was not related to the opportunities available but connected to the culture of the department and the effectiveness of the clinical lecturer. For the audit to be effective, the tools needed to be able to discriminate objectively and identify acceptable and unacceptable performance.

One issue that needs further reflection is the use and appropriateness of five levels within the questionnaire’s continuous scale (see Table 1). Many respondents opted for the central ‘uncertain’ box and this lack of opinion proved unhelpful when the data were analysed. Nevertheless, despite this, Tables 3–5 illustrate that standards were not achieved uniformly, allowing specific areas to be identified for further evaluation prior to consultative implementation of change. The tools, therefore, were shown to be appropriate and effective.

The success of this quality audit was wholly dependent upon the motivation and participation of the students and the clinical and university-based staff. Problems arose with the formal audit, including the size and timing of the delivery of the questionnaires, the ambiguity of some questions on the questionnaires, and the motivation of some individuals due to a previous lack of feedback. The ideal quality audit would ensure that only significant data were reported, with just enough measurements taken to establish a conclusion. However, for this first quality audit, data overload occurred. It was envisaged that the extra time involved in collecting an excessive amount of data, and the subsequent statistical analysis, would confirm the limitations of descriptive statistics when ‘standards’ were not applied. In addition, it has enabled the identification of ‘centre-of-the-net’ measures and the relative importance of issues within the students’ clinical education experience to be assessed. In future, this knowledge will enable the number of measures and the amount of data collected to be reduced. This should reduce the time element of the process for all participants and improve commitment to the process.

Conclusions

The purpose of any quality audit is to improve an organization within its business environment [11]. For this to occur, the organization must understand how key behaviours relate to different elements of the service. Service quality measures are expensive to develop, requiring an investment of staff time (both clinical and university) to design the tools and to collect and analyse the data. Without adequate data analysis, sufficiently directed corrections and improvements cannot be made; accordingly, time, expense and effort are wasted. An effective audit process must not only set standards and targets on measures that make clear to all what is expected of them, but must also identify trends that may not otherwise be apparent. The audit process undertaken to assess the clinical education programme within the University of Hertfordshire’s undergraduate radiography scheme was designed to ensure unbiased reporting of problems and to develop an understanding of the complexities that exist within it. Change can then be managed more effectively.

The experience of the audit has confirmed it to be an effective tool in monitoring and developing standards within radiography education. Crucially, it raises the awareness of all those involved in clinical education and encourages staff and students to evaluate the quality of the clinical experience. Elements of good and not so good educational practice can be identified and discussed on a group or individual basis, and remedial action can be taken where appropriate. It gives clinical staff and students the opportunity to express their views on the scheme and hence to influence decision making.


Acknowledgements

Our thanks to Alan Hanslow and Ken Ryder at the University of Hertfordshire and colleagues in the original working party: Sue Farmer, Tony Higgs, Janet High, Marina Malaspina and Jacqui Potter.

References

1. McKay J. Developing Curriculum for Independent Learning. Radiol Technol 1994; 66: 113–18.

2. Lauder W. Auditing Clinical Placements. Senior Nursing 1993; 13: 34–6.

3. Hart G. The best and worst students' experience of clinical education. Austr J Adv Nursing 1994; 11: 26–33.

4. Fretwell JE. Creating a ward learning environment: the sister's role. Nursing Times 1983; 79: Occasional Papers 37–9.

5. Smith P. The relationship between the quality of nursing care and the ward as a learning environment: a methodology. J Adv Nursing 1987; 12: 413–20.

6. Orton HD. Discussion and Conclusion. In: Ward learning climate: a study of the role of the ward sister in relation to student nurses' learning on the ward. London: RCN, 1981: 60–7.

7. Spouse J. Performance Indicators. In: An Ethos for Learning. London: RCN, 1990: 38–50.

8. Royal College of Nursing. Placements in Nursing Education. London: RCN, 1988.

9. Hughes J, Humphrey C. The policy context. In: Medical Audit in General Practice: A Guide to the Literature. London: King's Fund Centre, 1990: 1–4.

10. St Leger AS, Schnieden H, Walsworth-Bell JP. An overview of evaluation. In: Evaluating health services' effectiveness: a guide for health professionals, service managers and policy makers. Milton Keynes: Open University Press, 1992: 1–8.

11. Øvretveit J. Auditing to measure determinants of service quality: measuring for the long term. In: Measuring Service Quality: Practical Guidelines. Letchworth: Technical Communications (Publishing) Ltd, 1993: 59–77.

12. Glover S, ed. Preparing for Audit. In: Making Medical Audit Effective. Milton Keynes: Joint Centre for Education in Medicine, 1992: 1–2.

13. Wilson CRM. Introducing QA: an adult learning model. In: Hospital Wide QA. Models for Implementation and Development. London: Saunders & Co., 1987: 27–37.

14. Niewiadomy RM. Data collection methods. In: Foundations of Nursing Research, 2nd edn. London: Prentice-Hall, 1993: 215–44.

15. Bourque LB, Fielder EP. Overview of self-administered questionnaires. In: How to Conduct Self-administered and Mail Surveys. London: Sage Publications Inc., 1995: 1–22.

16. Jayawickramarajh PT. How to evaluate educational programmes in the health professions. Medical Teacher 1992; 14: 159–66.

17. Brophy J. Content analysis: making sense of qualitative data. In: Aspects of Enquiry: Research, Design and Methodology. Thames Polytechnic School of In-Service Education and Training, 8: 1–8: 10.

18. Peters T, Austin N. Bone-deep beliefs. In: A Passion for Excellence. Glasgow: HarperCollins, 1985: 199.