
Quality in Health Care 1992;1:225-230

Development of indicators for quality assurance in public health medicine

Nicholas H Johnston, K M Venkat Narayan, Danny A Ruta

Department of Public Health Medicine, Grampian Health Board, Aberdeen AB9 3QP
Nicholas H Johnston, research assistant
K M Venkat Narayan, consultant in public health medicine

University of Aberdeen, Foresterhill, Aberdeen
Danny A Ruta, lecturer in public health

Correspondence to: Mr Johnston

Accepted for publication 12 October 1992

Abstract
Objectives - To develop structure, process, and outcome indicators within a quality rating index for audit of public health medicine.
Design - Development of an audit matrix and indicators of quality through a series of group discussions with public health physicians, from which self administered weighted questionnaires were constructed by a modified Delphi technique.
Setting - Five Scottish health boards.
Subjects - Public health physicians in the five health boards.
Main measures - Indicators of quality and a quality rating index for seven selected service categories for each of seven agreed roles of public health medicine: assessment of health and health care needs in information services, input into managerial decision making in health promotion, fostering multisectoral collaboration in environmental health services, health service research and evaluation for child services, lead responsibility for the development and/or running of screening services, and public health medicine training and staff development in communicable disease.
Results - Indicators in the form of questionnaires were developed for each topic. Three types of indicator emerged: "global," "restricted," and "specific." A quality rating index for each topic was developed on the basis of the questionnaire scores. Piloting of indicators showed that they are potentially generalisable; evaluation of the system is under way across all health boards in Scotland.
Conclusion - Measurable indicators of quality for public health medicine can be developed.
(Quality in Health Care 1992;1:225-230)

Introduction
Medical audit has been defined as the "systematic and critical analysis of the quality of medical care."1 Its aim is to contribute to improving the health of the population by specifically examining and improving the quality of doctors' work.2 Quality assurance is a more general term, most commonly applied to the managerial orientation of an organisation or activity, in which routine practice incorporates the assessment and improvement of that practice. In the health care environment such improvement can involve the efforts of managers, doctors, nurses, or other medical and support staff; quality assurance must therefore be the responsibility of all members of staff and be active "throughout the whole, total organisation."3

Public health medicine has been described as "an organised response to the protection and promotion of human health,"4 in which the responsibilities of the specialty centre around disease prevention, health promotion, and health care deployment at population level.5 Long term goals are strategically tackled by multidisciplinary consultation and action, frequently encouraging public participation.

Consequently, quality assurance within public health medicine presents inherent difficulties. The traditional audit model of structure, process, and outcome analysis as outlined by Donabedian6 does not readily conform to the activities and responsibilities of the specialty. With outcomes that can be long term, involving several medical and non-medical agents, measurement of the quality of activity in public health medicine is difficult. The principles of audit and quality assurance remain relevant to the discipline, but in order to assess strategically and then assure quality through structure, process, and outcome indicators, new tools are required.7 An index approach has been acknowledged as often the only means to tackle the kind of problem that requires "measures or techniques that do not yet exist."8 We describe the development of measurable indicators for the audit of public health medicine.

Methods
DEVELOPMENT OF AUDIT MATRIX
We devised a conceptual model to enable the entire sphere of activities of public health medicine to be mapped out in diagrammatic form. This model consisted of a matrix in which the y axis described the roles and responsibilities of public health medicine, as outlined in the Acheson report,5 and the x axis the various services provided by a health authority or board to which the roles and responsibilities of public health medicine may be applied. The matrix was revised through consultation with 40 public health physicians from five Scottish health boards, culminating in a consensus meeting, which led to the formulation of the final content, structure, and wording of the matrix framework (box 1). This stage of the project will be described in detail elsewhere.


Box 1  Audit matrix framework

Columns (services, care group programmes, and major incident planning):
A  Board health policy and management
B  Information services
C  Health promotion
D  Screening services
E  Communicable disease
F  Environmental health services
...
I  Outpatients
J  Elderly
K  Mentally ill
L  Mentally handicapped
M  Physically handicapped
N  Children
O  Maternal health and family planning
P  Major incident planning

Rows (responsibilities of public health medicine):
1  Assessment of health and health care needs
2  Input into managerial decision making
3  Fostering multisectoral collaboration
4  Facilitating the development of quality assurance including clinical audit
5  Health service research and evaluation
6  Lead responsibility for the development and/or running of the service*
7  Public health medicine training and staff development

* Cells in this row may or may not be applicable and can be shaded in.
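For illustration, each cell of the matrix can be thought of as a role-service pair addressed by a column letter and row number, as in the cell codes used in box 2. The minimal sketch below (Python; the representation is illustrative only and lists just columns A-F) shows one way such cells might be indexed.

```python
# Illustrative sketch only: indexing audit matrix cells as (service column, role row) pairs.
# Column letters and row numbers follow box 1; only columns A-F are listed here.

ROLES = {
    1: "Assessment of health and health care needs",
    2: "Input into managerial decision making",
    3: "Fostering multisectoral collaboration",
    4: "Facilitating the development of quality assurance including clinical audit",
    5: "Health service research and evaluation",
    6: "Lead responsibility for the development and/or running of the service",
    7: "Public health medicine training and staff development",
}

SERVICES = {
    "A": "board health policy and management",
    "B": "information services",
    "C": "health promotion",
    "D": "screening services",
    "E": "communicable disease",
    "F": "environmental health services",
}

def describe_cell(code: str) -> str:
    """Expand a cell code such as 'D6' into its role and service description."""
    column, row = code[0], int(code[1:])
    return f"{ROLES[row]} in {SERVICES[column]}"

print(describe_cell("D6"))
# -> Lead responsibility for the development and/or running of the service in screening services
```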

CHOICE OF TOPICS

The public health physicians within the five boards were asked to identify "cells" for which audit indicators could be developed as part of a pilot study. Seven cells were selected to represent each of the seven agreed roles and responsibilities of public health medicine. The service categories represented were diverse, including information services, health promotion, and screening services. Each health board developed one set of indicators, with the exception of Grampian Health Board, which undertook responsibility for three cells (box 2).

DEVELOPMENT OF INDICATORS

After an extensive literature review of audit, quality assurance, public health medicine, and aspects of the service categories for each cell, preliminary meetings were held at each board; they typically comprised four or five local public health physicians and two members of the research team. The meetings were semistructured brainstorming sessions chaired by the researchers, lasting for around one and a half hours, and designed to produce a preliminary set of structure, process, and outcome indicators of quality of practice within public health medicine for one cell of the matrix. The objective and quantifiable criteria of quality of practice arising from these meetings were used to construct self administered questionnaires, each board developing a questionnaire for its own cell. These were returned to each local group of public health physicians four to six weeks later. During a second round of meetings, which lasted some two to two and a half hours, the questionnaires were assessed for face and content validity before reaching their final form.

Box 2  Seven cells identified for development of audit indicators

B1  Forth Valley: assessment of health and health care needs in information services
C2  Borders: input into managerial decision making in health promotion
F3  Highland: fostering multisectoral collaboration in environmental health services
I4  Grampian: facilitating the development of quality assurance including clinical audit for outpatient services
N5  Grampian: health service research and evaluation for child services
D6  Argyll and Clyde: lead responsibility for the development and/or running of screening services
E7  Grampian: public health medicine training and staff development in communicable disease

SCORING SYSTEM
Without a quantifiable measure of quality the usefulness of the indicator method as a means of comparative audit is diminished. In order to fulfil the aim of developing an audit framework that yields quantifiable results we developed a scoring system for the audit questionnaires, in which each question and response was weighted according to its relative importance as a measure of quality. A consensus was required if the scoring system was to avoid individual biases. The scoring system was based on a Delphi method,6 in which individual physicians in each board ascribed scores or weights to each question and possible response in their cell, sending their recommendations to the researchers. Each physician scored the questionnaire independently, allocating a weight of one to the question considered to make the least contribution to the measurement of quality. This question was used as a baseline, and all the other questions were weighted in proportion. The individual weightings were then converted to percentages, to enable direct comparisons. Thus if a total of 34 points had been allocated to the indicators for structure by an individual physician, the scores for those questions and responses relating to structure were recalculated as a percentage of 34, the maximum available. A similar method was used for the process and outcome scores.

Individual physicians' scores were aggregated, and mean scores and ranges were fed back to the contributors at local board level, who had the opportunity to alter the scores to best reflect their importance in terms of quality measurement. Final scores and weightings were ascribed after this second round of consultation, leading to the development of a weighted questionnaire in which the three sections - structure, process, and outcome - each totalled 100 points. Measuring audited practice against the 100 maximum points generates a "quality rating index," ranging from 0 to 100 points, from lowest to highest quality.

"Not appropriate" responses, an option provided for indicator questions that might not be relevant to a given department in certain circumstances, were not scored. Instead, when such a response was selected the score for that section (structure, process, or outcome) was adjusted by removing the score for that question from the denominator of the rating calculation before the percentage rating was calculated.
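As a worked illustration of the weighting step, the sketch below (Python; the two physicians and their raw weights are invented for the example) rescales each physician's weights to percentages of that physician's own total and averages them across physicians, in the manner described above.

```python
# Illustrative sketch of the weighting step described above (all numbers invented).
# Each physician gives a weight of 1 to the least important question and weights the
# rest in proportion; the weights are then rescaled so that each section totals 100.

def to_percentages(raw_weights):
    """Rescale one physician's raw weights so that they sum to 100."""
    total = sum(raw_weights.values())          # e.g. 34 points allocated in all
    return {q: 100 * w / total for q, w in raw_weights.items()}

def aggregate(per_physician):
    """Mean percentage weight per question across physicians (fed back for a second round)."""
    questions = per_physician[0].keys()
    n = len(per_physician)
    return {q: sum(p[q] for p in per_physician) / n for q in questions}

# Two physicians weighting four structure questions (question -> raw weight).
physician_a = {"Q1": 10, "Q2": 8, "Q3": 1, "Q4": 15}   # total 34
physician_b = {"Q1": 12, "Q2": 6, "Q3": 1, "Q4": 11}   # total 30

mean_weights = aggregate([to_percentages(physician_a), to_percentages(physician_b)])
print({q: round(w, 1) for q, w in mean_weights.items()})
# The mean weights again sum to 100, the total for each questionnaire section.
```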

GENERALISABILITY

To assess the extent to which this audit method can provide generalisable quality measures the questionnaires were piloted after this stage. Each indicator questionnaire developed by one health board was assigned randomly to one of the four other boards and used in a departmental audit. A member of the research team was present at each pilot session, and suggested improvements were recorded and used to reformulate indicators as necessary.

Results
CHOICE OF TOPICS
Box 2 shows the seven topics selected for the study. Each cell required considerable time and effort to develop fully, and it became necessary to halt the development of one cell (I4, "facilitating the development of quality assurance, including clinical audit in outpatient services") in order to concentrate on the remaining six.

DEVELOPMENT OF INDICATORS

As indicators emerged for each cell it became apparent that they fell naturally into three conceptual types: (a) global indicators, (b) restricted indicators, and (c) specific indicators.

Global indicators extend beyond one cell, in that they are relevant in other rows or columns of the matrix. With the audit topic "Input into managerial decision making in health promotion" as an example, an indicator such as:

"Is there a designated public health physician with responsibility for public health medicine input?"

represents a global concept. It would be appropriate to determine the existence of a departmental representative in auditing any responsibility of public health medicine in relation to any service category.

Restricted indicators may apply within other cells, but their relevance is limited or "restricted" to one row (responsibility of public health medicine) or one column (service category) of the matrix. They are not universally or globally applicable. Again, with the previous audit topic as an example, an indicator such as:

"Is there a forum within the health board for the development and coordination of health promotion policy, which includes department of public health medicine and health promotion or education representatives?"

is a restricted concept. It does not apply throughout the matrix but would be common to the audit of any public health function in relation to health promotion. Similarly, a developmental or coordinating forum might be required for public health medicine to fulfil its input into the managerial decision making responsibility for service categories other than health promotion.

Specific indicators are relevant to only one cell, applying solely to one responsibility with respect to one service category. With an example from the audit topic "lead responsibility for the development and/or running of screening services," an indicator such as:

"Has a time period been determined within which all screen positive patients are to be followed up?"

refers specifically to the target setting, planning, and monitoring function of the specialty in relation to screening programmes.

A cell, therefore, can be graphically represented as a model comprising three core types of quality indicator: global, restricted, and specific (figure). Although these subdivisions have not been formally included in the full questionnaires, this model has proved useful in introducing physicians to the concept of the project as a whole, in acting as a focus for the brainstorming approach, and in clarifying where and how a given indicator might fit into a complete cell.

Figure  Conceptual cell model (a matrix cell comprising global, restricted, and specific indicators, assessed through structure, process, and outcome)

INDICATOR QUESTIONNAIRE CONCEPT
In developing the indicators several practical problems had to be overcome. Some questions and responses which were suggested had to be extensively adapted or even omitted, for various reasons. Some questions were seen as impractical, being either too specific to certain programmes or so fundamental as not to need inclusion. Others were difficult or impossible to quantify; for example, an indicator examining the "proportion of people contacted in the policy pathway or route" proved difficult to measure, with practice being too variable to allow useful interdepartmental comparison. More generally, any ambiguity or overlap of questions and responses had to be eliminated. The piloting phase contributed significantly to this process of refinement.

In its final form each questionnaire represents a protocol for the audit of a specific part of public health medicine, corresponding to the appropriate cell of the matrix. These protocols aim at facilitating the evaluation of practice in the context of a quality rated questionnaire from which a score may be derived and quality measured. In completing the questionnaire a public health department is provided with an objective and quantified audit of its practice, measured against a standard set by consensus.

The questionnaire for each cell comprises structure, process, and outcome sections, each with several questions designed to assess and measure the quality of practice. The questions, or indicators, are all of the "closed" type, calling for responses which are strictly limited. Within this closed typology there are two forms of question: simple alternative questions and multiple choice questions.

Each response carries a score, such that those responses which reflect the highest quality provide the highest score. Responses therefore may represent a scale of quality from high scoring (good quality) to low scoring (poorer quality) or may be complementary, where the maximum score may be derived only if all responses are selected.
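A minimal sketch of how a completed section might be converted into the 0-100 quality rating index, including the "not appropriate" adjustment described in the methods, is given below (Python; the representation and the example answers are invented, and the maximum scores are taken from the outcome section of box 4).

```python
# Minimal sketch (names and example answers invented) of deriving a section's quality
# rating index.  Single choice questions contribute the score of the chosen response;
# "complementary" questions contribute the sum of all ticked responses; a "not
# appropriate" answer removes the question from the denominator, as in the methods.

NOT_APPROPRIATE = "N/A"

def section_rating(questions):
    """questions: list of dicts with 'max' (weighted maximum) and 'scored' (points
    obtained, or NOT_APPROPRIATE).  Returns the 0-100 rating, or None if nothing applies."""
    denominator = sum(q["max"] for q in questions if q["scored"] != NOT_APPROPRIATE)
    achieved = sum(q["scored"] for q in questions if q["scored"] != NOT_APPROPRIATE)
    return 100 * achieved / denominator if denominator else None

# Outcome section of cell D6, using the maximum scores from box 4 (answers invented).
outcome = [
    {"max": 22, "scored": NOT_APPROPRIATE},  # follow up period not yet passed
    {"max": 20, "scored": 20},               # recommendations implemented
    {"max": 16, "scored": 12},
    {"max": 19, "scored": 19},
    {"max": 14, "scored": 7},
    {"max": 9,  "scored": 9},
]

print(round(section_rating(outcome), 1))   # 85.9 on the 0-100 quality rating index
```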

EXAMPLE OF A QUALITY RATING INDEX QUESTIONNAIRE
Focusing on the indicators derived for one cell, we describe principles and problems that are representative of all the cells. Reproducing the full questionnaire is not practicable; box 3 gives examples of structure, process, and outcome questions selected from cell D6 ("lead responsibility for the development and/or running of screening services").

Box 4 shows abbreviated questions and scores for the entire questionnaire. As with the full questionnaire, these are divided into structure, process, and outcome sections. Although the first seven questions (box 4) relate to the resources available to the department of public health medicine with regard to screening services generally, the remaining questions can be answered meaningfully only with reference to one specific screening programme selected by the department (for example, breast, cervical, or prenatal screening). The scores reflect the relative importance of each question as a means of measuring quality within the departmental screening role.

Discussion
We have shown that it is possible to develop indicators for public health medicine and to derive a quality rating index for practice within this specialty. Aspects of the weighted scoring system are unique in the specialty, and therefore certain elements of the method merit further consideration.

Box 3  Selected indicator questions for one cell (D6)

Structure
2  When writing a screening review, or making a case for the development of a service, which of the following resources are available? (Tick all that apply)  (6)
   Clerical/secretarial support within the Department of Public Health  [ ] 2
   Access to at least two of the following: word processor, database, spreadsheet, graphics package, laser printer  [ ] 2
   Statistical specialist within the Department of Public Health  [ ] 1.5
   Statistical specialist outwith the Department of Public Health  [ ] 0.5

Process
5  By what means is/was patient feedback sought? (Tick one only)  (6)
   Systematically from patients  [ ] 6
   Systematically by proxy through health professionals  [ ] 4
   Unsystematically (for example, posters or leaflets in clinics)  [ ] 3
   Feedback not sought  [ ] 0

Outcome
1  Have 100% of screened positives been followed up within the predetermined time period? (Tick one only)  (22)
   Yes  [ ] 22
   No  [ ] 0
   Not appropriate (for example, time period not yet passed)  [ ] N/A

(Figures in parentheses after each question are the maximum scores available for that question.)


Box 4  Abbreviated questions and scores for entire questionnaire on one cell (D6)

Nature of question  (maximum score possible)

Structure
1  Topics to be considered in a screening review  9
2  Resources available for producing review  6
3  Pathway for the screening policy process  6
4  Public health physician responsible for screening  6
5  Secretarial support for designated physician  4
6  Resources available for literature search  5
7  Accessibility of the identified literature  3
The following relate to a specified screening programme:
8  Target population specification  8
9  Plan for monitoring positives  8
10  Plan for monitoring interval cases  7
11  Plan for monitoring non-responders  7
12  Plan for contacting target group  7
13  Screening targets set  6
14  Plan for action on unmet targets  7
15  Time frame for follow up of positives  6
16  "Acceptable" level of interval cases set  5
Total  100

Process
The following relate to the screening programme specified:
1  Are/were screening targets monitored?  8
2  Are/were screening targets met?  8
3  Is/was the incidence of interval cases monitored?  7
4  Methods employed in contacting target population  7
5  Means of seeking screened patient feedback  6
6  Is/was mortality monitored?  6
7  Is/was morbidity monitored?  6
8  Is/was incidence monitored (prenatal only)?  6
9  Were screening programme recommendations made?  5
10  Screening experience shared between boards?  5
11  Screening programme copes with throughput?  5
12  Can screening programme be readily modified?  5
13  Were adverse effects of screening considered?  4
14  Literature reviewed before recommendations?  4
15  Time deadline for recommendations met?  4
16  Did recommendations follow policy pathway?  4
17  Were economic appraisals undertaken in review?  4
18  Was statistician consulted in review?  3
19  Quality of presentation of recommendations?  3
Total  100

Outcome
The following relate to the screening programme specified:
1  100% of positives followed up in time frame?  22
2  Public health recommendations implemented?  20
3  Percentage of target population contacted  16
4  Percentage of eligible target population contacted  19
5  Public health recommendations accepted by board?  14
6  Interval cases below "acceptable" level?  9
Total  100

SCORING

The Delphi approach used in ascribing scores to the questions and answers in the indicator sets allowed consensus without confrontation, ensuring that every physician had equal opportunity to make a contribution.

The aim of scoring the questions and responses was to find a system which, through the audit method described, could improve public health practice. By using the indicator questionnaire in a departmental audit it is possible to arrive at an index figure for quality of public health medicine between 0 and 100 for the structure, process, and outcome indicators - essentially, a percentage measure of practice quality for that specific dimension of public health medicine activity. The derived score can be compared either with the maximum score obtainable or with the scores obtained by other health boards for the same questionnaire. Thus the quality rating index can be used as a first step in adopting the Standing Medical Advisory Committee's recommendation of comparative quality assessment of the specialty across Britain.9

The 0-100% quality rating index represents an attempt to quantify the quality of practice. Although 0 represents the lowest quality and 100 the highest, it is not a ratio scale - that is, an audit score of 80% does not necessarily mean that practice is twice as good as activity with an audit score of 40%. The value of the quality rating index lies in closely examining each indicator and relating its importance or weighting to the overall audited score, identifying areas of strength and weakness. The index should be used to prompt and direct a discussion of strategies on where and how to alter practice. Repeating the audit allows the effects of changes in practice to be measured.
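One possible way of using the weightings diagnostically, sketched below with an invented helper and invented data, is to rank questions by the points lost against their weighted maximum, so that the audit discussion focuses first on the heaviest weighted shortfalls rather than on the headline percentage alone.

```python
# Sketch (invented helper and data) of reading the weighted scores diagnostically:
# rank questions by points lost against their maximum, ignoring "not appropriate" answers.

def shortfalls(questions):
    """Return (question label, points lost) pairs, largest shortfall first."""
    losses = [
        (q["label"], q["max"] - q["scored"])
        for q in questions
        if q["scored"] != "N/A"
    ]
    return sorted(losses, key=lambda item: item[1], reverse=True)

process = [
    {"label": "Screening targets monitored", "max": 8, "scored": 8},
    {"label": "Interval cases monitored",    "max": 7, "scored": 0},
    {"label": "Patient feedback sought",     "max": 6, "scored": 3},
]

for label, lost in shortfalls(process):
    print(f"{label}: {lost} points below the consensus maximum")
```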

GENERALISABILITY
To ensure that the audit questionnaires produced were valid, generalisable, and quantifiable, pilot testing was necessary. During the pilot phase of the study each indicator questionnaire was used by a department other than that which had developed the questionnaire, as a basis for audit of the appropriate area of public health medicine activity. These piloting meetings were attended by the researchers to observe the audit in practice, assess the questionnaires and instructions, and refine them as appropriate.

A more formal evaluation exercise is currently being undertaken on behalf of the Faculty of Public Health Medicine by the Scottish Affairs Committee. This exercise aims at assessing further the reaction to this approach to audit across all the Scottish health boards and at acting as a clearing house and dissemination vehicle for existing and prospective work in audit questionnaires.

QUALITY IN PUBLIC HEALTH MEDICINE
The issue of quality in public health medicine presents unique problems, requiring a radical appraisal of the tools used to measure and improve quality in the diverse parts of the specialty's activity. The matrix developed provides an effective, flexible, and comprehensive framework through which quality can be assessed and improved strategically.

Audit indicators should be generalisable, quantifiable, and routinely measurable.9 The cell method, in which specific areas of activity can be pinpointed within the matrix and then defined and critically examined, enables participants to develop for themselves indicators of good quality medical care. By drawing on the contribution of as many physicians as possible the process of developing quality indicators engenders a true sense of ownership and acceptance.


This is especially important when experience clearly shows that audit or quality assurance activity will work only when it is voluntary.10 11

A MODEL FOR QUALITY ASSURANCE

This project provides a comprehensive framework for the quantifiable evaluation of quality. It does not seek to prescribe minimum standards for the practice of public health medicine but to encourage appraisal of current practice, identifying areas where change can be implemented. Subsequent quality ratings with the same indicators provide a means of measuring the effects of changes on the quality of practice.

The matrix and indicator approach to quality assurance allows physicians to complete the "audit cycle" of setting standards, observing practice and comparing with standards, and implementing change. The quality of public health medicine index can identify areas of relative strength and weakness, with clear implications for redirecting effort and resources, thereby improving health care. The principles employed in this approach to quality assurance offer a flexible and systematic method of audit planning and evaluation for public health medicine.

The continuing study aims at evaluating the implementation of these techniques across Scotland and exploring the extent to which this approach to audit may be adapted and applied to other areas of health care.

We thank all those public health physicians who took part in the initial survey and in developing the matrix, cells, and indicators, and the Grampian Public Health Medicine Audit Committee for its continued support and advice. This project is funded by the Clinical Resource and Audit Group. Full sets of the indicators are available from DAR.

1 Home and Health Department, Scottish Office. Working for patients. Medical audit - Scottish working paper 2. Edinburgh: HMSO, 1989.
2 Faculty of Public Health Medicine. Report of a working group on audit of public health medicine. London: FPHM, 1989.
3 Koch H. Obstacles to total quality in health care. International Journal of Health Care Quality Assurance 1991;4:30-1.
4 University of Edinburgh Research Unit in Health and Behavioural Change. Changing the public health. Chichester: Wiley, 1989.
5 Committee of Inquiry into the Future Development of the Public Health Function. Public health in England. London: HMSO, 1988. (Cmnd 2819.)
6 Donabedian A. Explorations in quality assessment and monitoring. Vols I, II. Ann Arbor, Michigan: Health Administration Press, 1980, 1982.
7 Nuffield Institute for Health Services Studies, University of Leeds. Audit guidelines in public health medicine: an introduction. Mersey Regional Health Authority: Audit in Public Health Medicine Project, 1992.
8 Oppenheim AN. Questionnaire design and attitude measurement. London: Heinemann, 1966.
9 Standing Medical Advisory Committee. The quality of medical care. London: HMSO, 1990.
10 Robson M. Quality circles in action. Aldershot: Gower, 1984.
11 Shaw C. Medical audit - a hospital handbook. London: King's Fund Centre, 1989.
