Assessment in postgraduate dental education: an evaluation of strengths and weaknesses
Z S Morris,1 A D Bullock,2 C R Belfield,3 S Butterfield2 & J W Frame4
Introduction This paper describes a study designed to
evaluate assessment in postgraduate dental education in
England, identifying strengths and weaknesses and focusing specifically on its relevance, consistency and cost-effectiveness.
Methods A four-phase qualitative method was used: a
mapping of current career paths, assessment policy, and
issues (phase 1); more detailed studies of the practice of
assessment for a range of courses, and the systemic/
management perspective of assessment (i.e. quality
assurance) (phases 2 and 3), and analysis and reporting
(phase 4). Data were analysed from documents, inter-
views, group consultations and observations.
Results and discussion Five key issues may be distilled
from the findings: (i) lack of formal assessment of
general professional training; (ii) trainer variation in
assessment; (iii) the extent to which assessments are
appropriate indicators of later success; (iv) the rela-
tionship between assessment and patient care, and
(v) data to assess the costs of assessment.
Conclusion Current assessment procedures might be
improved if consideration is given to: assessment which
supports an integrated period of general professional
training; training for trainers and inspection procedures
to address variation; more authentic assessments, based
directly on clinical work and grading cases and posts,
and better data on allocation of resources, in particular
clinicians' time given to assessment.
Keywords Cost effectiveness; curriculum; education,
dental, *standards; education, medical, graduate;
educational measurement; Great Britain; professional
competence; reliability and validity.
Medical Education 2001;35:537–543
Introduction
Postgraduate dental education has undergone radical
change in recent years, with profound implications for its
assessment. This is particularly true for specialist training
following the Calman Report.1 The Report of the Chief
Dental Officer (CDO)2 in 1995 proposed that higher
specialist training should be shorter, better structured
('seamless', modular, aims-led), and more flexible,
whilst maintaining high standards which would be set by
the competent authority, the General Dental Council
(GDC). The CDO also accepted the introduction of a
new Certificate of Completion of Specialist Training
(CCST, as in medicine), consistent with European
Union regulations. It was required that all the CDO's
recommendations ('Calmanization') were applied to all
dental specialties by March 1997. The use of a range of
assessment instruments was encouraged, to reflect
changes in the nature of specialist training.1
The CDO's Report2 also made recommendations for
the pre-specialist or general professional training. It
suggested that young dentists should undertake an
initial 2-year period of general professional training in
both primary and secondary care settings, underpin-
ning all career options at the end of it. Although widely
supported,3–5 the proposal has unresolved implications
for assessment.
In short, postgraduate dental education has recently
been subject to considerable critical scrutiny and
modification, involving shortened training courses
structured by predefined standards and modularized
(theoretically) to facilitate flexible entry and exit, and
choice, offering a combination of experiential learning
with formal didactic teaching, and supported by good
assessment which enables trainee development and
safeguards patients.
1 Faculty of Social and Political Sciences, University of Cambridge, Cambridge, UK
2 School of Education, University of Birmingham, Birmingham, UK
3 Teachers College, Columbia University, USA
4 Regional Postgraduate Dental Office, School of Dentistry, University of Birmingham, Birmingham, UK
Correspondence: A D Bullock, School of Education, University of Bir-
mingham, Birmingham B15 2TT, UK
Research papers
© Blackwell Science Ltd MEDICAL EDUCATION 2001;35:537–543
This paper reports on a study which evaluated the
strengths and weaknesses of the existing assessment
systems. The study concentrated on the assessment of
postgraduate dental training across primary and
secondary care, focusing on relevance, consistency, and
cost-effectiveness as factors essential to 'good' assessment. Based on the findings, modifications to the
assessment system were suggested. These are consid-
ered in the Discussion, and are explored in more detail
in the final project report.6
Assessment
By assessment we mean 'measuring progress against
defined criteria'. Such measurement is important
because it allows judgements to be made about the
effectiveness of training, and the monitoring and
maintenance of standards in training. It supports career
structures and progression, and can provide the public
with information.7 In evaluating assessment, atten-
tion was paid to its relevance, consistency and cost-
effectiveness.
Relevance concerns the issue of validity, of which there are three aspects: content and curricular validity, construct validity and predictive validity. Content validity and curricular validity refer to how far the assessments reflect periods of training. Construct validity relates to the extent to which an assessment actually measures or reflects the domains (skills, attributes, and types of knowledge, understanding, or analysis) developed in a period of training. Predictive validity is here defined as the extent to which the assessments are appropriate indicators of future success in the field.
Consistency relates to the extent to which standards
are applied uniformly across settings and time. There
are two specific elements: comparability and reliability.
Comparability relates to how far the standards of dif-
ferent parts of the assessment system are capable of
comparison and involves the systemic issues of quality
control and inspection. Reliability concerns the extent
to which assessments match when they are carried out
by different assessors or at different times and places.
Cost-effectiveness involves the efficient use of
resources, recognizing that resources are scarce and
have alternative uses. It is a relative concept and a
secondary issue to relevance and consistency: irrelevant
and inconsistent assessment will not be cost-effective.
Methods
This qualitative study was conducted in four overlap-
ping phases over a period of one year starting in March
1998. The first phase provided a mapping of the current provision of postgraduate dental education and its
assessment in vocational, basic and specialist training
including examinations and inspection visits. Evidence
was obtained from existing data, previous research and
interviews with two postgraduate dental deans and a
representative from the Joint Committee for Specialist
Training in Dentistry (JCSTD). Representatives from
other national bodies were also interviewed later in the
study.
In the second phase a more detailed study of
assessment in practice was undertaken, in order to
explore the ways assessment is experienced by assessors
and trainees. A range of postgraduate training pro-
grammes and placements (in primary and secondary
care, and in general and specialist training) were
selected from the West Midlands Deanery for more
detailed study of the policy and practice of assessment.
Published course curricula and examination syllabi,
and trainee logbooks/portfolios, and other assessments
and records were gathered and analysed. Semistruc-
tured interviews were conducted with trainers and
trainees in the West Midlands Deanery. In secondary
care, this involved five consultants, 13 house officers/senior house officers (including those undertaking the general professional training (GPT) 'package'1), and 13
specialist registrars. In primary care evaluation meet-
ings with vocational trainees (vocational dental practi-
tioners (VDPs)) were observed, and interviews were
held with those responsible for the GPT package. In
addition, vocational training advisors were consulted at
their national conference, and four were followed up
individually.
In the third phase the systems used to ensure effective management of assessment, including inspection procedures, were investigated. Opinion from those key informants identified in phase 1 was sought through semistructured interviews and expert panels, in order to investigate the current position and potential for development of the systemic management of assessment. Those consulted included representatives from the General Dental Council (GDC), Postgraduate Dental Deans and deans of dental schools, the Royal College of Surgeons of England, the Hospital Recognition Committee, two Specialist Advisory Committees (SACs), vocational training advisors, and the Committee for Vocational Training (CVT).

Analysis relating to the earlier phases was provided in phase 4, together with the preparation of reports and recommendations.

Key learning points

There is considerable scope for improving current assessment procedures.

There is a need for assessment to support an integrated period of general professional training.

Variation, particularly trainer variation, could be addressed by training for trainers and inspection procedures.

Links between assessment and patient care could be strengthened.

Better data are needed on the allocation of resources to assessment, in particular clinicians' time.
Results
Phase 1 provided an overview of postgraduate dental
education, that is, career paths and assessment within
it. Briefly, on completion of a BDS, most young dentists undertake one year's vocational training within an
approved practice, during which time they keep a
professional development portfolio (PDP). The
experience is signed off at the end of the year by their
Postgraduate Dental Dean, allowing the trainee (VDP)
to enter independent NHS general practice. Others will
undertake a year as a house officer or senior house officer, during which time they complete a manual or
logbook. They may choose to take a Membership of the
Faculty of Dental Surgery (MFDS) examination
(secondary care) or the Membership of the Faculty of
General Dental Practitioners (MFGDP) (primary
care). In either order, this experience forms a period of
general professional training. Those wishing to specialize will spend at least one year as a house officer/senior house officer and must pass the MFDS examination to be eligible for specialist registrar training posts.8
Assessment processes used during their specialist
training include the use of logbooks, SAC/JCSTD
structured assessments, record of in-training assess-
ment (RITA) panels, and Membership and Fellowship
examinations. In this study, distinction was made
between the records (RITAs, logbooks, PDPs) and the
assessments (JCSTD and Membership exams).
Findings are presented here in three sections, relating
to the assessment of vocational training, house officer/senior house officer training, and the training of
specialist registrars.
Vocational training
Vocational training is not assessed in any formal sense.
The vocational training year was described as providing
experience in clinical work, for which basic levels of
competence are already assessed by university final
examinations.9 There are no pass/fail criteria as such,
and some respondents suggested that trainees receive
certification based mainly on attendance. There are,
however, three educative elements of the vocational
training year included in this evaluation: the study days,
the PDP and the weekly tutorial hour with the trainer in
general dental practice.
VDPs have provision for 30 full study days during the
vocational training year which represent a substantial
resource commitment within the system. However,
views on the efficacy of these days as an opportunity for
ongoing education were varied. They were not linked to
preparation for qualifications, as the house officer/senior house officer study days are linked to the MFDS,
for example.
A range of views on the merits of the PDP were
found (and are reported in other work10±12). The idea
of the PDP as a reflective tool was welcomed by some
trainers, but many VDPs found it repetitive, lengthy to
complete and lacking in relevance. As a result many
admitted to `minimal compliance' in maintaining the
document. It was also noted that problems need to be
addressed as they arise rather than to be re¯ected on
some time later. This perhaps suggests a need for more
timely (continuous) formative assessment.
The success of the tutorial hour was dependent upon
the individual trainers who were likely to vary in: (a)
their prior knowledge; (b) their enthusiasm for the
educative role, and (c) their knowledge of the assess-
ment protocols and the instruments to use. It appears
that the tutorials themselves were not always under-
taken seriously or considered to be necessary.
There is also variety between regions, as evidenced in
documents, with regard to the content and format of
study day programmes and the involvement of formal
programme committees to plan and monitor standards.
Nor is there a standard format between regions for
selection of trainers and their training for the role.
Although local and regional flexibility can be defended,
issues of consistency should be considered.
Thus, there was considerable variety (or lack of
consistency) between training experiences, and many
trainers acknowledged a need to improve consistency.
This would also be welcomed by trainees.
House officer/senior house officer training
The formal assessment of house officer/senior house officer training is by means of the MFDS examination, although this is optional. House officers/senior house officers now contractually receive half a day per week of
formal teaching which focuses on the MFDS syllabus.
This reflects the way in which training is becoming
more structured.
During the period of this study, a new national logbook/manual for house officers/senior house officers
was introduced, but these were not yet being widely
used in the West Midlands. It is, however, a record and
not an assessment: it records activities undertaken, and
not a statement of quality measured against defined
criteria. The role of the house officer/senior house officer logbook in career progression is not yet known.
Concerns were expressed by those expected to
implement the national logbook that it was overly
complex and might also meet with trainee resistance.
The clinical experience of house officers/senior house officers varies considerably between different hospitals
and specialties, making comparison difficult. Nevertheless, in the West Midlands, logbooks were used with
some success by those on the formal general profes-
sional training programme, for recording experience as
well as for planning future work.
The MFDS exam is in three parts which are related
to 'a knowledge and understanding of the clinical practice and science of dentistry sufficient to enter formal training in one of the dental specialties'.
Candidates must have passed Parts A and B before they
can take Part C, after a minimum of 20 months in
general professional training. Most house officers/senior house officers interviewed considered the MFDS to be relevant to their house officer/senior house officer training, although the experience gained in the primary
care element of general professional training was not
directly linked to the MFDS, nor were VDPs encour-
aged to prepare for it.
House officers/senior house officers undertaking
rotating posts felt that assessments of each part of the
rotation were generally unreliable, based on a non-
standard, often verbal report. The house officer/senior house officer interviewees understood consultants'
references to be important to career progression but
were unclear about what factors would be highlighted.
As with vocational training and specialist registrar
training, there was believed to be considerable variation
between training experiences more generally. Both
trainers and trainees thought that the actual level,
nature and quality of consultant input varied by
individual, the trainee and the unit.
Respondents at the national level felt that the role of
the trainers in assessment could be more structured,
and therefore consistent, but were aware of intervening
factors: few were trained for their educational and
assessment role; nor were they paid or allowed time for
it. Trainers of specialist registrars also noted an increase
in the time spent 'assessing' as a result of more structured training, and many of these would also have had responsibility for house officers/senior house officers.
Inspection bodies have a role in maintaining stan-
dards of training and reducing variations. Members of
the Hospital Recognition Committee (HRC) felt efficacy was limited by the size of the task (it inspects approximately 780 house officer/senior house officer
posts in the UK), the fact that inspections varied by
inspectors, who were not trained in inspection and, as
one member put it, by the 'smoke, camouflage and fresh paint' applied by the unit undergoing inspection.
Inspection covers practical matters (accommodation)
in addition to training and educational issues.
General professional training
There was widespread support for the notion of general
professional training in both primary and secondary
care settings. One purpose is to allow more informed
career choice. The view was also expressed that trainees
with clear career goals should not be obliged to
undertake training in both primary and secondary care
settings as it merely serves to extend the period they
spend in training.
Many trainers supported the notion of a common
examination for general professional training, rather
than the current reciprocity between Part A of the
MFDS and Part 1 of the MFGDP. This is partly on
principle: a common examination articulates well with
the concept of general professional training. In practice,
many trainees were against the notion of reciprocity as
the two examinations have different purposes. The
MFDS was generally seen as being of higher status and providing greater flexibility.
Assessment of specialist registrars
Trainees need an MFDS or 'equivalent' to enter specialist training. Entry to specialist training is competitive, and there is a serious bottleneck in some
specialties (orthodontics and restorative dentistry were
given as the worst examples). In recruitment, therefore,
additional strengths are sought. These include more
experience, experience of specific specialist areas, and
sometimes publications in refereed journals. Some
specialist registrars also held Masters degrees, but
trainers were concerned lest this should become the
norm. However, given the competitive nature of career
progression it is not clear how this situation could be
avoided.
On the whole the 13 respondents considered assess-
ment (however interpreted) to be relevant to the
training and to career progression. Whilst the logbook
records quantified experience, in some specialties it may also contribute to the final college exam. There appeared
to be a good match between the content of specialist
training and JCSTD/SAC assessments which feed into
the RITA process and require trainers to make quality
judgements. However, some concern was expressed that
the RITA referral process was not used enough.
Variation is also an issue in the assessment of spe-
cialist registrars. Few trainees considered that training
would be comparable between posts, but they did not
feel this to be a problem. As with house of®cer/senior
house of®cer training, the completion of logbooks var-
ies by departments; consultants vary in attitude and
aptitude, and assessment methods and procedures vary
by post and by individual trainers and the time available
to them. The relationship between trainer and trainee is
an intense one, which some thought could lead to a
weakening of the prospects for informal and non-
threatening appraisal. Some respondents also consid-
ered the reference process to be inconsistent.
Inspection at this level by a Specialist Advisory
Committee (SAC) was considered to be robust and
effective. The number of inspections required is
manageable, and believed to be rigorous. SACs do
withdraw approval of training posts. Consistency is
enhanced through the SAC Chairs' membership of the
JCSTD.
Discussion
Although there is evidence of good practice in the
assessment of postgraduate dental education, there are
clearly some problems with current approaches. Here
we take three aspects of training, that is, vocational,
house of®cer/senior house of®cer and specialist registrar
training, and consider them in terms of our three
evaluative concepts of relevance, consistency and cost-
effectiveness. It should be noted that there is no formal
assessment of the vocational training year, and very
little of house officer/senior house officer training, and this has ramifications through each of our evaluative
concepts. The assessment during general professional
training (vocational training and house officer/senior house officer) may or may not have content, construct
or predictive validity, but information from which to
make judgements is sparse owing to the informal
arrangements associated with this assessment.
Relevance (content, construct or predictive validity)
The way the PDP is actually used in vocational training
renders its validity unverifiable. Most formal assessments (logbooks and examinations) were considered
relevant to training and to progression (content and
predictive validity). However, there were some issues
concerning construct validity. For example, the MFDS
does not identify a 'good pair of hands', considered
essential to clinical practice, nor do current assessments
reflect performance needs, for instance the ability to
work under pressure. The national logbook for house
officers/senior house officers might be viewed as a step
towards agreement on what should be assessed during
this training; however it would need to be developed
further for this role and used more formatively, to
identify strengths, weaknesses, and gaps in training.
Specialist training shows a better match between
training and assessment (content validity), than the two
elements of general professional training: the curriculum is broadly and clearly defined (enabling construct validity in assessment), and trainers are asked to
make quality statements about trainees.
There is both reason and scope to make assessments
within postgraduate dental education more authentic,
i.e. based directly on clinical work relevant to patient
care. A larger part of the assessment could, for ex-
ample, be based on real patient treatment histories over
a longer period and could include some assessment of
patient progress and recovery time. This would also
provide a more reliable signal of clinical competence,
that is enhance predictive validity. Other methods for
consideration might include the use of objective structured clinical examinations,12 standardized patients, and computer-based examinations or simulations, for example. The Membership in General
Dental Surgery (MGDS), a continuing rather than
initial professional development qualification, open to
dentists with 5 years' experience, provides an example
of innovative assessment which includes a practice
visitation and examination of two cases with the patient
present. The Fellowship of the Faculty of General
Dental Practitioners (FFGDP) is a further advanced
qualification available to general dental practitioners
which, in addition to the use of patients in the assess-
ment, includes two other innovations worth noting.
The first is the use of videorecording to demonstrate
interpersonal skills. The second is the requirement to
present audit, and more particularly, patient satisfac-
tion survey data.
Education qualifications signal that workers have the necessary skills (assuming that the assessment is 'relevant' or has 'predictive validity'), and assessment is
the means by which there is a formal articulation
between training and career progression. Several issues
emerge from this. Trainees need clari®cation about the
career paths open to them from various qualifications.
The current duplication of qualifications or taking of
irrelevant examinations is not cost-effective. References
provided for the trainee provide a mechanism which
may ease or limit progression, and yet trainees are
largely ignorant of what matters. Moreover, some of the
trainers interviewed doubted the objective value of
references. This may provide further reason for national
agreement and guidance on what should be assessed.
A national agreement would also support flexibility. A modular credit accumulation model would facilitate movement between general practice and specialist training at all career stages more effectively than current assessment systems.2
Consistency (comparability and reliability)
Variation, particularly that associated with trainers,
affects the consistency of assessment and, for the sys-
tem to be regarded as fair with standards being applied
equally, this issue should be addressed. Variation may
not negatively affect the relevance of the programmes,
but will render judgements made about individuals
unreliable. Variation in assessment could be addressed
through more formalized, standard periods of training;
the training of trainers (and inspectors), and the
development of more robust moderation and inspection
procedures. Specifically in vocational training, one
approach would be to develop the existing CVT
Guidelines further. A standardized system would sup-
port the competency assessment,13,14 the development
of which many vocational training advisors regarded as
inevitable. Such assessment methodology is being
developed, mainly in undergraduate dentistry, and its
feasibility and acceptability need greater exploration. It
is also an approach which might serve to make the
length of training more flexible as it supports openness
within a structured assessment.15 However, as GDPs
play a greater role in the training of newly quali®ed
dentists their role as educational facilitators needs
supporting.
Cost-effectiveness
It is important that assessment systems offer value for
money, and this can be interpreted broadly as 'a concern for efficient resource usage'. If assessment is not undertaken efficiently, then it is not likely to be cost-effective. This definition prompts a number of resource
issues.
One way of ensuring optimal resource use in
assessment has already been discussed, that is,
strengthening the link between what is taught and the
amount and content of what is assessed (i.e. ensuring
content validity). Establishing the optimal volume of
assessment is also fundamental, but is not straight-
forward.
Current assessment methods draw heavily on the
time of clinicians, both trainers and trainees. Yet such
clinicians are highly skilled and highly paid, and have a
high 'opportunity cost'. Furthermore, their individual
performance varies. Therefore some consideration
might be given to redistributing the burden of assess-
ment. One approach might be to utilize physical
resources more, such as computerized methods, or to
use only those consultants with particular aptitude in
training and assessment.
There is also an issue concerning the number of
agencies which devise, implement and inspect training,
with each task absorbing resources. More standardized
large-scale (national) assessments would reduce costs.
Cooperative arrangements might be sought to reduce
the number of assessor agencies and the duplication of
effort (and in so doing, improve reliability).
Comment must be made about the paucity of data
and infrequent application of cost-effectiveness tests. In
order to improve resource use, it is essential that
monitoring, evaluation and inspection systems collect
costs and outcomes data and that individual agents
work within a system where efficiency and cost-effectiveness are encouraged.
Conclusion
In summary, five key issues may be distilled from the findings: (i) assessment of general professional training;
(ii) trainer variation in assessment; (iii) the extent to
which assessments are appropriate indicators of later
success; (iv) the relationship between assessment and
patient care, and (v) data to assess the costs of assess-
ment.
Vocational training is not assessed in any formal
sense, which makes it difficult to evaluate. The same is
true of house officer/senior house officer training. There
is a need for assessment to support an integrated period
of general professional training.
Specialist training, which has undergone consider-
able review recently, shows more relevance, but there
remain issues about its consistency, specifically concerning reliability, and cost-effectiveness. Value for
money, that is, ensuring the efficient allocation of
resources, in particular, clinicians' time, requires
more data than are currently accessible.
There is considerable evidence of trainer commitment
to the principle of better assessment of more structured,
flexible training, and some evidence of particular measures to achieve this aim. However, such aspirations are
currently undermined by the 'considerable confusion'8
within the system, perhaps as a result of the evolu-
tionary nature of the changes. There is a need to
overcome variation, particularly trainer variation in
assessment, by training for trainers and for inspection
procedures.
Links between assessment and patient care could be
strengthened. More authentic assessments, based
directly on clinical work, may contribute to improve-
ments in patient care, and grading cases and posts may
improve the robustness of the system. However, there is
a danger that attempts to make assessment more closely
related to working contexts and more patient-focused
(relevant) will produce additional problems in consis-
tency unless underpinned by a strong, coherent and
explicit assessment framework.
Acknowledgements
The project team is grateful for the financial support of
the Department of Health through the Post-registration
Medical and Dental Education Research Initiative. The
views and opinions expressed are the authors' and do
not necessarily reflect those of the Department of
Health. Valued critical comment was provided by
Mr Ken Eaton at the National Centre for Continuing
Professional Education of Dentists, 4th Floor, 123
Gray's Inn Road, London, WC1X 8TZ. We acknow-
ledge the support of all those who gave us access to
their meetings and allowed us to interview them
including trainees, trainers, Postgraduate Dental
Deans and representatives from other local and
national bodies.
Contributors
The authors are members of the Centre for Research in
Medical and Dental Education (CRMDE), based in
the School of Education, University of Birmingham.
Zoe Morris, BSc PhD, is a Research Fellow at the
Faculty of Social and Political Sciences, University of
Cambridge. She was the Research Associate on this
project. Alison Bullock, BA PhD PGCE, is a Senior
Research Fellow in the School of Education, University
of Birmingham. She was a co-investigator. Clive
Belfield, BA MA PhD, based at Teachers College,
Columbia University, USA, was a co-investigator. His
main interest is in the cost-effectiveness of programmes
of training. Sue Butterfield, BA PGCE PhD Dip Psych BSc, was a co-investigator. She has a special interest in
assessment. John Frame, BDS FDS MSc PhD, was the
principal investigator. He is Professor of Oral Surgery
and the Regional Director of Postgraduate Dental
Education, School of Dentistry, University of Bir-
mingham.
Funding
Financial support was provided by the Department of
Health through the Post-registration Medical and
Dental Education Research Initiative.
References
1 Working Group on Specialist Medical Training. Hospital
Doctors. Training for the Future (The Calman Report). London:
Department of Health; 1993.
2 Chief Dental Officer. UK Specialist Dental Training – Report from the Chief Dental Officer. London: NHS Executive; 1995.
3 General Dental Council. Preliminary Report of General Profes-
sional Training Committee of the GDC. London: GDC; 1997.
4 General Dental Council. The Next Two Years ± General
Professional Training. London: GDC; 1998.
5 Royal College of Surgeons. General Professional Training Work-
ing Party: Report. London: Royal College of Surgeons; 1995.
6 Frame JW, Bullock AD, Butterfield S, Morris ZS, Belfield CR.
An Evaluation of the Assessment of Post-registration Dental
Education (Final Report). Birmingham: University of Bir-
mingham; 1999.
7 Irvine D. The performance of doctors: the new profession-
alism. Lancet 1999;353:1174–7.
8 Barnard D. Specialisation in Dentistry. London: Faculty of
Dental Surgery, The Royal College of Surgeons of England;
1999.
9 Committee for Vocational Training. Evaluation of the Profes-
sional Development Portfolio. CVT 98/15 Document I. London:
CVT; 1998.
10 Joint Centre for Education in Medicine. Evaluation of the
Vocational Training Record Book as Part of the Vocational
Training Year. London: Joint Centre for Education in
Medicine; 1995.
11 SCOPME. The Early Years of Postgraduate Dental Training in
England. Dundee: The Centre for Medical Education
(SCRE); 1995.
12 Joint Centre for Education in Medicine. The Good Assessment
Guide: A Practical Guide to Assessment and Appraisal for Higher
Specialist Training. London: Joint Centre for Education in
Medicine; 1997.
13 Batchelor P, Albert D. Issues concerning the development of a
competency-based assessment system for dentistry. Br Dent J
1998;185:141–4.
14 Mossey PA, Newton JP, Stirrups DR. Defining, conferring and assessing the skills of dentists. Br Dent J 1997;182:123–5.
15 Hager P, Gonczi A. Professions and competencies. In: Edwards R, Hanson A, Raggatt P, eds. Boundaries of Adult Learning. London: Routledge; 1996.
Received 11 May 2000; editorial comments to authors 22 June 2000;
accepted for publication 27 July 2000