March 2007 ■ Journal of Dental Education 365

Educational Methodologies

The Correlation of Student Performance in Preclinical and Clinical Prosthodontic Assessments

Donald A. Curtis, D.M.D.; Samuel L. Lind, Ph.D.; Sheila Brear, B.D.S.; Frederick C. Finzen, D.D.S.

Abstract: Tracking student performance in preclinical and clinical courses can be helpful in developing and refining a curriculum. Our objective was to correlate student performance on three fixed prosthodontic examinations taken by eighty junior dental students. Examinations included a knowledge-based objective structured clinical examination (OSCE), a manual skills exercise completed on a typodont (Typodont), and a competency casting exam (Casting CE) on a patient. Multiple regression analysis indicated that the OSCE and Typodont exam scores, as independent variables, were not statistically significant predictors (P=0.07; P=0.87, respectively) of Casting CE exam performance, which was the dependent variable. Correlations were weak for the OSCE (r=0.21) and nearly nonexistent for the Typodont exam (r=0.03) when compared to the Casting CE. Our results indicate a weak correlation between an OSCE-based knowledge exam measuring students' knowledge of critical errors in preparations and castings and a competency exam involving the preparation of a full veneer crown. Results also indicate virtually no correlation between a typodont preparation examination designed to provide a measure of students' clinical skill and a clinical competency exam involving the preparation of a full crown.

Dr. Curtis is Professor, Department of Preventive and Restorative Dental Sciences, School of Dentistry, University of California, San Francisco; Dr. Lind is Associate Professor, School of Economics and Business Administration, Saint Mary's College of California; Dr. Brear is Health Sciences Assistant Clinical Professor and Division Chair, School of Dentistry, University of California, San Francisco; and Dr. Finzen is Health Sciences Clinical Professor and Chair of Prosthodontics, School of Dentistry, University of California, San Francisco. Direct correspondence and requests for reprints to Dr. Donald A. Curtis, Department of Preventive and Restorative Dental Sciences, D-3212, School of Dentistry, University of California, San Francisco, 707 Parnassus, San Francisco, CA 94143-0758; 415-476-5827 phone; 415-476-0858 fax; [email protected].

Key words: prosthodontics, objective structured clinical exam, student assessment, student performance

Submitted for publication 8/14/06; accepted 10/30/06

A dental curriculum is a four-year opportunity for dental educators to progressively develop a student's knowledge, skills, and attitudes with the hope of graduating a competent beginning dentist. As students progress through the dental curriculum, knowledge is incrementally gained from classroom, seminar, clinical, and peer interactions. Manual skills development occurs initially with simple preclinical bench-top typodont procedures and progresses to mannequin exercises before students enter clinics. There are threshold areas in a dental curriculum where assessments are completed to determine if a student has integrated knowledge and/or skills sufficiently before transitioning to the next level of the curriculum. An example of a threshold area in the fixed prosthodontics curriculum at the University of California, San Francisco (UCSF) is the examination to determine if a student has sufficient didactic knowledge and understanding of clinical procedures in fixed prosthodontics to transition into the clinics. A second example is for students to favorably complete a fixed partial denture (FPD) preparation on a typodont before transitioning further into clinical care. The purpose of this study was to determine the correlation between two threshold examinations in preclinical fixed prosthodontics and a clinical competency exam in fixed prosthodontics.

Predicting future professional achievement has been an elusive goal in dental education. Most measures used to predict achievement in preclinical and clinical activities have shown weak predictive values.1-12 Research has been completed evaluating the correlation of preadmission measures with preclinical performance,2,4,5,7,11,12 the correlation of preadmission measures with general clinical performance,9 and the correlation of assessments within or to a specific discipline.9,10


Of the research that has evaluated the value of preadmissions measures to predict student performance in preclinical courses, most studies have shown that the Perceptual Abilities Test (PAT) is a better predictor of preclinical technique course grades than preadmission college grade point average (GPA) or science grade point average (SGPA).1,2,4,7 However, the PAT remains only modestly correlated to student performance in preclinical courses.4,7

While investigators have determined modest correlations between various preadmissions assessments and preclinical performance, only weak correlations have been shown between a student's preadmissions background and overall clinical performance or performance in a specific dental specialty.9,11 Potter et al., in a study of two graduating dental school classes, showed that clinical performance correlated weakly with preadmission GPA, preclinical basic science GPA, or dental school lecture or laboratory performance.9 Bellanti et al., in a study of 344 students, correlated Dental Admission Test (DAT) scores to performance in fixed prosthodontics and showed that the DAT carving test and preadmissions GPA were better predictors of success in preclinical prosthodontics than other DAT subscores but were weak predictors of clinical prosthodontic achievement.11 As part of a larger study comparing problem-based learning pedagogy to traditional teaching, Rich et al. reported that the variance of the score on a clinical periodontal assessment was minimally explained by the preclinical midterm and final exams.10 Rich et al. found it perplexing that the preclinical assessments could not predict clinical performance in the same specialty.

The objective structured clinical examination (OSCE) is a standardized approach to evaluate performance in simulated situations.13-15 In many curricula the OSCE is used as a formative assessment to provide feedback to students.16 Some educators have argued that the OSCE has sufficient validity to be used as a "gateway event," where failing students must complete remedial education before progressing.17 Unfortunately, few investigators have examined the OSCE's predictive validity.18

Although previous investigations have shown a modest predictive ability of preadmission assessments to professional achievement in dental school, few studies have evaluated the predictive value of preclinical performance in a specialty to clinical performance in the same specialty.10,11 Accordingly, the purpose of this investigation was to correlate two preclinical prosthodontic assessments (an OSCE assessment and a manual skills typodont preparation) with a clinical assessment of student performance in fixed prosthodontics.

Methods

Eighty-two students from the UCSF graduating class of 2005 were proposed for inclusion in this investigation. Two students did not participate in the study: one because of an extended illness and the other because of academic probation. Therefore, eighty students participated in this study. As part of their normal requirements, all students completed a knowledge-based objective structured clinical exam (OSCE), a skills-based typodont examination (Typodont), which included the preparation of a three-unit fixed partial denture (FPD), and a patient-based competency exam (Casting CE) that included a full veneer preparation, fabrication of a provisional, and delivery of a full crown.

Test results were collected with code numbers replacing names so that confidentiality was maintained. The protocol for the study was reviewed and approved by the Committee on Human Research (CHR) at the University of California, San Francisco.

Objective Structured Clinical Exam (OSCE)

The OSCE typodont-based assessment included ten stations with twenty-seven total questions, rendering sixty-three possible points. The OSCE assessment included two parts: five stations with questions about completed typodont FPD preparations and five stations with questions about completed FPD castings. At each station, students were provided mylar strips, an explorer, a mirror, floss, and articulating paper. Typodonts were placed in mannequins oriented in a dental chair in a reclined position that allowed direct vision, but students benefited from using a mirror for indirect vision.

Five stations of the OSCE included basic questions about FPD preparations completed on typodont teeth #18 and #20 with #19 as a pontic. Questions addressed the taper of individual typodont teeth, draw, occlusal reduction, and margin placement. Questions also included items related to an acrylic provisional restoration that was fabricated, such as why a provisional restoration may have fractured (narrow connector) or what was wrong with a provisional restoration (occlusion too high). Questions were formatted as multiple choice questions having four possible answers, with the one correct answer to be identified.

The second part of the OSCE assessment included five stations where three-unit cast FPDs had been fabricated and placed on prepared typodont teeth #18 and #20 with #19 as a pontic. Two or three critical errors were waxed and cast in each of the five FPDs and included the following: occlusion that lacked contact with the opposing arch, a casting with a facial margin that was short of the prepared margin, a contact that was binding and kept the FPD from seating, an open interproximal contact, damage to an adjacent tooth, and improper finishing of the restoration (Figure 1). The questions required a written response, and students were asked to evaluate each FPD problem and state whether the casting was correctable or not. If the casting error was thought to be correctable, such as a missing mesial contact that could be soldered, the corrective action needed to be identified. One of the five cast FPDs was an ideal casting that students were asked to identify. The three-unit castings were cast from type 4 gold (Jelenko, San Diego, CA).

The completed OSCE examinations were collected and coded by a staff member so that the faculty member grading the examination was blinded to the student's identity. One faculty member graded all OSCE exams.

Figure 1. The objective structured clinical examination (OSCE)
1A. The first preclinical exam was a ten-station OSCE, where students were provided necessary instrumentation and asked to identify preparation errors and problems with cast FPDs.
1B. Example of an OSCE station where students were asked to identify problems with completed typodont preparations (in this case, a mesial undercut on the molar).
1C. An example of an open margin on the molar and an open mesial contact students were asked to identify on the OSCE.

Manual Skills Examination (Typodont)

The manual skills examination included students preparing a three-unit posterior fixed partial denture (FPD) on typodont teeth, followed by the fabrication of a provisional restoration. The FPD included full veneer crowns on #18 and #20 with #19 as a pontic. Typodont tooth #17 was present. Typodonts were placed in a mannequin and positioned in a reclined clinic chair to simulate patient care as closely as possible (Figure 2). The examination protocol also included following standard infection control procedures. Students were given three hours to complete the FPD preparations and provisional restoration. Scores were given for eight subdivisions on the preparation and five on the provisional, for a total possible score of 130 points.

The typodonts were removed by staff and graded by one of three calibrated faculty. Each exam was coded so that the faculty member grading the exam was blinded to the student's identity. The completed preparations were evaluated for occlusal and axial reduction, margin placement, taper of individual preparations and presence of undercuts between preparations, assessment of adjacent teeth, and finish. The provisional restoration was evaluated for marginal adaptation, occlusion, proximal contacts, pontic contour, embrasures, and finish.

Casting Competency Examination (Casting CE)

The casting competency assessment involved assessing a student's skills in preparing a tooth for a full crown, fabricating and adjusting a provisional, and delivering a permanent crown (Figure 3). The completed preparation and provisional restoration were assessed by calibrated faculty. Proper occlusal and axial reduction, retention and resistance form, appropriate taper and draw, margins, and assessment of adjacent teeth were evaluated. Additionally, the provisional was evaluated for proper contacts, occlusion, and finish. The score was a composite of ten sections of the exam with 100 points possible. The criteria for evaluating the casting competency examination were nearly identical to the criteria for evaluating the manual skills typodont examination.

Figure 2. The second preclinical examination: a typodont exercise on a mannequin positioned in a dental chair that included preparing a three-unit FPD and placing a temporary

Figure 3. The third examination, which included preparation for a full coverage restoration, fabrication of a temporary, and placement of a completed casting on a patient

Data Analysis

Descriptive statistical measures of the mean, standard deviation, range of scores, and coefficient of variation were determined to describe the characteristics of the OSCE assessment, the manual skills (Typodont) assessment, and the casting competency examination (Casting CE) (Table 1). The Coefficient of Variation (CV) provided a basis for a comparison of the relative variation in the data. Scatter diagrams were plotted to visualize the distribution among the three variables (Figures 4 and 5).

Table 1. Descriptive statistical measures

Variable                 Mean    Standard Deviation    Coefficient of Variation    Min, Max
Clinical Casting Exam    92      6.1                   7%                          71, 100
OSCE Exam                38      7.2                   19%                         19, 55
Typodont Exam            82      24.8                  30%                         22, 128
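To illustrate how the measures in Table 1 relate to one another, the short sketch below (Python with NumPy) computes the mean, sample standard deviation, and coefficient of variation for a score series; the scores are simulated placeholders whose distribution loosely follows the Casting CE row of Table 1, not the study data.

    import numpy as np

    # Simulated Casting CE scores (placeholders, not the study data)
    rng = np.random.default_rng(0)
    casting_ce = rng.normal(loc=92, scale=6.1, size=80)

    mean = casting_ce.mean()
    sd = casting_ce.std(ddof=1)   # sample standard deviation
    cv = sd / mean                # coefficient of variation (relative dispersion)

    print(f"mean = {mean:.0f}, SD = {sd:.1f}, CV = {cv:.0%}")

With the values reported in Table 1, the CV works out to 6.1/92, or about 7 percent, for the Casting CE, compared with 7.2/38 (about 19 percent) for the OSCE and 24.8/82 (about 30 percent) for the Typodont exam.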

Multiple regression analyses were completed with the OSCE and Typodont exams as independent variables and the Casting CE as the dependent variable. We tested several hypotheses using exhaustive iterations of the independent variables as predictors of the dependent variable, including 1) OSCE as a single independent variable, 2) Typodont as a single independent variable, and 3) OSCE and Typodont as multiple independent variables. Multiple correlations were calculated among the three data sets using Pearson's product-moment correlation coefficients (Tables 2 and 3).
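A minimal sketch of this kind of analysis is shown below (Python with SciPy and statsmodels); the score arrays are simulated placeholders with roughly the distributions of Table 1, and the variable names are ours, not the study's.

    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    # Simulated scores for 80 students (placeholders, not the study data)
    rng = np.random.default_rng(1)
    osce = rng.normal(38, 7.2, size=80)
    typodont = rng.normal(82, 24.8, size=80)
    casting_ce = rng.normal(92, 6.1, size=80)

    # Pearson product-moment correlations with the Casting CE
    r_osce, p_osce = stats.pearsonr(osce, casting_ce)
    r_typo, p_typo = stats.pearsonr(typodont, casting_ce)

    # Multiple regression: Casting CE as the dependent variable,
    # OSCE and Typodont as independent variables
    X = sm.add_constant(np.column_stack([osce, typodont]))
    model = sm.OLS(casting_ce, X).fit()

    print(f"r(OSCE, Casting CE) = {r_osce:.2f}, r(Typodont, Casting CE) = {r_typo:.2f}")
    print(model.pvalues)     # P values for the intercept, OSCE, and Typodont terms
    print(model.rsquared)    # coefficient of determination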

An overall Kappa statistic was calculated to quantify the level of inter-rater reliability.
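The text does not specify how the kappa was computed; one common approach, sketched below under the assumption that two calibrated raters independently scored the same set of exams on a categorical scale, is Cohen's kappa (with more than two raters, a statistic such as Fleiss' kappa would be used instead). The ratings shown are hypothetical.

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical categorical ratings of the same ten exams by two faculty raters
    rater_a = ["pass", "pass", "fail", "marginal", "pass", "fail", "pass", "marginal", "pass", "pass"]
    rater_b = ["pass", "marginal", "fail", "marginal", "pass", "pass", "pass", "marginal", "pass", "fail"]

    kappa = cohen_kappa_score(rater_a, rater_b)
    print(f"kappa = {kappa:.2f}")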

Results

For the OSCE examination, the mean score was 38 ±7.2 with a range of 19 to 55. For the Typodont preparation examination, the mean score was 82 ±24.8 with a range of 22 to 128. For the casting competency clinical examination, the mean score was 92 ±6.1 with a range of 71 to 100. The Coefficient of Variation, a comparison measure of relative dispersion, showed that the Casting CE series varied in a narrow range (7 percent) while the OSCE (19 percent) and Typodont (30 percent) data were much more volatile. Scatter plots provided a visual representation of these differences in the data variation (Table 1, Figures 4 and 5).

Figure 4. Scatter plot showing a low correlation between the OSCE and Casting CE (x-axis: OSCE score; y-axis: Casting CE score)

Figure 5. Scatter plot showing a low correlation between the Typodont exam and Casting CE (x-axis: Typodont score; y-axis: Casting CE score)

Multiple regression analysis indicated that the independent variables (OSCE P=0.07; Typodont P=0.87) were not statistically significant predictors of the dependent variable, Casting CE (Table 3). Alternative regression models of the independent variables also demonstrated that, individually, neither was a significant predictor of the dependent variable. Correlation coefficients between Casting CE and OSCE (r=0.21), between Casting CE and Typodont (r=0.04), and between OSCE and Typodont (r=0.08) further confirmed virtually no relationship among the variables (Table 2). In addition to the lack of statistical significance for the independent variables, coefficient of determination values (R-Squared: 4.5% OSCE, 0.1% Typodont) suggested no explanatory power of the independent variables for the dependent variable. The lack of association among the variables can be observed in the scatter diagrams (Figures 4 and 5).
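For a regression with a single predictor, the coefficient of determination is simply the square of the Pearson correlation coefficient; as a check on the OSCE-only model reported above:

    R^2_{\text{OSCE}} = r^2 = (0.21)^2 \approx 0.044 \approx 4.5\%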

An overall Kappa statistic quantifying inter-rater reliability was calculated for the Typodont exam (0.78). One faculty member graded all OSCE exams, so a Kappa score was not calculated for the OSCE exam.

Table 2. Correlation analysis

Variable                 Clinical Casting Exam
Clinical Casting Exam    1.00
OSCE Exam                0.21
Typodont Exam            0.04

Table 3. Multiple regression analysis

Variable         P Value
OSCE Exam        0.07
Typodont Exam    0.87

Discussion

Our results indicate, at best, a weak correlation between a preclinical OSCE-based exam measuring a student's ability to identify critical errors in preparations and castings and a clinical competency exam involving the preparation for and delivery of a full veneer crown. This finding was surprising to us. We would have expected that students doing well on the OSCE exam, with demonstrated skills to detect errors in castings and preparations, would be better prepared to achieve a more favorable result on the casting competency examination. However, our findings are consistent with Potter et al., who found no significant correlation between preclinical grades and clinical grades.9 Additionally, our findings are consistent with Rich et al., who demonstrated a low correlation between preclinical assessments of student performance and clinical performance in the same specialty (periodontics).10 Although some educators have argued that the OSCE has sufficient validity to be used as a threshold examination,17 the results of our study would lead us not to consider using the OSCE as a "must pass" before continuing in the curriculum. Our recommendations for the use of the OSCE in our curriculum would be consistent with Mavis et al., who felt the OSCE is best utilized as a formative assessment by which to provide feedback to students16 rather than as a threshold examination.

Our results also indicate a nearly nonexistent correlation between a preclinical typodont preparation examination designed to provide a measure of a student's psychomotor skill and a clinical competency exam involving the preparation of a full crown. This also surprised us. We expected that students who did well on a psychomotor-based skills examination in a preclinical exercise would be better prepared to do well on a clinical casting competency examination, especially when the examinations were within the same specialty. However, others have found a low correlation between preclinical and clinical performance in the same specialty.10 The reasons for a low correlation between student performance in preclinical laboratory exercises and in clinical assessments have been explored by Chambers, who suggested that the preclinical and clinical contexts differ substantially and that the factors that contribute to success in each may also differ.19 Chambers also suggested that overlearning, or practicing a skill beyond the point of initial correct performance, may be helpful to transfer skills from preclinical to clinical contexts.


The question has to be asked why the preclinical examinations are not more predictive of future clinical performance. Chambers addressed this question by suggesting that the preclinical and clinical contexts differ and that skills such as communication, decision making, and management are new issues that students entering the clinics must face and are not issues in the consistent and tightly controlled preclinical environment.19 This concept is supported by Zakay and Wooler, who found that training conducted under normal conditions improved decision performance under normal conditions but did not improve performance under conditions of reduced time.20 It could be that the preclinical and clinical conditions are sufficiently different, as Chambers has suggested,19 that the conditions for success require different skill sets.

One might expect a stronger correlation of the OSCE to the Casting CE if the format of our OSCE more closely simulated patient care. In a very interesting study of 137 medical students, Wilkinson and Frampton found that a clinically based OSCE was a stronger predictor of subsequent clinical performance than traditional essay and multiple choice questions.21 However, they also found that combining the traditional essay and multiple choice exam increased predictive validity, something our results did not show.

Although the preclinical exams are not significant predictors of clinical performance, the preclinical exercises and exams still have merit in providing repetition on tasks similar to the clinical experience. Chaiken et al. have shown that psychomotor abilities do improve with practice and that processing speed, or the strength in the ability to complete visual extrapolation tasks, and time estimation tasks were the most important predictors for improvement of psychomotor skills.22 This may explain why investigators have shown that the PAT is more predictive of preclinical student performance than many other measures.4,5

The coefficient of variation (CV) allows a comparison of the variation in the data and showed that the Casting CE varied in a very narrow range (7 percent) while the OSCE (19 percent) and Typodont (30 percent) data were much more variable. The wide dispersion of the preclinical assessments may be a result of students not placing as much emphasis on preclinical exams and being willing to accept a lower grade. The narrow CV for the Casting CE could be related to faculty reluctance to give a low grade on a high-stakes assessment, as well as the possibility that students had become proficient by the time they took the Casting CE.

An important assumption of regression analysis is the absence of multicollinearity: there should be no highly correlated relationship between two or more of the predictor variables. Multicollinearity makes it difficult to obtain unique estimates of the regression coefficients because there are an infinite number of coefficients that would satisfy the regression model. Multicollinearity was tested for and ruled out by the correlation between the two independent variables (r=.08) and by modeling the two independent variables separately. As such, no multicollinearity was observed.
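A sketch of how such a screen for multicollinearity might be carried out is shown below (Python with statsmodels; simulated placeholder scores). It examines the correlation between the two predictors and their variance inflation factors; a low between-predictor correlation and VIFs near 1 both indicate that multicollinearity is not a concern.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Simulated predictor scores (placeholders, not the study data)
    rng = np.random.default_rng(2)
    osce = rng.normal(38, 7.2, size=80)
    typodont = rng.normal(82, 24.8, size=80)

    # Correlation between the two predictors (the study reports r = 0.08)
    r_predictors = np.corrcoef(osce, typodont)[0, 1]

    # Variance inflation factors for each predictor (values near 1 indicate
    # negligible multicollinearity)
    X = sm.add_constant(np.column_stack([osce, typodont]))
    vif_osce = variance_inflation_factor(X, 1)
    vif_typodont = variance_inflation_factor(X, 2)

    print(f"r = {r_predictors:.2f}, VIF(OSCE) = {vif_osce:.2f}, VIF(Typodont) = {vif_typodont:.2f}")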

Several variables may have confounded our results. The same faculty did not grade all the examinations, although faculty participated in calibration activities. The order of testing can have an influence on results,23 which might have occurred because the OSCE exam was given to small groups over a month-long period. It is possible that test information may have been shared by students who had completed the OSCE exam.

Future work will include continual evaluation of student assessments in hopes of providing a balance of formative and summative criteria by which objective and valid feedback can be provided to students.

REFERENCES

1. Boyle AM, Santelli JC. Assessing psychomotor skills: the role of the Crawford Small Parts Dexterity Test as a screening instrument. J Dent Educ 1986;50:176-9.

2. Boyd MA, Wood WW, Conry RF. Prediction of preclinical operative dentistry performance in two instructional methods. J Dent Educ 1980;44:328-31.

3. de Andres AG, Sanchez E, Hidalgo JJ, Diaz MJ. Appraisal of psychomotor skills of dental students at University Complutense of Madrid. Eur J Dent Educ 2004;8:24-30.

4. Gansky SA, Pritchard H, Kahl E, Mendoza D, Bird W, Miller AJ, Graham D. Reliability and validity of a manual dexterity test to predict preclinical grades. J Dent Educ 2004;68:985-94.

5. Gray SA, Deem LP. Predicting student performance in preclinical technique courses using the theory of ability determinants of skilled performance. J Dent Educ 2002;66:721-7.

6. Brown G, Manogue M, Martin M. The validity and reliability of an OSCE in dentistry. Eur J Dent Educ 1999;3:117-25.

7. Coy K, McDougall H, Sneed M. Issues regarding practical validity and gender bias of the Perceptual Abilities Test (PAT). J Dent Educ 2003;67:31-7.


8. Kao EC, Ngan PW, Wilson S, Kunovich R. Wire-bending test as a predictor of preclinical performance by dental students. Percept Mot Skills 1990;71:667-73.

9. Potter RH, Sagraves GO, McDonald RE. Professional performance among dental students: a factor analysis. J Dent Educ 1982;46:216-20.

10. Rich SK, Keim RG, Shuler CF. Problem-based learning versus a traditional educational methodology: a comparison of preclinical and clinical periodontics performance. J Dent Educ 2005;69:649-62.

11. Bellanti ND, Mayberry WE, Tira DE. Relation between selected predictor variables and grades in fixed prosthodontics laboratory. J Dent Educ 1972;36:16-21.

12. Gray SA, Deem LP, Sisson JA, Hammrich PL. The predictive utility of computer-simulated exercises for preclinical technique performance. J Dent Educ 2003;67:1229-33.

13. Chambers DW. Competency-based dental education in context. Eur J Dent Educ 1998;2:8-13.

14. Zartman RR, McWhorter AG, Seale NS, Boone WJ. Using OSCE-based evaluation: curricular impact over time. J Dent Educ 2002;66:1323-30.

15. Singer PA, Cohen R, Robb A, Rothman A. The ethics objective structured clinical examination. J Gen Intern Med 1993;8:23-8.

16. Mavis BE, Cole BL, Hoppe RB. A survey of student assessment in U.S. medical schools: the balance of breadth versus fidelity. Teach Learn Med 2001;13:74-9.

17. Martin IG, Jolly B. Predictive validity and estimated cut score of an objective structured clinical examination (OSCE) used as an assessment of clinical skills at the end of the first clinical year. Med Educ 2002;36:418-25.

18. Mavis BE, Henry RC. Between a rock and a hard place: finding a place for the OSCE in medical education. Med Educ 2002;36:408-9.

19. Chambers DW. Issues in transferring preclinical skill learning to the clinical context. J Dent Educ 1987;51:238-43.

20. Zakay D, Wooler S. Time pressure, training, and decision effectiveness. Ergonomics 1984;27:273-84.

21. Wilkinson TJ, Frampton CM. Comprehensive undergraduate medical assessments improve prediction of clinical performance. Med Educ 2004;38:1111-6.

22. Chaiken SR, Kyllonen PC, Tirre WC. Organization and components of psychomotor ability. Cognit Psychol 2000;40:198-226.

23. Blaskiewicz RJ, Park RS, Chibnall JT, Powell JK. The influence of testing context and clinical rotation order on students’ OSCE performance. Acad Med 2004;79:597-601.