
Page 1

cap.org v. 1

Gynecologic Consensus Conference
Working Group 3, Topic 6: General Quality
June 4, 2011

Page 2

• Joseph Tworek, MD (Senior Author)

• Lydia P. Howell, MD (Chair)

• Ritu Nayar, MD

• Sana O. Tabbara, MD

• Barbara Winkler, MD

• Lynnette Savaloja, SCT

• Nicole E. Thomas, MPH, CT(ASCP) (CAP Staff)

Working Group 3, Topic 6


Page 3

• Historical data and national benchmarks
  o Useful for smaller labs
  o Historical data will identify trends within lab
  o Published benchmarks may identify lab drift (see the sketch below)
    − National benchmarks not always available

Available methods for monitoring quality data
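A minimal sketch of the trend-versus-benchmark idea above, in Python. The metric, the benchmark range, and the monthly rates are invented placeholders rather than CAP values; it simply flags a rolling lab-wide rate that has drifted outside a published range.

```python
# Minimal sketch: detect drift of a lab-wide rate against a published benchmark range.
# The benchmark range and monthly rates are invented placeholders, not CAP values.
from statistics import mean

BENCHMARK_LOW, BENCHMARK_HIGH = 0.03, 0.09  # hypothetical published rate range

def check_drift(monthly_rates, window=6):
    """Compare the rolling mean of recent monthly rates with the benchmark range."""
    rolling = mean(monthly_rates[-window:])
    if not BENCHMARK_LOW <= rolling <= BENCHMARK_HIGH:
        return f"Rolling rate {rolling:.3f} outside benchmark {BENCHMARK_LOW}-{BENCHMARK_HIGH}: investigate"
    return f"Rolling rate {rolling:.3f} within benchmark range"

# Twelve months of hypothetical lab-wide rates trending upward
history = [0.060, 0.065, 0.070, 0.075, 0.080, 0.085,
           0.085, 0.090, 0.095, 0.100, 0.105, 0.110]
print(check_drift(history))
```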


Page 4

• Justification: Survey and website

• Monitoring laboratory-wide data against national benchmarks may provide a baseline to identify and stratify lab performance
  o Not valuable for labs with small numbers of primary screeners
  o Taken in context with other factors (eg, high-risk population)
• Comparing individual data to laboratory-wide data may help identify outliers (see the sketch below)
  o Retain with other QA documents

Statement: Selected metrics should be monitored individually, as well as globally for the laboratory.
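As one hypothetical way to compare individual data with laboratory-wide data, the sketch below flags screeners whose rate deviates strongly from the lab-wide rate. The names, counts, and the 2.0 z-score cutoff are assumptions for illustration, not a consensus method.

```python
# Minimal sketch: compare each individual's rate with the lab-wide rate and flag outliers.
# Names, counts, and the 2.0 z-score cutoff are illustrative assumptions only.
import math

def z_score(abnormal, total, lab_rate):
    """Approximate z-score of an individual's rate versus the lab-wide rate."""
    rate = abnormal / total
    se = math.sqrt(lab_rate * (1 - lab_rate) / total)
    return (rate - lab_rate) / se

# Hypothetical counts: screener -> (abnormal interpretations, total cases screened)
counts = {"CT-A": (48, 900), "CT-B": (15, 850), "CT-C": (60, 1000)}

lab_abnormal = sum(a for a, _ in counts.values())
lab_total = sum(t for _, t in counts.values())
lab_rate = lab_abnormal / lab_total

for name, (abn, tot) in counts.items():
    z = z_score(abn, tot, lab_rate)
    flag = "REVIEW" if abs(z) > 2.0 else "ok"
    print(f"{name}: rate={abn / tot:.3f}  lab={lab_rate:.3f}  z={z:+.2f}  {flag}")
```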


Page 5

A. Agree with entire statement 95.92%

B. Only individual quality data should be monitored; no global monitoring. 0%

C. Only global laboratory monitoring; no individual monitoring. 0%

D. Disagree with entire statement (ie, quality data should not be monitored at all). 2.04%

Vote#56: Selected metrics should be monitored individually, as well as globally for the laboratory.


Page 6

A. Agree with entire statement. 92.9%

B. Only cytotechnologist quality data should be monitored. 3.57%

C. Only pathologist quality data should be monitored. 1.79%

D. Disagree with entire statement (ie, individual quality data should not be monitored at all). 1.79%

Vote#57: Monitoring of selected metrics for individuals should include both CTs and pathologists.


Page 7

• Justification: Survey

• Quality metrics should be shared with each CT and pathologist
  o From survey, 59% and 81% of labs facilitate comparison of CTs to other CTs and to laboratory data, respectively
  o 48% and 60% of labs facilitate comparison of pathologists to other pathologists and to laboratory data, respectively
  o Table 3, page 55

• Lab mean data and/or individual data could be shared openly or privately, identified or de-identified at the discretion of the lab

Statement: Results of quality metrics should be shared with individual CTs and pathologists.
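For labs that choose to share results de-identified, a coded report could look roughly like the sketch below. The coding scheme, names, and rates are invented for illustration; the choice of open versus coded sharing remains at the lab's discretion.

```python
# Minimal sketch: share individual metrics against the lab mean using de-identified codes.
# The coding scheme (CT-01, CT-02, ...), names, and rates are invented for illustration.
from statistics import mean

rates = {"Smith": 0.052, "Jones": 0.047, "Lee": 0.071}  # hypothetical individual rates

# Assign stable de-identified codes; the key stays with the QA director.
key = {name: f"CT-{i:02d}" for i, name in enumerate(sorted(rates), start=1)}
lab_mean = mean(rates.values())

print(f"Lab mean rate: {lab_mean:.3f}")
for name, rate in rates.items():
    print(f"{key[name]}: {rate:.3f} ({rate - lab_mean:+.3f} vs lab mean)")
# Each CT or pathologist is told only their own code, at the lab's discretion.
```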


Page 8

A. Agree with entire statement. 98.39%

B. Quality metrics should only be shared with CTs. 1.61%

C. Quality metrics should only be shared with pathologists. 0%

D. Disagree with the entire statement (ie, quality metrics should not be shared at all). 0%

Vote#58: Results of quality metrics should be shared with individual CTs and pathologists.


Page 9

• Justification:
  o Survey
    − Most common is monthly (62.3% prepare quality report monthly).
    − Helpful in semi-annual evaluation of CTs (eg, determining screening limits)
    − Labs can decide whether to de-identify results when sharing:
      – Most labs (70% from survey) do not code CT and pathologist results to maintain confidentiality.

Statement: Results of quality metrics should be shared at least twice a year with individuals.


Page 10

A. Agree with entire statement. 65.6%

B. Time frame is too frequent. 0%

C. Time frame is too infrequent. 9.4%

D. Time frame should be left to discretion of the lab. 25%

E. Disagree with entire statement (ie, quality metrics should not be shared with individuals at all). 0%

Vote#59: Results of quality metrics should be shared at least twice a year with individuals.


Page 11

• Justification: Survey

• Multi-head review of difficult cases ranked second most useful quality metric

• 60% of labs conduct in-house review
  o Share interesting cases
  o Review of educational program slides
  o Hone diagnostic criteria
  o Review cases identified from QA
  o Review laboratory-generated study material

Statement: Reviewing selected cases for educational purposes is a useful quality tool.


Page 12

A. Strongly Agree 86.4%

B. Agree 13.6%

C. Disagree 0%

D. Strongly disagree 0%

Vote#60: Reviewing selected cases for educational purposes is a useful quality tool.


Page 13

Areas for future development


Page 14

• Justification: Survey

• Top 3 methods across metrics (HPV, cyto-histo correlation, 5-year look-backs, and immediate re-screen); see the sketch below:
  o Lab-defined action limits/thresholds (46%-66%)
  o Rate change (23%-35%)
  o Lab-defined action limits/thresholds from literature (21%-26%)

Insufficient data to determine best methods to identify variance in lab-wide quality metrics.
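A minimal sketch of the first two survey-reported approaches for a lab-wide metric: a lab-defined action limit and a period-over-period rate-change check. The threshold values and rates below are placeholders, not recommended limits.

```python
# Minimal sketch of two survey-reported checks for a lab-wide metric:
# (1) a lab-defined action limit, and (2) a period-over-period rate-change check.
# The limits and rates are placeholders, not recommended values.

ACTION_LIMIT = 0.08         # hypothetical lab-defined upper threshold
MAX_RELATIVE_CHANGE = 0.25  # hypothetical allowed change between consecutive periods

def check_lab_metric(previous_rate, current_rate):
    findings = []
    if current_rate > ACTION_LIMIT:
        findings.append(f"rate {current_rate:.3f} exceeds action limit {ACTION_LIMIT}")
    change = abs(current_rate - previous_rate) / previous_rate
    if change > MAX_RELATIVE_CHANGE:
        findings.append(f"rate changed {change:.0%} versus the prior period")
    return findings or ["within limits"]

print(check_lab_metric(previous_rate=0.060, current_rate=0.082))
```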


Page 15

• Justification: Survey

• Varies by metric

• More commonly done for CTs than pathologists
  o 54%, identify outliers
  o 47%, user-defined action limits (arbitrary)
  o 37%, rate change
  o 22%, compare SD with historical mean (see the sketch below)
  o 18%, user-defined action limits based on literature

Insufficient data to specifically describe how to identify variance in individual quality metrics.
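The "compare SD with historical mean" approach listed above can be expressed roughly as below; the historical rates and the 2-SD review cutoff are assumptions for illustration, not a consensus rule.

```python
# Minimal sketch: compare an individual's current rate with their historical mean,
# expressed in standard deviations (one of the survey-listed approaches).
# The historical rates and the 2-SD review cutoff are assumptions for illustration.
from statistics import mean, stdev

def sd_from_history(history, current):
    mu, sigma = mean(history), stdev(history)
    return (current - mu) / sigma if sigma else 0.0

history = [0.050, 0.055, 0.048, 0.052, 0.057, 0.051]  # hypothetical past monthly rates
current = 0.068

deviation = sd_from_history(history, current)
print(f"Current rate is {deviation:+.1f} SD from the historical mean"
      + (" -> review" if abs(deviation) > 2.0 else ""))
```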


Page 16

• 85%, identify root cause, address the cause, and continue to monitor

• 42%, conduct in-lab re-education

• 33%, increase real time re-screen of NILM prior to sign out

• 32%, decrease slide work load

• 21%, retrospective re-screen of a defined number of previous NILM cases.

Common actions from survey to address variance in lab-wide performance


Page 17

• 68% identify cause of variance and conduct focused review or education

• 53% increase real time re-screen of NILM prior to sign out

• 46% counseling and continued monitoring (warning shot)

• 45% decrease work load limits

• 37% conduct in-house tutorial

• 31% conduct an audit of previous cases

Common actions from survey to address variance in individual performance


Page 18

Working Group 3

Topic 6

General Quality

Additional Voting Questions


Page 19

Voting: WG3 - General Quality

68. Low-volume methodologies should have a higher level of quality oversight/control.

A. Yes, screened by designated “experts” 11.11%

B. Yes, automatically re-screened 6.67%

C. Both A&B 20%

D. Yes, I agree with statement, but left to discretion of lab 53.33%

E. No, I do not agree with statement 8.89%


Page 20

• Justification: professional opinion and literature

• Need more data regarding practice patterns and their validity

• Consider use of slide set of unknowns to assess initial competency

• Rescreen a percentage of cases for a period of time (see the sketch below)
  o The percentage and time length depend upon the laboratory’s resources and patient population (ie, abnormal rate)
  o For labs with low volume or low abnormal rates, consider an additional test of verified unknowns or, if possible, use “spiked” cases

WG3-Statement: Newly hired primary screeners should be monitored, but best method(s) is unclear.
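For the rescreen and verified-unknowns approaches above, a simple concordance check against a reference set might look like the sketch below. The cases, interpretations, and the 90% threshold are hypothetical, not a consensus standard.

```python
# Minimal sketch: score a newly hired screener against a set of verified unknowns
# (or against a rescreened sample with known reference interpretations).
# The cases, interpretations, and 90% concordance threshold are hypothetical.

reference = {"case1": "NILM", "case2": "ASC-US", "case3": "LSIL", "case4": "NILM", "case5": "HSIL"}
new_hire  = {"case1": "NILM", "case2": "NILM",   "case3": "LSIL", "case4": "NILM", "case5": "HSIL"}

matches = sum(new_hire[case] == result for case, result in reference.items())
concordance = matches / len(reference)

print(f"Concordance with verified unknowns: {concordance:.0%}")
if concordance < 0.90:
    print("Below threshold: continue 100% rescreen or extend the monitoring period")
```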


Page 21

69. Newly hired primary screeners should be monitored, but best method(s) is unclear.

A. Strongly agree 37%
B. Agree 46.3%
C. Disagree 14.8%
D. Strongly disagree 1.8%

Voting: WG3 – General Quality

© 2011 College of American Pathologists. All rights reserved.
