Medical Device Innovation Consortium
FDA/Xavier Quality Measures Initiative
May 20, 2015
Agenda
• Goal of the Initiative
• Process Overview
• How the Process Worked
• Our Data – Our Proposal – Data Credibility
• Next Steps
How This Project Originated

FDA Case for Quality problem statement: Together, FDA and industry are unable to achieve optimal product quality using the current compliance approach, as evidenced by continuing compliance violations, repeat failures, and ineffective improvement.

Goal of the Device Quality Measures Initiative: To increase the assurance of product quality by collaboratively establishing a multidimensional system of predictive evaluation throughout the pre-production, production, and post-production phases, coordinated with increased visibility of comparative quality.
Device Quality Measures Project

Charge: to identify and/or develop predictive internal measures of product quality
– Submit to FDA and MDIC
– Include assessment across three phases of production
– Potential to yield aggregated metrics to help FDA allocate resources based on risk

Timeline: September 2014 – May 2015

Participation:
– Championed by FDA
– Facilitated by Xavier University
– Conducted by Work Group members from 15 firms
Proposal to MDIC

Formally adopt this project to:
– Establish metrics from the top measures
– Define the top metrics
– Pilot use of the top metrics
– Refine the Gold and Silver Competencies; link to AdvaMed
– Use all data to support other MDIC strategies:
  • Advanced Analytics
  • Increase Competency
  • Maturity Model
Team Members

Steering Committee:
– Kathie Bardwell, Steris
– Gina Brackett, FDA
– Kristin McNamara, FDA
– Steve Niedelman, King & Spalding
– Marla Phillips, Xavier University
– Monica Wilkins, Abbott

Pre-Production Team:
– Pat Baird, Baxter
– Rafael Bonilla, ScottCare
– Tom Haueter, Clinical Innovations
– Pete Palermo, CR Bard
– Susan Rolih, Meridian Bioscience
– Joe Sapiente, Covidien
– Gin Schulz, CR Bard
– Isabel Tejero, FDA

Production Team:
– Anupam Bedi, AtriCure
– Kate Cadorette, Steris
– Kara Carter, Abbott
– Greg Jones, BSI
– Bryan Knecht, AtriCure
– Brian Motter, J&J MD&D
– Rhonda Mecl, FDA

Post-Production Team:
– Paul Andreassi, Fisher & Paykel
– Karen Archdeacon, FDA
– Patrick Caines, Baxter
– Joanna Engelke, Boston Scientific
– Jeff Ireland, Medtronic
– Scott Nichols, FDA
– Luann Pendy, Medtronic
Process Overview
Not a “Call for Metrics”
• Not everything that can be measured is meaningful
• Need to ensure the ideas lead to ways to measure product quality, which is the target
Methodology and Timeline

1. Critical Systems (9/9 – 12/12): What systems are critical to ensuring product quality? → 11 Critical Systems
2. Gold and Silver Competencies (12/12 – 1/23): What activities BEYOND COMPLIANCE in the Critical Systems help ensure product quality? → 97 competencies
3. Ways to Measure Competencies (survey; finalize list, 2/20 – 3/11): What measures demonstrate the effectiveness of this competency? → 208 measures
4. Cause and Effect Matrix (de-dupe list, 3/16 – 4/7): Rank-order the ability of the measures to indicate impact to product quality → 124 measures
Methodology and Timeline (continued)

5. Assessment (4/7 – 4/16): All members assess the outcome of the Cause and Effect Matrix; prep for the Xavier meeting
6. Face-to-Face Meeting (5/5): Meeting at Xavier to agree upon the proposed measures (MDIC meeting)
7. FDA/MDIC (6/15 – 12/31): FDA to advance the measures through MDIC (work group + stakeholder forum)
Methodology and Timeline (recap; now at Stage 1: Critical Systems)
11 Critical Systems

We identified 11 Critical Systems to ensure there were no gaps in our thinking and that each system was assessed for predictive measures:

1. CAPA
2. Change Control
3. Complaint Handling
4. Customer-Related/VOC
5. Design Controls
6. Distribution
7. Management Controls
8. Post-Launch Surveillance
9. Production and Process Controls
10. Servicing
11. Supplier Controls
Critical System Definitions

Each Critical System was defined so that ideas were captured consistently. For example:

Production and Process Controls:
• Validation (appropriate IQ/OQ/PQ)
• Equipment (maintenance, calibration, qualification)
• Process capability
• Product testing
• Acceptance criteria
• Document control
• Identification of critical processes
• Scrap alerts and action items
• Yield criteria
• Storage
• Device History Records
• Investigation of all non-conformances/failures

CAPA:
• Risk assessment
• Acceptance criteria and effectiveness checks
• Evaluation of impact: regulatory, design, validation, packaging, etc.
• Analysis of all quality data
• Trending
• Investigations
• Communication
• Identification of data sources
• Threshold/trigger limits for action once product is launched
• Emphasis on preventive actions

Management Controls:
• Training, audits, reviews
• Resource management and employee development
• Quality culture improvement
• Risk assessment
• Acceptance criteria
• Remediation activities
• Organizational and management structure
• Effectiveness of the Quality Management System
• Quality plans
• Reviews, including who participates and at what level of detail
• Quality policy
Methodology and Timeline (recap; now at Stage 2: Gold and Silver Competencies)
Gold and Silver Competencies

Silver Competencies:
– Competencies that go beyond compliance (reactive) and demonstrate what would be considered good, solid practices (proactive)
– A stepping stone between compliance and Gold
– Enhanced use of preventive actions to ensure sustainable quality systems

Gold Competencies:
– Competencies that go beyond Silver and demonstrate what would be considered best practices to proactively prevent failure
– Could include shifts in current approaches (out of the box)
– Should include an intensive, granular understanding of what is driving the results being measured
– Systems and people actively identify previously undetected issues before failure (finger on the pulse)
Pre-Production Competency Example
Critical System: Supplier Controls
Exchange of Quality Analytics with Suppliers through a partnership of true continuous improvement.
– Manufacturer receives and reviews in-process data from key suppliers.
– Engagement beyond the scorecard report.
Production Competency Example
Critical System: Change Controls
A quality design should require minimal changes. Changes should be enhancements or due to technological advancement.
– Design team should identify issues during testing and remediate before design transfer.
Post-Production Competency Example
Critical System: Complaint Handling
Use of a mobile app to guide the Sales team in providing complete complaint information.
– Ensures the right questions are asked so that complete information is received the first time the complaint is submitted.
– Increases the completeness of the evaluation and investigation, speeds both up, and enables quicker awareness of emerging trends in product performance.
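Illustrating the idea: a minimal sketch of the kind of required-field check such an app could enforce before a complaint is submitted. The record fields here are hypothetical, not from the initiative:

```python
from dataclasses import dataclass, fields

# Hypothetical minimal complaint record the app would require before submission.
@dataclass
class ComplaintIntake:
    product_id: str
    lot_number: str
    event_description: str
    patient_involved: bool
    device_available_for_return: bool

def missing_fields(data: dict) -> list[str]:
    """Names of required fields the sales rep has not yet filled in."""
    return [f.name for f in fields(ComplaintIntake) if data.get(f.name) in (None, "")]

draft = {"product_id": "ABC-123", "event_description": "alarm failure"}
print(missing_fields(draft))
# -> ['lot_number', 'patient_involved', 'device_available_for_return']
```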
Methodology and Timeline (recap; now at Stage 3: Ways to Measure Competencies)
Gold and Silver Survey Demographics

Respondents: 208 answered at least one question
– 120 filled in their demographics (respondents had the option to remain anonymous)
– Average years of experience: 24.7
– Titles:
  • Vice Presidents: 18
  • Directors: 62
  • Managers: 30
  • FDA Officials: 4
  • All Other: 6

Companies: 13 participated
– Abbott, Baxter, Boston Scientific, Carefusion, Clinical Innovations, CR Bard, Davol, Fisher & Paykel, Hospira, Medtronic, Meridian Bioscience, Philips, and STERIS
Survey Results – Examples

Pre-Production:
– Evidence that customer input (including that of internal customers) is actively solicited, assessed, and documented at all stages of the project and as the design evolves

Production:
– No product failures due to inadequate design evaluations or knowledge; a low percentage of NC, OOS, and complaint reports, and no recalls due to inadequate understanding of product design

Post-Production:
– CAPA effectiveness across all systems
Methodology and Timeline (recap; now at Stage 4: Cause and Effect Matrix)
The C&E Method – How It Is Used

Each measure is scored against five Critical Requirements. Each Critical Requirement carries an importance rating relative to the Vision statement: Patient Safety (5), Design Robustness (4), Process Reliability (3), Quality System Robustness (2), Failure Costs (1). Correlation of a measure to each Critical Requirement is rated on the scale 0 = Highly Improbable, 3 = Slightly Improbable, 6 = Slightly Probable, 9 = Highly Probable. Examples:

| Measure | Critical System | Patient Safety (5) | Design Robustness (4) | Process Reliability (3) | QS Robustness (2) | Failure Costs (1) | Total |
| Evidence that customer input (including that of internal customers) is actively solicited, assessed and documented at all stages of the project and as the design evolves | VOC | 9 | 6 | 3 | 0 | 0 | 78 |
| Post-launch VOC information considered at the start of new projects | VOC | 9 | 6 | 3 | 0 | 0 | 78 |
| Evidence that design projects capture the voice of a targeted, diverse group of stakeholders | VOC | 9 | 6 | 3 | 0 | 0 | 78 |
| Evidence of analysis of issues, root causes and CAPAs across related projects, with evidence of actions brought forward to new projects where trends are observed | CAPA | 3 | 9 | 3 | 3 | 3 | 69 |
| Evidence that manufacturability requirements are included in design inputs | PPC | 3 | 6 | 3 | 3 | 6 | 60 |
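The Total column is the standard weighted C&E score: each correlation rating is multiplied by the importance of its Critical Requirement and the products are summed. A minimal Python sketch of that arithmetic, using the first table row:

```python
# Importance weights of the five Critical Requirements (from the slide).
WEIGHTS = {
    "Patient Safety": 5,
    "Design Robustness": 4,
    "Process Reliability": 3,
    "Quality System Robustness": 2,
    "Failure Costs": 1,
}

def ce_total(ratings: dict[str, int]) -> int:
    """Weighted C&E score: sum of importance x correlation (0/3/6/9)."""
    return sum(WEIGHTS[req] * rating for req, rating in ratings.items())

# First row of the matrix: the customer-input measure, linked to VOC.
customer_input = {
    "Patient Safety": 9,
    "Design Robustness": 6,
    "Process Reliability": 3,
    "Quality System Robustness": 0,
    "Failure Costs": 0,
}

print(ce_total(customer_input))  # -> 78, matching the Total column
```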
Data Analysis

Cause & Effect Matrix data were analyzed by:
– Median score of rank order: the lowest score represents the measure respondents felt was closest to the #1 measure
– Average correlation score: the highest score represents the strongest correlation of a measure to the Critical Customer Requirements
– Each Critical Customer Requirement individually:
  • Patient Safety
  • Design Robustness
  • Process Reliability
  • Quality System Robustness
  • Failure Costs
– Any combination of two or more of the Critical Customer Requirements
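A minimal sketch of the first two statistics, assuming each respondent supplied a rank and a correlation score per measure. The data structure and values here are illustrative:

```python
from statistics import median, mean

# Hypothetical responses: for each measure, the rank each member assigned
# and that member's correlation scores against the Critical Requirements.
ranks = {"CAPA effectiveness across all systems": [1, 1, 2, 1, 3]}
correlations = {"CAPA effectiveness across all systems": [9, 6, 9, 6, 9]}

for measure in ranks:
    med_rank = median(ranks[measure])       # lower = closer to the #1 measure
    avg_corr = mean(correlations[measure])  # higher = stronger correlation
    print(f"{measure}: median rank {med_rank}, avg correlation {avg_corr:.1f}")
```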
Sorted by Lowest Median Rank

| Measure | Critical System | Min Rank | Median Rank | TPLC Phase |
| CAPA effectiveness across all systems | All | 1 | 1.0 | Post-Production |
| Evidence that customer input (including that of internal customers) is actively solicited, assessed and documented at all stages of the project and as the design evolves | VOC | 1 | 2.0 | Pre-Production |
| Post-launch VOC information considered at the start of new projects | VOC | 1 | 3.0 | Pre-Production |
| % of new projects utilizing human factors as inputs | Post-Launch Surveillance | 2 | 3.0 | Post-Production |
| No product failures due to inadequate design evaluations or knowledge; low percentage of NC, OOS, and complaint reports; no recalls due to inadequate understanding of product design | Change Controls | 2 | 5.0 | Production |
Sorted by Highest Average Rank

| Measure | Critical System | Total | TPLC Phase |
| CAPA effectiveness across all systems | All | 7.5 | Post-Production |
| Service data (repair rates/MTBF, top problems found, top parts used) collected and analyzed in conjunction with other sources of product quality data, resulting in better decision making around design improvements and CAPA activities | Servicing | 6.9 | Post-Production |
| Number of field corrections post launch; number of design changes needed post launch | Post-Launch Surveillance | 6.6 | Post-Production |
| Evidence that customer input (including that of internal customers) is actively solicited, assessed and documented at all stages of the project and as the design evolves | VOC | 6.5 | Pre-Production |
| Establishing a performance-based CtQ component lifecycle program that includes active monitoring of component performance and reliability (e.g., replacement rates, out-of-box failures) | Servicing | 6.5 | Post-Production |
Patient Safety Ranking

| Measure | Critical System | Patient Safety | TPLC Phase |
| Evidence that design projects capture the voice of a targeted, diverse group of stakeholders | VOC | 8.4 | Pre-Production |
| Evidence that customer input (including that of internal customers) is actively solicited, assessed and documented at all stages of the project and as the design evolves | VOC | 8.3 | Pre-Production |
| Post-launch VOC information considered at the start of new projects | VOC | 8.0 | Pre-Production |
| Minimal number of recalls triggered by design changes | Change Controls | 7.8 | Production |
| CAPA effectiveness across all systems | All | 7.7 | Post-Production |
Design Robustness Ranking

| Measure | Critical System | Design Robustness | TPLC Phase |
| % of new projects utilizing human factors as inputs | Post-Launch Surveillance | 8.3 | Post-Production |
| Evidence that customer input (including that of internal customers) is actively solicited, assessed and documented at all stages of the project and as the design evolves | VOC | 8.0 | Pre-Production |
| Number of field corrections post launch; number of design changes needed post launch | Post-Launch Surveillance | 7.8 | Post-Production |
| Evidence that design projects capture the voice of a targeted, diverse group of stakeholders | VOC | 7.6 | Pre-Production |
| CAPA effectiveness across all systems | All | 7.3 | Post-Production |
Process Reliability Ranking

| Measure | Critical System | Process Reliability | TPLC Phase |
| (a) % of products or product families with component-level CtQs established; (b) implement SPC or another early-detection mechanism for CtQs at the supplier, with the manufacturer consulted when an alert level is triggered | Supplier Controls | 8.5 | Post-Production |
| Cpk of all appropriate systems | Management Controls | 8.0 | Post-Production |
| # of rejects/scrap cost | Management Controls | 7.7 | Post-Production |
| Failures during verification/validation | Change Controls | 7.5 | Post-Production |
| Establishing a performance-based CtQ component lifecycle program that includes active monitoring of component performance and reliability (e.g., replacement rates, out-of-box failures) | Servicing | 7.5 | Post-Production |
Quality System Robustness Ranking

| Measure | Critical System | QS Robustness | TPLC Phase |
| CAPA effectiveness across all systems | All | 8.3 | Post-Production |
| Service data (repair rates/MTBF, top problems found, top parts used) collected and analyzed in conjunction with other sources of product quality data, resulting in better decision making around design improvements and CAPA activities | Servicing | 8.0 | Post-Production |
| Root cause analysis identifies the cause at both the product/process level and the system level; effectiveness checks verify that the cause has been eliminated; in the aggregate, this discipline plus measuring "% of CAPAs that failed effectiveness check" | CAPA | 7.8 | Post-Production |
| Failures during verification/validation | Change Controls | 7.8 | Post-Production |
| % of audits with repeat findings | CAPA | 7.8 | Post-Production |
Failure Costs Ranking

| Measure | Critical System | Failure Costs | TPLC Phase |
| Service data (repair rates/MTBF, top problems found, top parts used) collected and analyzed in conjunction with other sources of product quality data, resulting in better decision making around design improvements and CAPA activities | Servicing | 8.5 | Post-Production |
| CAPA effectiveness across all systems | All | 8.0 | Post-Production |
| Establishing a performance-based CtQ component lifecycle program that includes active monitoring of component performance and reliability (e.g., replacement rates, out-of-box failures) | Servicing | 8.0 | Post-Production |
| Cpk of all appropriate systems | Management Controls | 7.7 | Post-Production |
| Number of field corrections post launch; number of design changes needed post launch | Post-Launch Surveillance | 7.7 | Post-Production |
Sorted by: Quality then Safety

| Measure | Critical System | Patient Safety | Design Robustness | Process Reliability | QS Robustness | Failure Costs | TPLC Phase |
| CAPA effectiveness across all systems | All | 7.7 | 7.3 | 6.0 | 8.3 | 8.0 | Post-Production |
| Service data (repair rates/MTBF, top problems found, top parts used) collected and analyzed in conjunction with other sources of product quality data, resulting in better decision making around design improvements and CAPA activities | Servicing | 5.5 | 6.5 | 6.0 | 8.0 | 8.5 | Post-Production |
| Root cause analysis identifies the cause at both the product/process level and the system level; effectiveness checks verify that the cause has been eliminated | CAPA | 6.2 | 6.0 | 6.5 | 7.8 | 4.3 | Post-Production |
| Failures during verification/validation | Change Controls | 5.5 | 5.5 | 7.5 | 7.8 | 4.8 | Post-Production |
| % of audits with repeat findings | CAPA | 4.0 | 3.8 | 5.8 | 7.8 | 6.3 | Post-Production |
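The ordering in this table ("Quality then Safety") is a two-key descending sort: Quality System Robustness first, with Patient Safety as the tie-breaker among the 7.8 scores. A minimal Python sketch of that sort over the rows above:

```python
# (measure, QS Robustness score, Patient Safety score) from the table above.
rows = [
    ("CAPA effectiveness across all systems", 8.3, 7.7),
    ("Service data (repair rates/MTBF, ...)", 8.0, 5.5),
    ("Root cause analysis at product/process and system level", 7.8, 6.2),
    ("Failures during verification/validation", 7.8, 5.5),
    ("% of audits with repeat findings", 7.8, 4.0),
]

# Sort descending by QS Robustness, then by Patient Safety as tie-breaker.
rows.sort(key=lambda r: (r[1], r[2]), reverse=True)
for measure, qsr, safety in rows:
    print(f"{qsr:>4}  {safety:>4}  {measure}")
```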
Pre-Production Measures

1. Identification of CtQ elements includes the analysis of process and product quality data (e.g., user needs, issues, root causes, corrections) that will be assessed throughout the project cycle to eliminate, reduce, and prevent repeated design mistakes or transfer failures. This should occur both within a single project and across multiple projects.
2. Evidence that manufacturing and/or serviceability requirements are defined early in the project and assessed at each subsequent design stage (including application to prototypes) to eliminate, reduce, and prevent repeated design mistakes or transfer failures.
3. Evidence that customer input, both internal and external, is actively solicited, documented, and assessed at all stages of the design project. This includes early feedback from a controlled market release (pilot) and post-launch VOC data from related products.
4. Evidence that critical suppliers or service providers are engaged early in the design process in order to develop appropriate design inputs, manufacturing specifications, and oversight activities.
Production Measures

1. CAPA sustainability = % of CAPAs with no recurrence (see the sketch after item 8 below). Establish a lower limit at which escalation is required. Must extend beyond verification of effectiveness, i.e., long-term sustainability; must also apply to all CAPA feeder systems, including NCR, CC, and INV.
2. Analysis of CAPA (INV, NCR, CC) rate by source/cause, by product, over time. Companies are trending by root cause, but perhaps not by root cause per product. Folds into the theme of meta-analysis.
3. Analysis of product/process issues related to human resources (including training and human factors): not just tagging everything as poor training, but understanding the root/source of issues.
4. Tracking and trending of first-run/first-pass yield data (right first time, i.e., built with no non-conformance, rework, or failed inspections).
Production Measures (cont’d)

5. Analysis of preventive maintenance (production equipment, not servicing) vs. repairs.
6. Analysis of complaints that result from defects caused by internal/external manufacturing activities.
7. Improved supplier incoming inspection results (supplier first-pass yield; this can apply to inspections performed both by the supplier and by the receiving organization).
8. Number of product/process changes post design transfer (changes to product specifications that alter the form, fit, or function of the product) that are the end result of CAPA (NCR, INV, CC).
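Items 1 and 4 of the Production Measures are simple percentages; a minimal sketch of both computations, with hypothetical record sets:

```python
# Hypothetical CAPA records: True if the issue recurred after closure.
capa_recurred = [False, False, True, False, False, False, True, False]

# Item 1: CAPA sustainability = % of CAPAs with no recurrence.
sustainability = 100 * capa_recurred.count(False) / len(capa_recurred)

# Hypothetical production units: True if built right first time
# (no non-conformance, rework, or failed inspection).
right_first_time = [True] * 940 + [False] * 60

# Item 4: first-pass yield.
fpy = 100 * right_first_time.count(True) / len(right_first_time)

print(f"CAPA sustainability: {sustainability:.1f}%")  # 75.0%
print(f"First-pass yield:    {fpy:.1f}%")             # 94.0%
```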
Post-Production Measures

1. Identification of CtQ components and monitoring of them throughout the product lifecycle (a continuation of the pre-production CtQ work: ongoing monitoring/analysis from all data sources).
2. Trending/analyzing of product and process changes post launch (risk, timeliness, frequency, QMS source) and ensuring they are addressed by appropriate and effective changes. Intended to identify needed improvements in the product and/or QMS. Align across the TPLC.
3. Trending/analyzing of all post-market surveillance data (e.g., complaints, FCAs, PHOs, HHEs) for overall QMS performance, feeding into a holistic QMS scorecard. Intended to identify needed improvements in the product and/or QMS (e.g., human factors, CtQ control).
Overarching/Corporate Measures

1. Monitor the holistic health of the overall QMS by viewing conglomerated outcomes from all QMS subsystems (e.g., a heat map that elicits questions such as "why is that there?"): cumulative measures (Big Data).
2. Defined targets for improving product quality (e.g., frequency, severity).
[Figure: "Risk Assessment: Division A" – a scatter plot of Sites 1–7, with Risk Level (0–100) on the x-axis and Risk Preparedness (0–100) on the y-axis, colored into high-risk, moderate-risk, and low-risk zones based on the risk levels typical for the industry.]
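A minimal matplotlib sketch of how such a site-risk chart could be reconstructed; the site coordinates and zone cut-offs below are purely illustrative, not the initiative's data:

```python
import matplotlib.pyplot as plt

# Hypothetical (risk level, risk preparedness) scores for seven sites.
sites = {
    "Site 1": (15, 85), "Site 2": (35, 60), "Site 3": (50, 45),
    "Site 4": (55, 65), "Site 5": (70, 80), "Site 6": (75, 55),
    "Site 7": (85, 90),
}

fig, ax = plt.subplots()
# Background bands for the risk zones (illustrative cut-offs).
ax.axvspan(0, 33, color="green", alpha=0.15, label="Low risk")
ax.axvspan(33, 66, color="yellow", alpha=0.15, label="Moderate risk")
ax.axvspan(66, 100, color="red", alpha=0.15, label="High risk")

for name, (risk, prep) in sites.items():
    ax.scatter(risk, prep)
    ax.annotate(name, (risk, prep))

ax.set_xlabel("Risk Level")
ax.set_ylabel("Risk Preparedness")
ax.set_title("Risk Assessment: Division A")
ax.set_xlim(0, 100)
ax.set_ylim(0, 100)
ax.legend()
plt.show()
```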
[Diagram: "Quality Measures" across the total product lifecycle – Design Controls feed Technology/Knowledge Transfer, which feeds Production and then Post-Market Surveillance, with Continuous Improvement & Risk Mgmt. loops at the R&D, Production, and Corporate levels, all supporting Quality Implementation.]
Xavier/PwC Pharma Metrics

Production Metrics:
1. Started with a "Call for Metrics": 236 metrics
   • Pre-Production: 37
   • Production: 120
   • Post-Production: 79
2. Mapped the metrics to the Critical Systems
3. Worked to de-dupe and remove overlap
4. Ran a Cause & Effect Matrix
5. Used the data to "vote" on key metrics and assess the overall information
6. June: will finalize the proposal

Cultural Metrics:
1. Used the SchellingPoint Alignment Optimization (AO) process
2. Gathered 110 opinions on what evidence would be needed to know a company has a strong culture of quality
3. Found strong alignment and sentiment around 22 opinions
4. Categorized each opinion by theme
5. Created an influence map of the themes to understand overarching opinions
6. June: will convert the top opinions into metrics
Xavier QARA Graduates

1. Mind-mapped Gold Competencies to ensure product quality
2. Mind-mapped the top Gold Competencies to determine how to identify and/or develop predictive internal measures of product quality
3. Put the top measures into a C&E matrix and ranked them
Comparison

Three very different groups of participants:
– Years of experience
– Level of employee
– Different industries
– FDA officials included in the device group only

Three very different methodologies:
– Device: started with critical systems
– Pharma: started with a "call for metrics"
– Xavier Graduates: started with mind mapping

Results? Convergence = verification
Comparison

Overlap of Device and Pharma:
1. CAPA Effectiveness/Sustainability
2. Design Robustness
3. Big Data from Post-Market
4. Right First Time
5. Focus on Suppliers

Overlap of Device and Xavier Graduates:
1. CAPA Effectiveness
2. Changes related to innovation
3. Follow-through Score

Overlap of Pharma and Xavier Graduates:
1. CAPA Effectiveness
2. Impact of Human Resources
Convert Measures to Metrics
Companies need to assure quality at the source by designing quality in.

Clarify each term of the statement:
– Companies → Product Owner
– Need → 100% success
– Assure → Tested and verified
– Quality → Consistently safe and effective
– Source → During product and process development
– Designing quality in → Studies to demonstrate critical inputs and outputs (CQA, CPP)

Formulate a description from the clarification above:
"Quality and R&D need to test and verify critical inputs and outputs during product and process development in order to demonstrate safety and efficacy for every product."
Define Metrics (Template Example)

Starting description: "Quality and R&D need to test and verify critical inputs and outputs during product and process development in order to demonstrate safety and efficacy for every product."

– Identify critical components that need to be measured → analytical and process tests, with criteria for each test
– Determine the appropriate measurement → Test: (# of tests passed / # of tests executed) x 100. Verify: # of changes, complaints, and product/process failures attributed to poor development
– Target value and reporting frequency → Test: 100%, per development review stage. Verify: 0 changes, complaints, and product/process failures due to poor development
– Data owner → Test: R&D. Verify: Quality
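A minimal sketch of the template's "Test" metric, assuming simple pass/fail counts per development review stage:

```python
def test_metric(tests_passed: int, tests_executed: int) -> float:
    """Template 'Test' metric: (# of tests passed / # of tests executed) x 100."""
    if tests_executed == 0:
        raise ValueError("no tests executed for this review stage")
    return 100 * tests_passed / tests_executed

# Reported per development review stage; the target value is 100%.
score = test_metric(tests_passed=47, tests_executed=50)
print(f"{score:.1f}% (target: 100%)")  # 94.0%
```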
Proposal to MDIC

Formally adopt this project to:
– Establish metrics from the top measures
– Define the top metrics
– Pilot use of the top metrics
– Refine the Gold and Silver Competencies; link to AdvaMed
– Use all data to support other MDIC strategies:
  • Advanced Analytics
  • Increase Competency
  • Maturity Model