Gathering and Leveraging Quality-Centric Metadata in Clinical Trials
Aaron Gadberry, Director, Software Architecture



DESCRIPTION

Improving the quality of metadata in the clinical trials process can have profound effects on costs and efficiency. Gathering and leveraging this quality-centric data can optimize your clinical trials.


  • 1. Gathering and Leveraging Quality-Centric Metadata in Clinical Trials
      – Aaron Gadberry, Director, Software Architecture
  • 2. Increasing Initial Quality
      – Quality data on the first pass
      – Problem personnel
      – Historic performance
      – Dynamic Risk Based Monitoring
      – Cost model adjustments
  • 3. Lower Quality = Higher Cost
      – Direct cost: more cleaning
      – Indirect cost: treatments could be less effective or less safe
      – [Diagram: Data Entry → Data Cleaning ($$$) → Clean Data ($$$)]
  • 4. Responsibility?
      – Who is managing site quality today?
      – Who is reporting on entry personnel?
      – Who is holding them accountable? CRAs? Site audits?
      – What are the key metrics?
  • 5. What Metrics Do We See?
      – Completeness: page report
      – Time to entry, time to SDV, time to sign, time to resolve queries
      – Quality: query counts
  • 6. What Metrics Don't We See?
      – Was the initial data accurate?
      – If not, was intervention required? If so, how many interventions?
      – What trends are visible from this?
      – How often is initial data accurate? How often is a change self-initiated?
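These "unseen" metrics are derivable from an EDC audit trail once each edit is tied back to who made it and why. The sketch below is a minimal illustration, not an actual EDC export format: the record layout, the reason codes (initial_entry, query_response, self_correction), and the assumption that the last value in the trail is the clean value are all invented for the example.

```python
# Minimal sketch: derive the "unseen" quality metrics from a hypothetical
# flattened audit trail (one row per edit to one field).
from collections import defaultdict

audit_trail = [
    # (subject, field, sequence, value, reason) -- invented records
    ("S001", "WEIGHT", 1, "72",  "initial_entry"),
    ("S001", "WEIGHT", 2, "78",  "query_response"),   # change required a query
    ("S001", "HEIGHT", 1, "180", "initial_entry"),    # never changed: accurate
    ("S002", "WEIGHT", 1, "65",  "initial_entry"),
    ("S002", "WEIGHT", 2, "66",  "self_correction"),  # site fixed it unprompted
]

# Group edits per (subject, field) and order them by sequence number.
fields = defaultdict(list)
for subject, field, seq, value, reason in audit_trail:
    fields[(subject, field)].append((seq, value, reason))

accurate = interventions = self_initiated = 0
for edits in fields.values():
    edits.sort()
    first_value, final_value = edits[0][1], edits[-1][1]
    if first_value == final_value:
        accurate += 1                      # initial entry survived cleaning
    interventions += sum(1 for _, _, r in edits if r == "query_response")
    self_initiated += sum(1 for _, _, r in edits if r == "self_correction")

total = len(fields)
print(f"initial accuracy: {accurate}/{total} = {accurate / total:.0%}")
print(f"query-driven interventions: {interventions}")
print(f"self-initiated changes: {self_initiated}")
```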
  • 7. Difficult to Collect and Report On
      – Applicable data takes time to gather
      – Hindered by turnover at sites
      – Not preserved for the next trial
      – Visible as points, but not aggregated
      – We need to predict quality to realize the full benefits of Risk Based Monitoring
  • 8. We Have a Wealth of History
      – Leverage historic quality data during site selection (CTMS / cloud)
      – Show quality over time, cross-trial and cross-sponsor
      – Rapid visibility: hotspot reports over users
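One way to read "hotspot reports over users" is a cross-trial rollup per data-entry user. The sketch below assumes a made-up entry log; the column names and the quality measure (share of fields changed after first entry) are illustrative, not an actual CTMS or EDC schema.

```python
# Minimal sketch of a cross-trial "hotspot" rollup: per data-entry user,
# pool their records from several trials and rank by how often initially
# entered values were later changed.
import pandas as pd

entries = pd.DataFrame([
    # hypothetical cross-trial entry log
    {"trial": "T01", "user": "site12.coordA", "fields_entered": 400, "fields_changed": 12},
    {"trial": "T02", "user": "site12.coordA", "fields_entered": 350, "fields_changed": 10},
    {"trial": "T01", "user": "site07.coordB", "fields_entered": 500, "fields_changed": 95},
    {"trial": "T03", "user": "site07.coordB", "fields_entered": 300, "fields_changed": 60},
])

hotspots = (
    entries.groupby("user")[["fields_entered", "fields_changed"]].sum()
    .assign(change_rate=lambda d: d.fields_changed / d.fields_entered)
    .sort_values("change_rate", ascending=False)   # worst performers first
)
print(hotspots)
```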
  • 9. Number of Users by Data Quality (chart)
  • 10. Volume of Data by Data Quality (chart)
  • 11. With Quality Increased by 3% (chart)
  • 12. With Quality Increased by 5% (chart)
  • 13. With Quality Increased by 10% (chart)
  • 14. Major Results from Minor Changes
      – Increasing user initial quality by 3% reduces data changes by 31%
      – Increasing it by 5% reduces data changes by 48%
      – Increasing it by 10% reduces data changes by 79%
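The leverage comes from the fact that post-entry changes are concentrated in the small fraction of entries that are wrong: if a user's initial quality is 90%, lifting it to 93% removes roughly a third of that user's changes. The toy calculation below only illustrates that mechanism; the distribution of entry volume by quality band is invented, so its percentages will not match the chart figures above.

```python
# Toy illustration (not the model behind the charts above): a small rise in
# initial quality cuts a large share of post-entry data changes, because
# changes come only from the error fraction. Volumes per quality band are invented.
volumes_by_quality = {          # hypothetical: fields entered per initial-quality band
    0.85: 10_000,
    0.90: 40_000,
    0.95: 30_000,
    0.99: 20_000,
}

def expected_changes(uplift: float) -> float:
    """Expected post-entry changes if every user's initial quality rises by `uplift`."""
    return sum(vol * (1.0 - min(q + uplift, 1.0))
               for q, vol in volumes_by_quality.items())

baseline = expected_changes(0.0)
for uplift in (0.03, 0.05, 0.10):
    reduced = expected_changes(uplift)
    print(f"+{uplift:.0%} initial quality -> {1 - reduced / baseline:.0%} fewer data changes")
```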
  • 15. Application of the Tools
      – Measuring makes us aware, but we can't really solve the problem
      – Only the site can truly manage; the personnel are theirs
      – Why isn't this happening today? Lack of measurement, lack of incentive
  • 16. We Can Change These
      – Measurement: new metrics
      – Incentive: contract for quality, for sites and ClinOps
      – Similar metrics for CRAs
  • 17. The Value of Quality
      – Performance and consistency gain value beyond repeat business
      – Additional income, exposure, recruitment
      – A market emerges for quality-centric sites and personnel
  • 18. The Market Will Adapt
      – We will pay more for quality data, but the overall cost is lower and the data is more accurate
      – Confirm and adjust
      – Dynamic Risk Based Monitoring: risk is user-based
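The deck does not spell out how user-based risk would drive monitoring; one plausible sketch is to map each user's predicted initial accuracy to a source data verification (SDV) sampling rate and update the estimate as newly verified data comes in ("confirm and adjust"). The thresholds, rates, and blending rule below are invented for illustration.

```python
# Minimal sketch of user-based risk feeding dynamic monitoring: the SDV
# sampling fraction per data-entry user starts from their historic initial
# accuracy and is adjusted as fresh verification confirms or contradicts it.
def sdv_fraction(initial_accuracy: float) -> float:
    """Map predicted initial accuracy to a source-data-verification rate (invented thresholds)."""
    if initial_accuracy >= 0.98:
        return 0.10   # spot-check high-quality users
    if initial_accuracy >= 0.90:
        return 0.40
    return 1.00       # fully verify high-risk users

def updated_accuracy(prior: float, verified: int, errors: int, weight: int = 200) -> float:
    """Blend the historic estimate with freshly verified fields (simple weighted mean)."""
    observed = (verified - errors) / verified
    return (prior * weight + observed * verified) / (weight + verified)

prior = 0.975                       # historic, cross-trial estimate for one user
rate = sdv_fraction(prior)          # 0.40: partial verification this monitoring cycle
prior = updated_accuracy(prior, verified=150, errors=0)    # cycle confirms good quality
print(f"next-cycle SDV rate: {sdv_fraction(prior):.0%}")   # prints 10%
```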
  • 19. A Quality-Centric Model
      – The expectation to manage quality is set
      – Quality directly influences payment
      – Performance impacts future business
      – Sites are motivated and compensated
  • 20. from Concept to Cure with DATATRAK ONE
      – DATATRAK International: Cleveland, Ohio; Bryan, Texas; Cary, North Carolina
      – 888.677.DATA (3282) toll free
      – www.datatrak.com