
Investing in CQI

Implementation Issues to Consider
Kimberly Gentry Sperber, Ph.D.

Objectives of CQI

• To facilitate the Agency’s mission
• To ensure appropriateness of services
• To improve efficiency of services/processes

• To improve effectiveness of directing services to client needs

• To foster a culture of learning
• To ensure compliance with funding and regulatory standards

Building a CQI Process

• Formal infrastructure
• Core Elements

– Documentation Review
– Indicators

• Process Versus Outcome
• Performance Goals
• Action Planning

– Customer Satisfaction
• Clients, Staff, Stakeholders

– Program Evaluation

Creating Infrastructure

• Dedicated position
• Use of committees
• Written CQI plan
• Designated process requirements

• Inclusion in strategic plan
• Positioning within agency
• Role of Board of Trustees

Creating a CQI Infrastructure

Committee structure (organizational chart):
• Executive CQI Committee
• Risk Management Committee
• Safety Committee
• Human Subjects Committee
• Diversity Committee
• Corporate Compliance Committee
• Cluster CQI Committees
• Program Peer Review Committees
• Morbidity & Mortality Conference

Written Plan

• Vision/purpose
– Objectives

• Definitions
• Authority to ensure compliance
• Compliance procedures/definitions
• Documentation of process
• Peer Review
• Committees

– Membership
– Objectives

• Satisfaction
– Clients
– Employees
– External stakeholders

• Choosing indicators
• Use of data

Remaining Infrastructure

• Inclusion in strategic plan
• Positioning within agency

– Marriage of clinical and quality

• Role of Board of Trustees
– Annual approval of CQI plan
– Quarterly reports on indicator performance

Why Examine Documentation?

• Clinical Implications
– Documentation is not separate from service delivery.
– Did the client receive the services he/she needed?

• Operational Implications
– Good documentation should drive decision-making.
– Means of communication

• Risk Management Implications
– If it isn’t documented, it didn’t happen.
– Permanent record of what occurred in the facility

• Source of Staff Training
• Reflection of the provider’s and organization’s competency:
– EBP
– Outcome of care

Peer Review Committees

• Requires standardized, objective method for assessing charts.

• Random selection of charts and monthly reviews

• Goal is to identify trends and brainstorm solutions

• Peer reviewers serve as the front line for corporate compliance, risk management, and quality documentation

Peer Review Measures

• Completeness of Records checks
– Assessment is present and complete.
– Service plan present and complete.
– Consent for Treatment present and signed.

• Quality Issues
– Services based on assessed needs.
– Progress notes reflect implementation of service plan.

– Documentation shows client actively participated in creation of service plan.

– Progress notes reflect client progress.

Peer Review Process

• Identification of review elements
– Creation of standardized checklist (sketched below)

• Assigning staff responsibilities
– Workload analysis

• Creating process for selecting files for review

• Determining review rotation
• Reporting and use of data
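The chart-sampling and checklist steps above can be prototyped with very little tooling. The sketch below is illustrative only, assuming hypothetical checklist item names and case IDs rather than any agency’s actual peer review form; it shows one way to pull a random monthly sample of charts and generate a blank standardized audit sheet for each.

```python
import random

# Hypothetical completeness-of-record checklist, drawn from the peer review
# measures above; item names are illustrative, not the agency's actual form.
CHECKLIST = [
    "assessment_present_and_complete",
    "service_plan_present_and_complete",
    "consent_for_treatment_signed",
    "services_based_on_assessed_needs",
    "progress_notes_reflect_service_plan",
]

def select_charts_for_review(chart_ids, sample_size, seed=None):
    """Randomly select charts for the monthly peer review."""
    rng = random.Random(seed)
    return rng.sample(chart_ids, min(sample_size, len(chart_ids)))

def blank_review_sheet(chart_id):
    """Standardized audit sheet (all items unanswered) for one chart."""
    return {"chart_id": chart_id, "items": {item: None for item in CHECKLIST}}

# Example: pull 10 charts from a 200-case caseload for this month's review.
charts = select_charts_for_review([f"case-{n}" for n in range(1, 201)], 10, seed=1)
sheets = [blank_review_sheet(c) for c in charts]
```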

Establishing Indicators

• Relevant to the services offered

• Align with existing research
• Measurable

– No “homegrown” instruments
– Reliable and valid standardized measures

Examples of Indicators

Process Indicators
• Percentage of clients with a serious MH issue referred to community services within 14 days of intake.

• Percentage of clients with family involved in treatment (defined as min. number of face-to-face contacts).

• Percentage of clients whose first billable service is within 72 hours (case mgt).

• Percentage of positive case closures for probation/parole.

• Percentage of high risk clients on Abscond Status for probation/parole.

• Percentage of restitution/fines collected.
• Percentage of clients participating in treatment services.

Examples of Indicators

Fidelity Indicators (Process)
• Percentage of groups containing role-plays
• Percentage of successful completers receiving appropriate dosage based on risk/needs assessment
• Percentage of staff achieving 4:1 ratio
• Percentage of groups observed where staff modeled the skill prior to having clients engage in role-play

• Percentage of role-plays containing practice of the correctives

• Percentage of role-plays that required observers to identify skill steps and report back to the group

Examples of Indicators

Outcome Indicators
• Clients will demonstrate a reduction in antisocial attitudes.

• Clients will demonstrate a reduction in ORAS scores.

• Clients will demonstrate an increase in treatment readiness.

• Clients will obtain a GED.• Clients will obtain full-time employment.

• Clients will demonstrate a reduction in Symptom Distress.

• Clients will demonstrate sobriety.

Operationalizing Indicators

• Procedures for administering pre/post-tests

• Procedures for coding, storing, tabulating, reporting data

• Identifying the numerator and denominator for each indicator (see the sketch below)

• Being clear about the value of the information provided
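As a concrete illustration of defining a numerator and denominator, the sketch below computes the 14-day referral indicator from the earlier Process Indicators slide. It is a minimal Python example with made-up records and assumed field names (serious_mh_issue, intake, referral), not the agency’s actual data layout.

```python
from datetime import date

def pct(numerator, denominator):
    """Indicator value as a percentage; undefined (None) when the denominator is 0."""
    return None if denominator == 0 else 100.0 * numerator / denominator

# Illustrative client records; field names are assumptions for this sketch.
clients = [
    {"serious_mh_issue": True,  "intake": date(2024, 3, 1), "referral": date(2024, 3, 10)},
    {"serious_mh_issue": True,  "intake": date(2024, 3, 4), "referral": None},
    {"serious_mh_issue": False, "intake": date(2024, 3, 6), "referral": None},
]

# Denominator: clients with a serious MH issue.
# Numerator: those referred to community services within 14 days of intake.
denom = [c for c in clients if c["serious_mh_issue"]]
numer = [c for c in denom if c["referral"] and (c["referral"] - c["intake"]).days <= 14]
print(pct(len(numer), len(denom)))  # 50.0
```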

Observation-Based Ratings

• Creation of audit sheets
• Schedule for conducting the reviews
• Staff qualified to conduct and rate the observations

• Time for staff to conduct observations

• Mechanism to record and use the data
– Supervision and individual staff development

– QI and training initiatives

Client Satisfaction

• Identify the dimensions
– Access
– Involvement in treatment/case planning
– Emergency response
– Respect from staff
– Respect from staff for cultural background

• All programs use the same survey
• Items are scored on a 1-4 Likert scale

• Scores falling below 3.0 generate an action plan (see the scoring sketch below)
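A minimal scoring sketch for the satisfaction survey, assuming the action-plan rule is applied to the mean rating per dimension (the slide does not specify the exact aggregation). Dimension names mirror the list above; the responses are made up.

```python
from statistics import mean

# Illustrative survey responses: each dict maps a satisfaction dimension
# to a 1-4 Likert rating. Data are invented for this sketch.
responses = [
    {"access": 4, "involvement": 3, "emergency_response": 2, "respect": 4, "cultural_respect": 4},
    {"access": 3, "involvement": 2, "emergency_response": 2, "respect": 4, "cultural_respect": 3},
]

THRESHOLD = 3.0

def dimension_means(responses):
    """Mean rating for each dimension across all returned surveys."""
    dims = responses[0].keys()
    return {d: mean(r[d] for r in responses) for d in dims}

# Any dimension averaging below 3.0 is flagged for an action plan.
flagged = {d: m for d, m in dimension_means(responses).items() if m < THRESHOLD}
print(flagged)  # {'involvement': 2.5, 'emergency_response': 2.0}
```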

Operationalizing the Process

• Identification of items for inclusion

• Distribution and collection of surveys

• Coding, analysis, and reporting of data

• Use of data

Establishing Thresholds

• Establish internal baselines
• Compare to similar programs
• Compare to state or national data

Minimum Requirements

• Buy-in from staff at all levels of the organization

• Sufficient resources allocated for staff training

• Sufficient resources allocated for staff to participate in the process
– Peer Review Meetings
– Other relevant committee meetings
– Data collection

• Sufficient information systems

Overcoming Resistance

• Administration must walk the walk
• Ensure early successes to increase buy-in

• Recognition of staff for using the process

• Openly acknowledge the extra work required

• Demonstrate front-end planning to minimize workload issues

Reducing Staff Burden

• Workload analysis
• Use of technology to streamline

– Forms and databases
– Spreadsheets for scoring pre/post-tests (sketched below)

• Assist with problem-solving around workload issues

• Allow flexibility where possible
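One way technology can streamline pre/post-test scoring is a small script in place of a hand-maintained spreadsheet. The sketch below assumes a hypothetical CSV export with client_id, pre_score, and post_score columns; it simply computes a change score per client for reporting.

```python
import csv

def change_scores(path):
    """Read pre/post scores from a CSV export and compute the change per client.

    Expects columns client_id, pre_score, post_score (a hypothetical layout;
    adapt to whatever the actual instrument export looks like).
    """
    results = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pre, post = float(row["pre_score"]), float(row["post_score"])
            results.append({"client_id": row["client_id"],
                            "pre": pre, "post": post, "change": post - pre})
    return results

# Example usage (hypothetical filename):
# scores = change_scores("pre_post_export.csv")
```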

Barriers to Implementation
• Agency culture

– The “black hole” of data that leads to staff cynicism and burnout

– Conflicting messages about targets/goals in various work domains

– Difficulty letting go of old ways
– “We’re clinicians, not statisticians”

• Costs
– Staff time
– IS capabilities
– Data collection instruments
– Coordination of the process and dissemination of the data

• Multiple, and sometimes conflicting, demands from multiple funders
– Different priorities
– Funders don’t speak the same language, causing confusion for line staff

Common Barriers to Assessing Fidelity

• Strength of conceptual understanding of the EBP to be measured

• Resources
• Setting priorities
• Understanding/skill sets required for measurement

• Conflicting philosophies (helper vs. evaluator)

• Time!

Potential Strategies

• Start small
– For example, desktop review of assessments versus observation-based ratings

• Use technology to increase efficiencies
– For example, videotape interactions for observation-based ratings

• Take the time to build expertise
– Train on model
– Train on evaluation methodology
– Ensure understanding of purpose (e.g., QI versus punishment)

Continuous Quality Improvement

Questions & Answers