

The Modular Survey Initiative

A Presentation to the SQM Initiative

August 4th, 2009

Goals for Presentation

Review consumer perception of care as a concept

Review success criteria for its measurement

Review the development, testing, & deployment of SAMHSA’s Modular Survey

Discuss potential utility of Modular Survey

The Concept

Measuring consumer response = core business function

In healthcare, tied to growth of consumerism & CQI initiatives

A NOMS domain

Consumer perception of care ≠ satisfaction

Measuring Consumer Perception of Care

Approaches to measurement differ depending on scope & purpose

For purposes of comparability, improvement over time, & benchmarking, measures must be:

Meaningful

Scientifically sound

Actionable

The Problems with Satisfaction

No evidence linking the measurement of satisfaction to client outcomes

Few satisfaction surveys scientifically validated

Data is not actionable (ceiling effect)

The Modular Survey

SAMHSA-supported initiative

Conducted under the auspices of the Forum on Performance Measurement, MHSIP & the Washington Circle

Conducted in 2 phases:

Phase 1 – in conjunction with mental health Rx

Phase 2 – substance abuse-specific

The Original Charge

To develop and promote the adoption of a limited number of consumer perception of care items looking at access, quality, and outcomes

To employ a “modular” approach suited to a digital environment (e.g., DS 2000+, EHR, web application)

To develop modules that addressed both common concerns (across populations & fields) and core concerns (population- or field-specific)

To support benchmarking & best practice dissemination

Modular Survey Flow of Common Questions for Individual Respondent

[Diagram showing the survey’s building blocks: “Field-level” Common Measures; Adult Common Measures; Child/Adolescent Common Measures; Adult Mental Health Core Measures; Adult Substance Abuse Core Measures; Adolescent Substance Abuse Core Measures; Child/Adolescent Mental Health Core Measures; all on a Common Design Template]

Phase 1 Development

Collaboration with mental health treatment (i.e., MHSIP & YSS) through the Forum

Largely voluntary effort (Forum as “force multiplier”)

Focus on commonality, not comprehensiveness

Strict design requirements:

Short

Scientifically sound

Actionable

Use of existing, widely-deployed, non-proprietary surveys

Consensus-driven

Approach to Phase 1

4 workgroups – Steering, Adult Content, Child/Adolescent Content, Design Specification

Selection of instruments

Identification of concerns

Identification of potential items

Ranking of items

Final item selection (modified Delphi approach)

Presentation of item pool at Carter Center Forum

Phase 1 Pilot Testing

Conducted during summer/fall of 2004 by Ann Doucette, Ph.D.

Primary data collection in Cincinnati United Way agencies (N=1157)

Secondary analysis using MHSIP data (16 state & LA County datasets)

Final N > 22,000 respondents

Pool of items reduced from 28 to 12 (subsequently to 11)

No difference between Level 1 and 2 items

All items common to both populations & both fields

Approach to Phase 2

Separate MH & SUD initiatives

MH under MHSIP/YSS, SUD under Washington Circle (with Forum as “subcontractor”)

New item development (no existing SUD Rx survey)

Content work group co-chaired by Tom McLellan (TRI) & Doreen Cavanaugh (Georgetown)

Methodological support from Forum Methods work group chaired by Ann Doucette (George Washington)

Public Provider & Consumer Advisory Groups

Phase 2 SUD Initiative

Closely coordinated with both NOMS & MH initiative

Identification of concerns:

Relationship to treatment program

Self-awareness of problem/commitment to change

Recovery Status

Social connectedness

Generation of items (35 in testing pool)

Phase 2 SUD Pilot Testing

OMB & IRB approval spring 2006

Conducted in 3 rounds:

Round 1 – Adult & Adolescent (summer 2006): 14 programs, N=1207

Round 2 – Adult & Adolescent (winter 2006-7): 6 programs, N=585

Round 3 – Adolescent (spring-summer 2007): 8 programs, N=268

Final adult N=1549 (2 samples)

Final adolescent N=492 (1 sample)

Phase 2 Completion

Analysis & Recommendations by Ann Doucette, Ph.D.

Review by Forum Methods Work Group (November 2007)

Review & Approval by SUD Content Committee (November 2007)

Original SUD Modular Survey

21 items (11 from Phase 1, 10 from Phase 2):

Quality – 6 items

Perceived Outcomes – 6 items

Social Connectedness – 7 items

Commitment to Change – 2 items

10 demographic & background items

Spanish translation available

Modular Survey Flow of Common Questions for Individual Respondent (Final Version)

[Diagram: Phase 1 Common Items*, Phase 2 Mental Health Items, and Phase 2 SUD Items, all on the Common Design Template]

* All populations, all fields
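A minimal sketch of this flow in Python, purely illustrative: the placeholder item lists, function name, and field labels are assumptions, not part of the actual Modular Survey software. It shows how one respondent’s question set could be assembled from the 11 Phase 1 common items plus the 10 field-specific Phase 2 items on the common design template.

```python
# Illustrative sketch only: item placeholders, names, and field labels are
# assumptions, not the actual Modular Survey implementation.

# 11 Phase 1 common items (all populations, all fields)
COMMON_ITEMS = [f"Phase 1 common item {i}" for i in range(1, 12)]

# 10 field-specific Phase 2 items per module
MH_MODULE = [f"Phase 2 mental health item {i}" for i in range(1, 11)]
SUD_MODULE = [f"Phase 2 SUD item {i}" for i in range(1, 11)]


def build_survey(field: str) -> list[str]:
    """Assemble one respondent's question flow on the common design template:
    common items first, then the field-specific module."""
    items = list(COMMON_ITEMS)
    if field == "mental health":
        items += MH_MODULE
    elif field == "substance abuse":
        items += SUD_MODULE
    else:
        raise ValueError(f"unknown treatment field: {field}")
    return items


if __name__ == "__main__":
    survey = build_survey("substance abuse")
    print(len(survey), "items")  # 21 items, matching the SUD Modular Survey
```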

The Opioid Treatment Module

Development begun in July 2009

New content work group

1st meeting identified 2 new domains:

Stigma

Integration with medical system

Pool of 28 items developed

Next steps

Website Functions

Program enrollment

Printing of surveys

Data entry

Reporting (including benchmarking)

Strategic Trajectory

Test & deploy website

Implement in voluntary demonstration site(s)

Revise administrative protocols & website

Expand implementation

Implement benchmarking

Revise items based on predictive power (LA County)

Expand benchmarking & identify best practices

Provide TA based on best practices

So what…?

Modular Survey measures key area of policy concern

Product of both expert consensus & state-of-the-art science

Provides info re: consumers’ opinions of programs

Short, scientifically sound, and approved by OMB

Easily accessible (web-based application) for both distribution & reporting

Supports benchmarking & best practices

Its questions can be examined for linkage to outcomes