Developing Satisfaction Surveys: Integrating Qualitative and Quantitative Information
David Cantor, Sarah Dipko, Stephanie Fry, Pamela Giambo, and Vasudha Naraynan
Westat
This material was prepared by Westat Inc., under contract with the Centers for Medicare & Medicaid Services (CMS), an agency of the U.S. Department of Health and Human Services. The contents presented do not necessarily reflect CMS policy.
Purpose of Paper
Illustrate issues related to pre-testing and evaluating satisfaction surveys with establishments
Integrating quantitative and qualitative methods
Opinion items on establishment surveys
Not a great deal of information on “opinion” from establishments.
Response is not reliant on record information, but on knowledge of relevant experiences.
Questionnaire design issues become more important than for surveys of “factual” items (e.g., context; order; wording).
Idiosyncrasies of satisfaction surveys
Respondents tend to use the upper end of the scale
Items are correlated
Medicare Contractor Provider Satisfaction Survey (MCPSS)
Survey sponsored by Centers for Medicare and Medicaid Services
Respondents: Medical providers
Topic: Satisfaction with Medicare contractors
Survey Procedures and Questionnaire
Mixed-mode survey: web and telephone
Satisfaction items cover 7 different business areas
Scale used for rating satisfaction:
Thinking about all your interactions with your contractor in the last 12 months, how satisfied are you with your contractor’s performance overall?
Objective of Analysis
Pre-test and evaluate satisfaction items
Recommend changes to the items
Add or delete items – concern especially with shortening the questionnaire
Reword – sharpen focus of the attitude object
Assess how respondents use the scale
Respondents using different criteria for scoring opinions
Testing procedures used were complementary
Expert review
Cognitive interviews
Psychometric analysis:
Review of frequencies, missing data
Correlation and factor analysis
Rasch analysis
Cognitive Interviews: Procedures
Conducted 2 rounds over the telephone
Asked respondents to:
Answer questions first
Explain how they came up with their answer
Probed on particular words or phrases
Asked how the scale points were chosen
Asked about items that were important for evaluating the contractor
Asked about items that could be dropped
Cognitive interview results: Procedures
Reference period – changed from 6 months to 12 months
Direct experiences were memorable enough to recall for 12 months
Some respondents were not using direct recall for particular questions
Respondent knowledge
Audit and Reimbursement was difficult
Other sections varied. Most respondents had at least some indirect, if not direct, experiences.
Cognitive interview results: Instructions and Introductions
CMS communications vs. contractor communications
In-person workshops vs. on-line material
First- vs. higher-level appeals
Cognitive Interviews: Vague and/or imprecise wording
The attitude object was not clear in items such as:
“The mechanisms that your contractor offers for exchanging information with them about inquiries.”
“The accuracy of first level appeals decisions”
“The consistency of your Contractor’s answers to questions throughout the Audit and Reimbursement process”
Cognitive Interviews: Identify redundant items
Some items referred to overlapping domains:
“Receiving the correct information”
“Consistency of responses from staff”
“Knowledge of Contractor’s staff”
All three were viewed by respondents as being important.
Cognitive Interviews: Identify redundant items (2)
“Detail in which topics were covered”
“Quality of education and training materials that you generally use”
Items were viewed as the same by some (but not all) of the respondents
Cognitive interview results: Variation in how the scale is used
Respondents use different criteria
Using absolute criteria – deciding along an internal measure of satisfaction
Using comparative criteria – comparing experience with other contractors
Respondents use different anchors
Start at the middle and move up or down
Start at the top point and move down
Psychometric Analysis – Methods
Brings in additional external information
Performance under real survey conditions
Has a large database to draw from
Used to verify or point to qualitative results
Item difficulty – how much missing data is there?
Redundant items – examine correlations among items (see the sketch after this list)
What is the distance across scale points?
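These two screens are straightforward to run on a survey data file. Below is a minimal sketch, assuming item responses sit in a pandas DataFrame with one column per satisfaction item; the file name and the coding (1–6, with NaN for missing or “no basis to judge”) are illustrative assumptions, not details from the paper.

```python
import pandas as pd

# Hypothetical input: one row per responding provider, one column per
# satisfaction item, coded 1-6 with NaN where no rating was given.
df = pd.read_csv("mcpss_items.csv")  # file name is an assumption

# Item difficulty screen: items with high missing-data rates may be
# ones respondents have no direct (or indirect) experience with.
missing_rates = df.isna().mean().sort_values(ascending=False)
print(missing_rates.head(10))

# Redundancy screen: pairwise correlations among items,
# computed over pairwise-complete observations.
corr = df.corr(method="pearson")
print(corr.round(2))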
Deciding on Redundant Items: Using Psychometric Analysis
Difficult to delete some items based on expert and respondent feedback
Items were viewed as “important”
How do items perform “in practice”?
Drew on correlation and factor analyses (see the sketch after this list):
Used prior year survey administration
Deleted items that had the highest correlations
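A hedged sketch of that decision rule: rank item pairs by absolute correlation in the prior-year data, then cross-check with a factor analysis, since items loading heavily on the same factor measure overlapping domains. The file name, the sklearn-based factor model, and the two-factor choice are assumptions for illustration; the paper does not specify the software used.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Prior-year administration; file name is an assumption.
df = pd.read_csv("mcpss_prior_year.csv").dropna()

# Rank item pairs by correlation; the highest-correlated pairs are
# candidates for deletion (keeping one item from each pair).
corr = df.corr().abs()
upper = np.triu(np.ones(corr.shape, dtype=bool), k=1)  # each pair once
pairs = corr.where(upper).stack().sort_values(ascending=False)
print(pairs.head(5))

# Factor analysis as a cross-check on overlapping domains.
fa = FactorAnalysis(n_components=2).fit(df)
loadings = pd.DataFrame(fa.components_.T, index=df.columns,
                        columns=["factor1", "factor2"])
print(loadings.round(2))
```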
Number of items added or deleted by method

                Expert Review   Cognitive Interviews   Psychometric Analysis
Add items       20              6                      NA
Delete items    19              10                     2
Preliminary application of the Rasch model
Assess use of the scale: is the distance between scale points the same?
Analysis suggests that the distance between scale points is not uniform
Greater distance between upper end points (5 vs. 6); a sketch of this check follows
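To make the check concrete, here is a minimal sketch (not the authors’ code) of estimating category thresholds under Andrich’s rating scale model, a standard Rasch-family model for ordered scales. Person locations are treated as known and the item location is folded into them to keep the example short; the simulated data and all parameter values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def category_probs(theta, tau):
    # P(X = k | theta), k = 0..K, under a rating scale model with the
    # item location absorbed into theta; empty sum for k = 0.
    steps = np.concatenate([[0.0], np.cumsum(theta - tau)])
    p = np.exp(steps - steps.max())
    return p / p.sum()

def neg_log_lik(tau, theta, responses):
    return -sum(np.log(category_probs(t, tau)[x])
                for t, x in zip(theta, responses))

rng = np.random.default_rng(0)
true_tau = np.array([-2.0, -0.8, 0.2, 0.9, 2.4])  # wide top gap, like 5 vs. 6
theta = rng.normal(0.5, 1.0, size=500)            # respondents lean satisfied
responses = np.array([rng.choice(6, p=category_probs(t, true_tau))
                      for t in theta])

fit = minimize(neg_log_lik, np.zeros(5), args=(theta, responses), method="BFGS")
print("estimated thresholds:", np.round(fit.x, 2))
print("adjacent gaps:       ", np.round(np.diff(fit.x), 2))  # non-uniform spacing
```

Unequal estimated gaps, especially a wide gap between the top two thresholds, would echo the finding that respondents treat the step from 5 to 6 differently from the lower steps.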
Summary: Establishment Surveys and Opinion Items
Opinion surveys for establishments need to consider respondent selection from a different perspective than for factual items
Does the respondent have any direct experiences that are relevant?
Does the respondent communicate with those who have the target experiences?
Questionnaire design is important (instructions; item clarity; item relevance).
Summary: Pre-testing Methods
Qualitative methods are useful for evaluating domains, instructions, item clarity
For determining item relevance (importance), it is useful to have some quantitative measures of performance
Having both qualitative and quantitative data is useful for diagnosing item performance
Quantitative – what is redundant?
Qualitative – why is it redundant?