Management Continuity from the Patient Perspective: Detailed Report
Management Continuity from the Patient Perspective:
Comparison of Primary Healthcare Evaluation
Instruments
Detailed Report of published article
Haggerty, J., F. Burge, R. Pineault, M.-D. Beaulieu, F. Bouharaoui, J.-F.
Lévesque, C. Beaulieu and D. Santor. 2011. “Management Continuity from the
Patient Perspective: Comparison of Primary Healthcare Evaluation Instruments.”
Healthcare Policy 7 (Special Issue): 154-166.
Corresponding Author:
Jeannie L. Haggerty
Associate Professor
Department of Family Medicine
McGill University
Postal Address:
Centre de recherche de St. Mary
Pavillon Hayes – Bureau 3734
3830, av. Lacombe
Montréal (Québec) H3T 1M5
Canada
Contact:
Tel : (514) 345-3511 ext 6332
Fax : (514) 734-2652
jeannie.haggerty@mcgill.ca
Management Continuity from the Patient Perspective:
Comparison of Primary Healthcare Evaluation
Instruments
Abstract
Management continuity, operationally defined as “the extent to which services delivered by
different providers are timely and complementary such that care is experienced as connected and
coherent,” is a core attribute of primary healthcare, usually expressed as care coordination or
integration.
Objective: To provide insight into how well validated coordination or integration subscales
measure the operational definition.
Method: Relevant subscales from the Primary Care Assessment Survey (PCAS), the Primary
Care Assessment Tool (PCAT), the Components of Primary Care Instrument (CPCI) and the
Veterans Affairs National Outpatient Customer Satisfaction Survey (VANOCSS) were
administered to 432 adult respondents who had at least one healthcare contact with a provider
other than their family physician in the previous 12 months. Subscales were examined
descriptively, by correlation and factor analysis and item response theory analysis. The
VANOCSS elicits coordination problems, is scored dichotomously and is not amenable to factor
analysis, but was used as a dependent variable in logistic regression models with the other
subscales.
Results: PCAS, PCAT and CPCI subscales’ values are positively skewed, suggesting very good
levels of management continuity, but 83% of respondents report having one or more problems on
the VANOCSS Overall Coordination subscale and 41%, on the VANOCSS Specialist Access
subscale. Exploratory factor analysis with the remaining instruments suggests two distinct
factors. The first (eigenvalue = 6.98) seems to capture coordination actions, observed behaviours
of the primary care physician to transition patient care to other providers (PCAS Integration
subscale and most of the PCAT Coordination subscale). The second (eigenvalue = 1.20) suggests
efforts by the provider to create coherence between different visits both within and outside the
regular doctor’s office (CPCI Coordination subscale). The PCAS Integration subscale was most
predictive of overall coordination problems: each one-point increase in management continuity
cut the odds of reporting a problem by more than half (odds ratio 0.46).
Conclusion: Ratings of management continuity correspond only modestly to reporting of
coordination problems. Challenges remain for the measurement of management continuity and
consequently for the evaluation of service integration and provider coordination.
Background
Conceptualizing management continuity
Continuity of care, often invoked as a characteristic of good care delivery, has different
meanings in different healthcare disciplines. A transdisciplinary literature review in 2001
identified three types of continuity of care that would be recognized by all healthcare disciplines:
relational, informational and management (Haggerty et al. 2003). In a consensus process
undertaken with primary healthcare (PHC) experts across Canada (Haggerty et al. 2007), the
experts agreed unanimously that, while management continuity is not specific to PHC, it is an essential
PHC attribute. They defined management continuity as “The extent to which services delivered
by different providers are timely and complementary such that care is experienced as connected
and coherent.”
This definition is relatively recent, and the principal providers of PHC in Canada, family
physicians, would recognize this attribute as coordination of care or integration. Indeed, when we
mapped subscales to operational definitions of attributes of PHC, those that we mapped to
management continuity were labelled by the developers as coordination or integration (Lévesque
et al. 2009). The distinction between these concepts is in the unit of analysis: for coordination it
is the provider; for integration, the organization; for continuity, the care trajectory of the
individual. These are linked, in that management continuity is the result of good care
coordination and/or integration (Reid et al. 2002; Shortell et al. 1996).
PHC reform has targeted increased service integration and multidisciplinary coordination.
Program evaluators therefore require good information to inform their selection of tools that
measure management continuity.
Evaluating management continuity
Management continuity can be evaluated from the patient or provider perspective. The reference
to services being “experienced as connected and coherent” clearly pertains to the patient
perspective, while providers may be best placed to assess timeliness and complementarity of
services.
The objective of our larger study was to compare validated instruments thought to be most
pertinent for evaluating PHC from the user perspective in the Canadian context. Among 13
instruments that we identified as relevant, several contain subscales addressing coordination of
care or integration, which we mapped to management continuity. From the candidate
instruments, we selected six that are in the public domain and that we believe to be most relevant
for Canada. In this article, we assess the equivalence of the values of the different
management continuity subscales and determine whether they appear to be measuring the same
construct. We also examine the capture of different elements of the operational definition and the
psychometric performance of individual items with respect to the dimensions they represent. Our
intent is not to recommend one instrument over another, but to provide insight into how well
different subscales fit the construct of management continuity and how well they correspond to
our operational definition.
Method
The method of this series of studies is described in detail elsewhere (Haggerty et al. 2009) and is
summarized here.
Measure description
There were five relevant subscales in four of the six instruments retained for the study. They are
heterogeneous in their reference points and measurement approaches and are described briefly
here in the order in which they appeared in the questionnaire.
The Primary Care Assessment Survey (PCAS) (Safran et al. 1998) has a six-item Integration
subscale that asks patients to rate on a six-point Likert scale (very poor to excellent) different
aspects of “times their doctor recommended they see a different doctor for a specific health
problem.” The Primary Care Assessment Tool (PCAT) (Shi et al. 2001) has a four-item
Coordination subscale that elicits on a four-point Likert scale (definitely not to definitely)
aspects of their primary care provider’s role in “a visit to any kind of specialist or special
service.” The Components of Primary Care Index (CPCI) (Flocke 1997) has an eight-item
Coordination of Care subscale that uses a six-point semantic differential response scale (poles of
strongly disagree and strongly agree) to elicit agreement with various statements about the
“regular doctor,” whether or not the respondent has seen other providers. Finally, the Veterans
Affairs National Outpatient Customer Satisfaction Survey (VANOCSS) (Borowsky et al. 2002),
informed by the Picker Institute health surveys (Gerteis et al. 1993), has two subscales that offer
Likert-type response options, but a dichotomous scoring is applied to each item designed to
indicate the presence of problems. The six-item Overall Coordination subscale pertains to “all
the healthcare providers your regular doctor has recommended you see.” The four-item Specialty
Access subscale refers to specialists recommended by the regular doctor.
These PCAS, PCAT and CPCI subscales can be expressed as a continuous score by averaging
the values of individual items; high scores indicate the best management continuity. In contrast,
the VANOCSS subscales are scored dichotomously (presence of any problem) or can be
summed to the number of problems encountered (e.g. 0 to 6 problems with coordination), where
0 represents the best management continuity.
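The two scoring approaches just described can be sketched as follows; the item responses here are illustrative, not study data.

```python
# Hedged sketch of the two scoring approaches described above;
# the item responses are illustrative, not study data.

# PCAS/PCAT/CPCI style: average the Likert item values; higher = better.
pcas_items = [5, 6, 4, 5, 6, 5]            # six items on a 1-to-6 scale
pcas_score = sum(pcas_items) / len(pcas_items)

# VANOCSS style: dichotomize each item to problem (1) / no problem (0),
# then count problems; 0 problems = best management continuity.
vanocss_items = [1, 0, 0, 1, 0, 0]
n_problems = sum(vanocss_items)
any_problem = n_problems > 0

print(round(pcas_score, 2), n_problems, any_problem)  # 5.17 2 True
```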
Concurrent validation of instruments
The six instruments were self-administered to subjects with a regular source of care who had
sought healthcare in the previous 12 months. The study population was approximately balanced
by overall experience of healthcare, educational level, urban/rural context and English/French
language; it is described in detail elsewhere (Haggerty et al. 2009). Each participant filled in all
six instruments and provided information on health utilization and socio-demographic
descriptors.
Analytic strategy
Because the provider reference and skip patterns were not the same for each subscale, we defined
a common subgroup on which to analyze instruments: respondents who had seen more than one
provider in the previous 12 months.
The summary score for the PCAS, PCAT and CPCI subscales was expressed as the mean of the
component items, a continuous score falling within the range of the response scale values. To
allow comparisons of subscale values, we standardized the subscale scores to a 0-to-10 common
metric, where 10 represents the best management continuity. In contrast, the VANOCSS
subscales were expressed dichotomously (any problem) or as the sum of reported problems. We
initially looked for patterns of missing values and for ceiling or floor effects in the distribution of
values.
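The 0-to-10 standardization follows directly from the formula given in the footnote to Table 3; a minimal sketch:

```python
def normalize_to_ten(raw_mean, scale_min, scale_max):
    """Standardize a subscale mean to a 0-to-10 metric (10 = best continuity)."""
    return (raw_mean - scale_min) / (scale_max - scale_min) * 10

# PCAS and CPCI items run 1 to 6; PCAT items run 1 to 4.
print(round(normalize_to_ten(4.50, 1, 6), 2))  # 7.0 (Table 3 reports 6.99 from unrounded data)
print(round(normalize_to_ten(3.28, 1, 4), 2))  # 7.6 (Table 3: 7.61)
```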
For construct validity, only the PCAS, PCAT and CPCI subscales lend themselves to the usual
assumptions of factor analysis, but in secondary analysis we examined the inclusion of the
VANOCSS subscales entered as a single item summing the number of problems. We first
conducted an exploratory principal components analysis using SAS 9.1 (SAS Institute 2003)
using an oblique rotation to explore whether the items loaded on a single factor and then to
identify how many underlying factors accounted for variability in responses using the criterion of
eigenvalue >1. We tested the goodness-of-fit of various factor structures with structural equation
modelling using LISREL (Jöreskog and Sörbom 1996). Since we applied a list-wise missing
values exclusion, our effective sample size relative to the number of items compromised our
statistical power to use the usual Weighted Least Square method (Jöreskog and Sörbom 1996).
We used the Maximum Likelihood Robust estimation, shown to be more reliable for small
sample sizes (Flora and Curran 2004). Our reference was a one-dimensional model with all items
emerging from a single latent variable, presumed to be management continuity. We hypothesized
that models with more than one factor would fit better than the one-dimensional model. If items
had ambiguous factor loadings, we assigned them to the factor we judged to fit with item
content. We used as the reference item for confirmatory factor loading the one with the highest
principal components loading and apparent content fit with the latent variable.
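The eigenvalue > 1 retention rule can be illustrated on synthetic data. The actual analysis used SAS with oblique rotation; this numpy sketch shows only the retention criterion, on made-up two-block data rather than the study items.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 18 items: two correlated blocks, loosely mimicking
# the two-factor structure suggested in the Results (not the study data).
n = 250
f1, f2 = rng.normal(size=(2, n))
items = np.column_stack(
    [f1 + rng.normal(scale=0.6, size=n) for _ in range(10)]
    + [f2 + rng.normal(scale=0.6, size=n) for _ in range(8)]
)

# Eigenvalues of the item correlation matrix; Kaiser's criterion retains
# components with eigenvalue > 1.
eigenvalues = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
n_retained = int((eigenvalues > 1).sum())
print(n_retained)  # 2 components emerge from this two-block structure
```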
We used item response theory (IRT) analysis to examine the performance of individual items
against the original latent variable identified by the instrument developer and its management
continuity sub-dimensions. We based the underlying assumption of latent variable
unidimensionality on the goodness-of-fit of the structural equation models. We calculated the
discriminatory value and information yield of each item using Multilog (Du Toit 2003).
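The discrimination values in Table 2 come from the Multilog analysis; the role of the discrimination parameter a can be illustrated with the simpler two-parameter logistic (2PL) model for a dichotomous item. This hand-rolled sketch is not the Multilog procedure, only the underlying idea.

```python
import math

def p_endorse(theta, a, b):
    """2PL item response function: probability of endorsing the item
    at latent-trait level theta, with discrimination a and location b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item: a**2 * P * (1 - P);
    a higher a concentrates more information near location b."""
    p = p_endorse(theta, a, b)
    return a * a * p * (1.0 - p)

# A discriminating item (a > 1) yields far more information at its
# location than a weak one, which motivates the a > 1 criterion.
print(round(item_information(0.0, a=2.3, b=0.0), 4))  # 1.3225
print(round(item_information(0.0, a=0.4, b=0.0), 4))  # 0.04
```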
Finally, we exploited the benchmarking nature of the VANOCSS scale and used binary logistic
regression modelling to examine whether, in separate models, values of the PCAS, PCAT and
CPCI subscales predicted the presence of reported problems with the Overall Coordination and
Specialist Access subscales. We also explored the associations using ordinal regression models
to determine if there were ordinal effects.
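The odds-ratio reading used in the Results follows directly from the logistic model: the coefficient on a subscale score exponentiates to a per-point multiplier on the odds of reporting a problem. A minimal arithmetic sketch; the value 0.46 is the odds ratio later reported for the PCAS Integration subscale, and everything else is illustrative.

```python
import math

# In logistic regression, exp(beta) is the odds ratio per one-unit
# increase in the predictor (here, a 0-to-10 normalized subscale score).
odds_ratio = 0.46              # reported OR for PCAS Integration
beta = math.log(odds_ratio)    # the underlying regression coefficient

# Odds scale multiplicatively: a 2-point rise in the score multiplies
# the odds of reporting any coordination problem by OR**2.
print(round(math.exp(2 * beta), 3))  # 0.212
```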
Results
Of the 645 respondents to the overall survey, 432 reported having seen a provider other than
their family physician, making them eligible for analysis of management continuity. The number
of non-respondents differed by subscale, as shown in the flowchart in Figure 1. Some were
appropriate non-respondents because they had not seen the specific type of provider (e.g.
specialist) applicable to a given subscale; others were true non-respondents, having reported
seeing the type of provider indicated for the subscale. The low number of responses to the VANOCSS
Overall Coordination and Specialist Access subscales is likely due to respondent fatigue, since
these were placed last in the questionnaire. In addition, among the 213 who did not report seeing
a provider other than their doctor, many responded to the subscales, as shown in Figure 1.
Among the 432, 179 (41.4%) had at least one missing value and were excluded from the factor
analysis. Those excluded were more likely to have a regular clinic rather than a specific
physician and appeared to be in slightly better health. Table 1 presents the
characteristics of the study population.
Figure 1
Flow chart of responses to various subscales (shown in order of appearance in
questionnaire) demonstrating patterns of non-response in those presumed to be eligible by
virtue of reporting utilization of more than one provider (n=432) and patterns of
unnecessary response in those not presumed to be eligible (n=213)
[Flow chart not reproduced. Of the 645 total respondents, 432 were eligible. Eligible respondents included in analysis: PCAS 342 (79%), PCAT 392 (91%), CPCI 427 (99%), VANOCSS Overall Coordination 278 (64%), VANOCSS Specialist Access 236 (54%).]
Table 1
Characteristics of the study sample and comparison of subjects included in and excluded from
the factor analysis for management continuity due to missing values on any of the 18 items of
the PCAS, PCAT and CPCI subscales

| Characteristic | Total (n = 432) | No missing: included (n = 253) | Any missing: excluded (n = 179) | Test for difference |
|---|---|---|---|---|
| Personal characteristics | | | | |
| Mean age (SD) | 47.6 (15.1) | 48.2 (14.8) | 46.8 (15.4) | t = 0.97; p = 0.34 |
| Per cent female | 64.3% (275) | 63.6% (159) | 65.2% (116) | χ² = 0.11; 1 df; p = 0.74 |
| Per cent indicating health status as good or excellent | 34.9% (148) | 32.0% (79) | 39.0% (69) | χ² = 2.22; 1 df; p = 0.14 |
| Per cent with disabilities that limit daily activities | 35.9% (152) | 37.2% (93) | 33.9% (59) | χ² = 0.48; 1 df; p = 0.49 |
| Per cent with chronic health problem* | 66.2% (282) | 69.5% (173) | 61.6% (109) | χ² = 2.88; 1 df; p = 0.09 |
| Healthcare use | | | | |
| Regular provider: physician | 93.3% (403) | 95.7% (242) | 89.9% (161) | χ² = 5.45; 1 df; p = 0.02 |
| Regular provider: clinic only | 6.7% (29) | 4.4% (11) | 10.1% (18) | |
| Mean number of primary care visits in previous 12 months (SD) | 7.6 (7.9) | 7.9 (7.7) | 7.2 (8.3) | t = 0.94; p = 0.35 |
| Mean number of visits to other providers (SD): | | | | |
| Other family physician | 3.7 (4.1) | 3.7 (4.1) | 3.7 (4.0) | t = -0.14; p = 0.89 |
| Nurse or nurse practitioner | 6.6 (17.8) | 4.2 (5.3) | 9.2 (25.2) | t = -1.40; p = 0.17 |
| Nutritionist, psychologist or other therapist | 11.5 (22.1) | 12.6 (22.3) | 9.5 (21.7) | t = 0.92; p = 0.36 |
| Specialist | 4.2 (5.1) | 4.3 (5.3) | 4.1 (4.8) | t = 0.29; p = 0.77 |
| Per cent reporting other doctors or nurses at regular clinic who play important role in care (PCAS 16) | 23.7% (101) | 25.9% (65) | 20.6% (36) | χ² = 1.62; 1 df; p = 0.20 |
| Mean rating of doctor’s knowledge about care by others at regular clinic (1 = knows everything completely to 5 = knows nothing; n = 96) | 2.0 (1.1) | 2.0 (1.0) | 1.9 (1.2) | t = 0.07; p = 0.95 |

* Per cent indicating they had been told by a doctor that they had any of the following: high blood pressure, diabetes, cancer, depression, arthritis, respiratory disease, heart disease.
Comparative descriptive statistics
Table 2 presents the distribution of responses for the subscale items. None of them has more than
5% missing values. Respondents were most likely to endorse positive assessments of
management continuity, except with the VANOCSS items. All items in the PCAS and
VANOCSS Overall Coordination subscales and the majority of items in the other subscales
discriminate well between different levels of the latent variable captured in the original subscale,
as indicated by a discrimination parameter (a) greater than 1 (right-hand column).
Table 2
Distribution of values on the items in subscales mapped to management continuity among the 432 respondents who saw more than one provider. Cells show per cent (number) by response option; item discrimination* is shown as a (SE).

Primary Care Assessment Survey: Integration. “Thinking about the times your doctor has recommended you see a different doctor for a specific health problem, …” (six-point scale, very poor to excellent)

| Item code | Item statement | Missing % (n) | Very poor | Poor | Fair | Good | Very good | Excellent | Item discrimination* |
|---|---|---|---|---|---|---|---|---|---|
| PS_i1 | how would you rate the help your regular doctor gave you in deciding who to see for specialty care? | 1 (4) | 2 (6) | 3 (9) | 8 (28) | 26 (87) | 34 (117) | 26 (89) | 2.30 (0.20) |
| PS_i2 | how would you rate the help your regular doctor gave you in getting an appointment for specialty care you needed? | 2 (6) | 2 (8) | 4 (13) | 9 (31) | 24 (82) | 32 (109) | 27 (91) | 2.35 (0.21) |
| PS_i3 | how would you rate regular doctor’s involvement in your care when you were being treated by a specialist or were hospitalized? | 4 (13) | 1 (3) | 10 (33) | 12 (39) | 31 (104) | 26 (87) | 18 (61) | 3.68 (0.30) |
| PS_i4 | how would you rate regular doctor’s communication with specialists or other doctors who saw you? | 4 (15) | 2 (8) | 7 (22) | 14 (46) | 32 (109) | 25 (84) | 17 (56) | 4.89 (0.42) |
| PS_i5 | how would you rate the help your regular doctor gave you in understanding what the specialist or other doctor said about you? | 4 (12) | 3 (10) | 4 (13) | 15 (52) | 28 (95) | 27 (91) | 20 (67) | 4.06 (0.35) |
| PS_i6 | how would you rate the quality of specialists or other doctors your regular doctor sent you to? | 1 (4) | 2 (5) | 2 (6) | 8 (27) | 23 (77) | 38 (130) | 27 (91) | 1.85 (0.19) |

Primary Care Assessment Tool: Coordination (thinking about a visit to a specialist or specialized service; four-point scale plus a “not sure” option)

| Item code | Item statement | Missing % (n) | Definitely not | Probably not | Probably | Definitely | Not sure / don’t remember | Item discrimination* |
|---|---|---|---|---|---|---|---|---|
| PT_c1 | Did your Primary Care Provider discuss with you different places you could have gone to get help with that problem? | 2 (9) | 10 (40) | 9 (36) | 23 (89) | 52 (203) | 4 (16) | 1.23 (0.17) |
| PT_c2 | Did your Primary Care Provider or someone working with your Primary Care Provider help you make the appointment for that visit? | 3 (12) | 10 (40) | 6 (23) | 16 (61) | 64 (251) | 2 (6) | 2.07 (0.26) |
| PT_c3 | Did your Primary Care Provider write down any information for the specialist about the reason for the visit? | 3 (11) | 7 (29) | 8 (33) | 22 (86) | 54 (212) | 6 (22) | 2.33 (0.24) |
| PT_c4 | After you went to the specialist or special service, did your Primary Care Provider talk with you about what happened at the visit? | 2 (9) | 14 (56) | 8 (33) | 19 (76) | 53 (208) | 3 (11) | 2.23 (0.25) |

Components of Primary Care Index: Coordination of Care (six-point scale, 1 = strongly disagree to 6 = strongly agree)

| Item code | Item statement | Missing % (n) | 1 (strongly disagree) | 2 | 3 | 4 | 5 | 6 (strongly agree) | Item discrimination* |
|---|---|---|---|---|---|---|---|---|---|
| CP_coo1 | This doctor knows when I’m due for a check-up. | 2 (10) | 11 (46) | 11 (49) | 14 (59) | 12 (52) | 17 (74) | 33 (142) | 2.35 (0.21) |
| CP_coo2 | I want one doctor to coordinate all of the health care I receive. | 3 (11) | 3 (11) | 3 (14) | 4 (15) | 9 (37) | 18 (78) | 62 (266) | 1.13 (0.17) |
| CP_coo3 | This doctor keeps track of all my health care. | 1 (6) | 4 (19) | 7 (29) | 7 (29) | 12 (51) | 22 (93) | 48 (205) | 4.14 (0.36) |
| CP_coo4 | This doctor always follows up on a problem I’ve had, either at the next visit or by phone. | 3 (13) | 7 (31) | 9 (38) | 8 (33) | 12 (51) | 21 (91) | 41 (175) | 3.91 (0.30) |
| CP_coo5 | This doctor always follows up on my visits to other health care providers. | 3 (11) | 6 (27) | 8 (34) | 12 (53) | 15 (63) | 20 (86) | 37 (158) | 3.25 (0.26) |
| CP_coo6 | This doctor helps me interpret my lab tests, x-rays or visits to other doctors. | 2 (9) | 22 (94) | 15 (66) | 13 (56) | 11 (46) | 15 (66) | 22 (95) | 0.43 (0.13) |
| CP_coo7 | This doctor communicates with the other health providers I see. | 5 (21) | 10 (42) | 13 (55) | 16 (71) | 16 (71) | 17 (75) | 23 (97) | 1.62 (0.16) |
| CP_coo8 | This doctor does not always know about care I have received at other places.† | 5 (22) | 11 (46) | 13 (58) | 17 (72) | 13 (55) | 20 (86) | 22 (93) | 0.32 (0.11) |

Veterans Affairs National Outpatient Customer Satisfaction Survey: Overall Coordination of Care‡

| Item code | Item statement | Missing % (n) | Problem | No problem | Item discrimination*§ |
|---|---|---|---|---|---|
| VA_cco1 | Were the providers who cared for you always familiar with your most recent medical history? | 0 (1) | 59 (165) | 40 (112) | 1.67 (0.26) |
| VA_cco2 | Were there times when one of your providers did not know about tests you had or their results? | 1 (3) | 40 (111) | 59 (164) | 2.81 (0.46) |
| VA_cco3 | Were there times when one of your providers did not know about changes in your treatment that another provider recommended? | 2 (5) | 26 (73) | 72 (200) | 1.90 (0.35) |
| VA_cco4 | Were there times when you were confused because different providers told you different things? | 1 (3) | 28 (78) | 71 (197) | 1.90 (0.30) |
| VA_cco5 | Did you always know what the next step in your care would be? | 3 (7) | 59 (163) | 39 (108) | 1.44 (0.26) |
| VA_cco6 | Did you know who to ask when you had questions about your health care? | 2 (5) | 37 (102) | 62 (171) | 1.48 (0.28) |

Veterans Affairs National Outpatient Customer Satisfaction Survey: Access to Specialists**

| Item code | Item statement | Missing % (n) | Problem | No problem | Item discrimination*†† |
|---|---|---|---|---|---|
| VA_spa1 | How often did you get to see specialists when you thought you needed to? | 2 (5) | 12 (29) | 86 (202) | 1.26 (0.36) |
| VA_spa2 | How often did you have difficulty making appointments with specialists you wanted to see? | 1 (3) | 19 (44) | 80 (189) | 0.42 (0.23) |
| VA_spa3 | How often were you given enough information about why you were to see your specialists? | 1 (2) | 11 (25) | 89 (209) | 2.48 (0.62) |
| VA_spa4 | How often did your specialists have the information they needed from your medical records? | 1 (3) | 20 (48) | 78 (185) | 2.68 (0.56) |

* Items were assessed against the construct of the original scale. Values >1 are considered to be discriminating.
† Values were reversed for this reverse-worded item: only 11% (46) strongly agreed that the doctor did not know about care received at other places.
‡ Descriptive statistics based on 279 respondents (subscale placed second-to-last in questionnaire).
§ Items are scored dichotomously; these are not Likert scales.
** Descriptive statistics based on 236 respondents (subscale placed last in questionnaire).
†† Items are scored dichotomously; these are not Likert scales.
Table 3 presents the descriptive statistics for each subscale, showing both the raw mean and the
value normalized to a 0-to-10 metric for the PCAS, PCAT and CPCI subscales. As expected,
those three subscales are skewed positively, with the PCAT demonstrating the most extreme
skewing. In contrast, 83% of respondents reported at least one problem on the VANOCSS
Overall Coordination subscale and 41% on Specialist Access. Among the respondents in the top
quartile of management continuity according to the PCAS, PCAT and CPCI subscales, 61%,
76% and 74%, respectively, report having one or more problems on the VANOCSS Overall
Coordination subscale and 34%, 37% and 44%, respectively, on the VANOCSS Specialist
Access subscale.
Table 3
Mean and distributional scores for management continuity subscales, showing raw and
normalized values (n = 432)

Raw scores:

| Subscale (scale range) | Cronbach’s alpha | Mean | SD | Minimum observed | Q1 (25%) | Q2 (50%) | Q3 (75%) |
|---|---|---|---|---|---|---|---|
| PCAS Integration (1 to 6) | 0.90 | 4.50 | 0.98 | 1 | 3.83 | 4.67 | 5.17 |
| PCAT Coordination (1 to 4) | 0.73 | 3.28 | 0.76 | 1 | 3.00 | 3.50 | 4.00 |
| CPCI Coordination of Care (1 to 6) | 0.79 | 4.33 | 1.02 | 1 | 3.63 | 4.38 | 5.00 |
| VANOCSS Coordination of Care (Overall): number of problems among 6 items | 0.74 | 2.50 | 1.90 | 0 | 1.00 | 2.00 | 4.00 |
| VANOCSS Specialty Provider Access: number of problems among 4 items | 0.54 | 0.60 | 0.90 | 0 | 0 | 0 | 1.00 |

Normalized scores (0-to-10 scale)7:

| Subscale | Mean | SD | Minimum observed | Q1 (25%) | Q2 (50%) | Q3 (75%) |
|---|---|---|---|---|---|---|
| PCAS Integration normalized | 6.99 | 1.97 | 0 | 5.67 | 7.33 | 8.33 |
| PCAT Coordination normalized | 7.61 | 2.52 | 0 | 6.67 | 8.33 | 10.00 |
| CPCI Coordination of Care normalized | 6.65 | 2.04 | 0 | 5.25 | 6.75 | 8.00 |

7 Standardized score (0-to-10) = ((raw score - minimum) / (maximum - minimum)) * 10.
Table 4 presents the Pearson correlations between the management continuity subscales and with
subscales for the other PHC attributes. Within the management continuity attribute, PCAS,
PCAT and CPCI subscales correlate highly with each other (r = 0.54 to 0.62) and appropriately
negatively but only modestly with the number of problems reported on the VANOCSS subscales
(r = -0.19 to -0.39). The five subscales correlate strongly with evaluation of relational continuity
(r = -0.28 to 0.68) and interpersonal communication (r = -0.30 to 0.63).
Table 4
Partial Pearson correlation coefficients between subscales for management continuity and
subscales evaluating other attributes of primary healthcare (controlling for language,
education, geographic location). Only statistically significant correlations are shown (p < 0.05);
"-" indicates a non-significant correlation. n = 246; n = 132 for correlations involving the
VANOCSS subscales. The management continuity block is symmetric, so mirror cells repeat
the values above the diagonal.

| Subscale | PCAS: Integration | PCAT: Coordination | CPCI: Coordination of Care | VANOCSS: Coordination of Care (Overall), no. of problems | VANOCSS: Specialty Access, no. of problems |
|---|---|---|---|---|---|
| Management continuity | | | | | |
| PCAS Integration | 1.00 | 0.55 | 0.62 | -0.39 | -0.20 |
| PCAT Coordination | 0.55 | 1.00 | 0.54 | -0.19 | -0.22 |
| CPCI Coordination of Care | 0.62 | 0.54 | 1.00 | -0.34 | -0.27 |
| VANOCSS Coordination of Care (Overall), number of problems | -0.39 | -0.19 | -0.34 | 1.00 | 0.34 |
| VANOCSS Specialty Access, number of problems | -0.20 | -0.22 | -0.27 | 0.34 | 1.00 |
| Accessibility | | | | | |
| PCAS Organizational Access | 0.44 | 0.37 | 0.40 | -0.23 | -0.28 |
| PCAT First Contact Utilization | 0.16 | 0.29 | 0.36 | - | - |
| PCAT First Contact Accessibility | 0.26 | 0.32 | 0.29 | - | - |
| EUROPEP | 0.54 | 0.52 | 0.59 | -0.36 | -0.29 |
| Comprehensiveness | | | | | |
| PCAT Services Available | 0.13 | 0.23 | 0.19 | - | - |
| CPCI Comprehensive Care | 0.41 | 0.46 | 0.60 | -0.40 | -0.18 |
| Relational continuity | | | | | |
| PCAS Visit-based Continuity | 0.16 | 0.20 | 0.20 | -0.19 | -0.23 |
| PCAS Contextual Knowledge | 0.65 | 0.49 | 0.64 | -0.38 | - |
| PCAT Ongoing Care | 0.43 | 0.49 | 0.60 | -0.28 | -0.25 |
| CPCI Accumulated Knowledge | 0.51 | 0.54 | 0.68 | -0.30 | - |
| CPCI Patient Preference for Regular Physician | 0.37 | 0.46 | 0.63 | - | -0.20 |
| Interpersonal communication | | | | | |
| PCAS Communication | 0.61 | 0.44 | 0.48 | -0.41 | -0.20 |
| PCAS Trust | 0.54 | 0.54 | 0.50 | -0.38 | - |
| CPCI | 0.46 | 0.45 | 0.55 | -0.27 | - |
| EUROPEP | 0.63 | 0.59 | 0.63 | -0.42 | -0.25 |
| IPC: Communication (elicited concerns, responded) | 0.49 | 0.54 | 0.51 | -0.41 | -0.23 |
| IPC: Communication (explained results, medications) | 0.50 | 0.56 | 0.52 | -0.32 | -0.27 |
| IPC: Decision-making (patient-centered decision-making) | 0.48 | 0.55 | 0.47 | -0.30 | - |
| Respectfulness | | | | | |
| PCAS Interpersonal Treatment | 0.60 | 0.42 | 0.47 | -0.34 | - |
| IPC: Hurried Communication | 0.49 | 0.45 | 0.46 | -0.38 | -0.30 |
| IPC: Interpersonal Style (compassionate, respectful) | 0.48 | 0.53 | 0.45 | -0.29 | -0.25 |
| IPC: Interpersonal Style (disrespectful office staff) | 0.24 | 0.33 | 0.28 | -0.27 | - |
| Whole-person care | | | | | |
| PCAT Community Orientation | 0.32 | 0.31 | 0.22 | - | - |
| CPCI Community Context | 0.43 | 0.43 | 0.52 | -0.27 | -0.04 |
Construct validity
Most items from the PCAS, PCAT and CPCI subscales loaded well (more than 0.3) on a single
factor with principal components analysis. Items with factor loadings below 0.3 were the CPCI
items with low discriminatory values. The goodness-of-fit of the one-dimensional model was
barely adequate (Table 5, Model 1a): the Root Mean Squared Error of Approximation (RMSEA
= 0.133) is considerably higher than the 0.05 criterion used to indicate good model fit; the
Comparative Fit Index of 0.92, however, is above the 0.90 criterion. When the VANOCSS
subscales were added as single variables summing the number of problems, model fit was similar
(Table 5, Model 1b).
Table 6
Results of logistic regression models examining the prediction of any problem reported on
the VANOCSS Overall Coordination and Specialty Access subscales by the PCAS, PCAT and
CPCI subscale scores. Odds ratios show the decrease in the likelihood of reporting any problem
associated with each unit increase in the assessment of management continuity.8

| Subscale | Odds ratio with normalized score | Odds ratio with raw score | Nagelkerke’s R²9 | Hosmer and Lemeshow p-value10 |
|---|---|---|---|---|
| VANOCSS Overall Coordination | | | | |
| PCAS Integration | 0.46 (0.33-0.63) | 0.21 (0.11-0.39) | 0.29 | 0.11 |
| PCAT Coordination | 0.75 (0.58-0.96) | 0.38 (0.16-0.88) | 0.06 | 0.93 |
| CPCI Coordination of Care | 0.71 (0.56-0.89) | 0.50 (0.31-0.80) | 0.08 | 0.05 |
| VANOCSS Specialty Provider Access | | | | |
| PCAS Integration | 0.74 (0.61-0.90) | 0.55 (0.37-0.82) | 0.09 | 0.09 |
| PCAT Coordination | 0.91 (0.77-1.06) | 0.72 (0.42-1.22) | 0.01 | 0.15 |
| CPCI Coordination of Care | 0.89 (0.74-1.08) | 0.80 (0.55-1.16) | 0.01 | 0.03 |

8 Each OR was calculated in a separate regression model. The normalized OR refers to subscale scores normalized from 0 to 10.
9 R² is interpreted as a reflection of outcome variance explained by a variable in the model.
10 Goodness of model fit; values <0.05 indicate poor fit.
Exploratory factor analysis with principal components analysis suggested two underlying factors.
The first (eigenvalue = 6.98) seemed to capture coordination actions: observed behaviours of
primary care physician related to the transition of patient care to other providers. It consisted of
the PCAS Integration scale and two PCAT items (PT_c2, PT_c3). Another PCAT item,
discussion of different referral options (PT_c1), had ambiguous factor loadings (0.37, 0.38) and
was placed with coordination actions based on content similarity with other items.
The second factor (eigenvalue = 1.20) suggested provider efforts to produce coherence between visits within and outside the regular doctor’s office; it consisted of the CPCI Coordination of Care items and the remaining PCAT item (PT_c4) on discussing the specialist visit with the patient. This dimension is best exemplified by the CPCI item “this doctor keeps track of all my care” (CP_coo3), which has the most discriminatory capacity for this factor (a = 4.01). The tracking extends to care within primary healthcare (CP_coo1, CP_coo4) and to communication with other providers (CP_coo5).
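The eigenvalue step of this kind of extraction can be sketched as follows. The correlation matrix is purely illustrative (the study's item correlations are not reproduced here): two blocks of three items, strongly correlated within blocks and weakly across them, so that two eigenvalues exceed 1 under the Kaiser criterion, analogous to the reported 6.98 and 1.20.

```python
import numpy as np

# Hypothetical correlation matrix for six pooled items: two blocks of
# three, correlated 0.7 within blocks and 0.2 across blocks. This is
# constructed for illustration, not taken from the study's data.
R = np.array([
    [1.0, 0.7, 0.7, 0.2, 0.2, 0.2],
    [0.7, 1.0, 0.7, 0.2, 0.2, 0.2],
    [0.7, 0.7, 1.0, 0.2, 0.2, 0.2],
    [0.2, 0.2, 0.2, 1.0, 0.7, 0.7],
    [0.2, 0.2, 0.2, 0.7, 1.0, 0.7],
    [0.2, 0.2, 0.2, 0.7, 0.7, 1.0],
])

# Principal components are eigenvectors of R; the Kaiser criterion
# retains components with eigenvalue > 1.
eigenvalues = np.sort(np.linalg.eigvalsh(R))[::-1]
n_factors = int((eigenvalues > 1.0).sum())
print(eigenvalues.round(2))  # two eigenvalues exceed 1 (3.0 and 1.8)
print(n_factors)             # 2
```

The block structure of the matrix is what produces a dominant general factor plus a second retained factor, mirroring the two-dimensional result described above.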
Table 5 presents summary results of the confirmatory factor analysis models testing different factor structures, with and without the VANOCSS subscales. As expected, models with more than one factor fit better than the one-dimensional model, though none demonstrated convincingly good fit. The best was a second-order model in which PCAS, PCAT and CPCI items were associated with their parent subscales, which were in turn associated with a latent variable presumed to be management continuity (Table 5, Model 3). The difference in chi-square between this model and the one-dimensional model (χ² = 1042.85 − 628.16 = 414.69, df = 4, p < 0.001) shows significant improvement. Figure 2 shows this model with the addition of the two VANOCSS subscales, each entered as a single item summing the number of problems.
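The nested-model comparison can be reproduced in a few lines. The survival function below uses the closed form valid for even degrees of freedom, which covers the df = 4 test reported here; it is a sketch of the test statistic, not the structural-equation estimation itself.

```python
import math

def chi2_sf(x, df):
    """Chi-square survival function P(X > x), using the closed form
    valid for even df: exp(-x/2) * sum_{k=0}^{df/2 - 1} (x/2)^k / k!."""
    if df % 2 != 0 or df <= 0:
        raise ValueError("closed form requires even, positive df")
    term, total = 1.0, 1.0
    for k in range(1, df // 2):
        term *= (x / 2) / k
        total += term
    return math.exp(-x / 2) * total

# Difference test from the report: one-dimensional model (chi2 = 1042.85)
# vs. the second-order model (chi2 = 628.16), with a difference of 4 df.
delta_chi2 = 1042.85 - 628.16   # 414.69
p_value = chi2_sf(delta_chi2, 4)
print(f"delta chi2 = {delta_chi2:.2f}, p = {p_value:.2e}")
```

A difference this large relative to 4 degrees of freedom yields a vanishingly small p-value, consistent with the reported p < 0.001.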
Figure 2. Parameter estimates for the structural equation model showing loadings of items on parent subscales (first-order variables), which in turn load on management continuity (second-order latent variable). There is modest improvement compared to the unidimensional model (difference in chi-square of the models: χ² = 1042.85 − 628.16 = 414.69, df = 4, p < 0.001) but persisting high levels of residual error (shown to the right of the items), especially for the PCAT and CPCI subscales. The Root Mean Squared Error of Approximation (RMSEA) does not support good model fit.
The fit of the two-factor model with items grouped by coordination actions and coherence (Table 5, Model 4) was also significantly better than that of the one-dimensional model, but its RMSEA of 0.084 indicates a fit not as good as the model with items in their parent subscales. When the VANOCSS subscales were entered as single items summing the number of problems reported, the RMSEA deteriorated but the Comparative Fit Indices improved; the Overall Coordination subscale loaded weakly on coherence (-0.38) and the Specialist Access subscale on coordination actions, though more weakly still (-0.27). Figure 3 illustrates Model 3b.
Figure 3. Structural equation model showing loadings of items on the sub-dimensions of coordination actions and experienced coherence, shown to be correlated (curved line). There is modest improvement compared to the unidimensional model (difference in chi-square of the models: χ² = 1042.85 − 671.29 = 371.56, df = 1, p < 0.001) but persisting high levels of residual error (shown to the right of the items), especially for the PCAT and CPCI subscales. The Root Mean Squared Error of Approximation (RMSEA) does not support good model fit.
Predictive validity

Logistic regression modelling of the number of problems reported on the VANOCSS against the PCAS, PCAT and CPCI scores shows a statistically significant reduction in the likelihood of reporting a problem for every unit increase in each of these subscales. The PCAS Integration subscale has the largest effect, with an odds ratio of 0.46 for every unit increase in score relative to the lowest score; it accounts for 29% of the variance in reported problems. The effects are more modest for the other subscales, and for Specialist Access than for Overall Coordination. The logistic model shows poor goodness of fit for the CPCI.
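The analysis behind these odds ratios can be sketched with a minimal logistic regression fitted by Newton-Raphson. The data below are simulated, not the study's: the true effect is set so that each unit of normalized score roughly halves the odds of a problem, in the vicinity of the reported PCAS odds ratio of 0.46.

```python
import numpy as np

# Synthetic illustration of the predictive-validity analysis. Higher
# subscale scores are generated to lower the odds of reporting a
# coordination problem; the fitted odds ratio recovers that effect.
rng = np.random.default_rng(0)
n = 400
score = rng.uniform(0, 10, n)             # subscale normalized to 0-10
logit = 2.0 - 0.7 * score                 # true log-odds of a problem
problem = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit logistic regression (intercept + score) by Newton-Raphson.
X = np.column_stack([np.ones(n), score])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    gradient = X.T @ (problem - p)
    hessian = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hessian, gradient)

odds_ratio = float(np.exp(beta[1]))       # odds multiplier per unit of score
print(f"OR per unit increase in normalized score: {odds_ratio:.2f}")
```

An odds ratio below 1 per unit of score is exactly the pattern reported in the table: better-rated coordination predicts fewer reported problems.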
Individual item performance

Parametric IRT analysis shows that the positive skewing of the PCAS, PCAT and CPCI subscales results in diminished capacity to discriminate between degrees of above-average continuity, though the subscales are highly discriminating at below-average levels. The VANOCSS subscales, in contrast, are more discriminating for above-average continuity.
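This asymmetry follows from the item information function of the two-parameter logistic (2PL) model used in parametric IRT: an item's information peaks at its location parameter, so items located below the average trait level discriminate poorly above it. In the sketch below, only the discrimination a = 4.01 (reported for CP_coo3) comes from the study; the location b = -1 is an assumption for illustration.

```python
import math

def p_2pl(theta, a, b):
    """2PL item response function: probability of a positive response
    given latent trait theta, discrimination a and location b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item: a^2 * p * (1 - p).
    It peaks at theta = b; a large "a" makes the peak sharp."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

# A highly discriminating item located below the mean trait level
# (a = 4.01 as reported for CP_coo3; b = -1 assumed) is informative
# about below-average continuity and nearly uninformative above it.
for theta in (-2.0, -1.0, 0.0, 1.0):
    print(theta, round(item_information(theta, 4.01, -1.0), 4))
```

The information at theta = 1 is two units above the item's location and close to zero, which is why positively skewed rating items cannot separate good from very good continuity.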
Discussion
In this study we found that five validated subscales measuring patient assessments of care
coordination relate adequately to a common construct, which we presume to be management
continuity. Exploratory factor analysis suggests that two distinct factors or sub-dimensions
underlie this pool of items: coordination actions and coherence.
Coordination actions relate to physician behaviours to facilitate transition of patient care to other
providers, presumably to achieve timeliness and complementarity of services, though no
subscales directly addressed these qualities. The PCAS Integration subscale covers this
dimension, as do most of the items of the PCAT Coordination subscale. The PCAS Integration
subscale addresses this dimension most specifically and appears to be the most predictive of problem avoidance, as enumerated by the VANOCSS Specialist Access and Overall Coordination subscales. This is best exemplified by the most discriminating item for this dimension, the PCAS rating of the “regular doctor’s communication with specialists or other doctors” (PC_i4). Though we were not able to assess the fit of specific items directly, our analysis and the item content suggest that the VANOCSS Specialist Access subscale addresses this dimension.
The coherence dimension is highly correlated with coordination actions but seems to address the
provider’s effort, directed at the patient, to link different services and avoid gaps in care. This
refers to providers’ sense-making after a series of visits for a specific health condition and
planning for future care based on results of past visits. The primary care physician talking with
the patient about what happened at the specialist visit or special service (PCAT, PT_c4) thus
relates to coherence, even though the rest of the subscale mostly elicits coordination actions. The
VANOCSS Overall Coordination subscale loads on this dimension, though weakly, possibly
because it assesses system-wide gaps, not just those within PHC.
These subscales’ reference and measurement approaches show important differences. Three
(PCAS, PCAT and CPCI) focus specifically on the PHC physician and elicit positive
coordination behaviours using Likert response scales; these can be summarized as a continuous
measure. Although the PCAT and CPCI subscales do not use a satisfaction rating approach per
se, they behave similarly to the PCAS satisfaction scale in eliciting predominantly positive
ratings of care, a feature common to rating scales (Williams et al. 1998). In contrast, the two
VANOCSS subscales elicit experienced difficulties across all providers and are scored
dichotomously to indicate the presence of any problem, from which evaluators infer the degree
of coordination or specialist access. This is the measurement approach used by the Picker
Institute (Gerteis et al. 1993), expressly increasing sensitivity to problems in order to guide
improvement efforts. The IRT analysis suggests that subscales could be used in combination to
reliably identify persons with poor management continuity (poor PCAS Integration rating scores
with more than one problem on the VANOCSS Overall Coordination subscale) or good
management continuity (high scores with no problems).
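That combined screening rule can be made concrete as a small decision function. The cut-points in this sketch are hypothetical, since the study does not propose thresholds.

```python
def classify_continuity(pcas_integration, n_problems,
                        low_cut=40, high_cut=80):
    """Combine a rating subscale (0-100, higher = better) with the
    VANOCSS problem count, following the rule sketched in the text.
    The cut-points are hypothetical, not taken from the study."""
    if pcas_integration < low_cut and n_problems > 1:
        return "poor management continuity"
    if pcas_integration >= high_cut and n_problems == 0:
        return "good management continuity"
    return "indeterminate"

print(classify_continuity(30, 3))  # poor management continuity
print(classify_continuity(90, 0))  # good management continuity
print(classify_continuity(60, 1))  # indeterminate
```

Requiring agreement between the two measurement approaches (a low rating and multiple reported problems) is what makes the classification reliable at the extremes while leaving the middle indeterminate.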
A key question is whether the underlying construct for these subscales is actually management
continuity, especially since the subscales were conceived originally to measure coordination of
care or specialist access. Management continuity as a characteristic of care was proposed in 2001
in an endeavour to clarify and harmonize the different meanings of continuity of care. The
literature review that gave rise to this label recognized that, in disease management and nursing
care, continuity refers to the linking of care provided by different providers (Reid et al. 2002), a
notion recognized in family medicine and general practice as coordination. Patients experience
coordination as management continuity. In the consensus process on operational definitions of
PHC attributes that preceded this study (Haggerty et al. 2007), a further distinction was made
that providers are the best data source for assessing attributes related to coordination
(multidisciplinary teamwork, system integration, informational continuity) and patients, for
management continuity. It is questionable, however, whether patients are the best source for
assessing timeliness and complementarity of services, which would seem best assessed by health
professionals, who can make judgments based on clinical guidelines.
Qualitative studies of patients’ experiences with care received from different providers suggest that patients are often unaware of critical aspects of coordination, such as agreed-upon care plans or information transfers between providers (Gallagher and Hodge 1999; Woodward et al. 2004).
They presume these elements are in place and can only detect failures or gaps. Thus, they can
more validly assess discontinuity than continuity, supporting the VANOCSS approach as a potentially more accurate assessment of management continuity.
Rather than perceiving connectedness or smoothness, patients in qualitative studies are more
likely to express their experience of management continuity as a sense of security and of being
taken care of (Burkey et al. 1997; Kai and Crosland 2001; Kroll and Neri 2003). The French
term for continuity of care, prise-en-charge, captures this notion but seems to have no English
equivalent. The term implies the presence of a provider who takes responsibility for making sure
all required care is provided, and is most closely captured by the models of case managers in
mental health care or patient navigators in cancer care (Wells et al. 2008). Indeed, qualitative
studies and this one show that management continuity is entwined with relational continuity, as
seen by the strong correlation between subscales assessing these attributes. As measures of
relational continuity or interpersonal communication with the primary care physician increase,
the number of reported coordination problems between all providers decreases.
This study has several limitations. The most striking is that it considers together instruments that use different response formats as well as two distinct approaches to measurement. The confirmatory factor analysis shows, not surprisingly, that the best model is the one in which items are associated with their own parent instruments. Nonetheless, the distinct approaches also create a unique
opportunity to compare different formats and provide new information on how well assessments
of coordination behaviours predict reported problems. Another limitation is the loss in sample
size incurred through excluding observations with missing values. This compromises statistical
power to assess construct validity and may lead to a biased sample. However, when we imputed missing values based on the profile of available information, the statistical significance of all tests and models did indeed increase, but the directions and magnitudes of effects were unchanged, giving us confidence in our conclusions.
In the past decade, various instruments have been developed to measure management continuity
from the patient’s viewpoint. To date, all relate to specific contexts such as discharge from
hospital (Coleman et al. 2005; Hadjistavropoulos et al. 2008; Kowalyk et al. 2004) or to
individual health conditions such as mental health (Burns et al. 2007; Cockerill et al. 2006;
Durbin et al. 2004; Ware et al. 2003), cardiac care (Kowalyk et al. 2004), diabetes (Dolovich et
al. 2004; Gulliford et al. 2006), or cancer care (King et al. 2006). The instruments addressed in
this study are promising as generic measures that could be applied in PHC. We found that
challenges remain for measuring management continuity and consequently for evaluating the
impact of interventions to improve service integration and provider coordination. For a complete
and valid portrait, assessments from both provider and patient perspectives are needed, with
providers assessing information transfer and coordination between providers and patients
assessing their primary care physician’s coordination actions and occurrences of discontinuity.
Reference List
Borowsky, S.J., D.B. Nelson, J.C. Fortney, A.N. Hedeen, J.L. Bradley and M.K. Chapko. 2002. “VA Community-based Outpatient Clinics: Performance Measures Based on Patient Perceptions of Care.” Medical Care 40(7): 578-86.
Burkey, Y., M. Black and H. Reeve. 1997. “Patients' Views on Their Discharge from Follow Up
in Outpatient Clinics: Qualitative Study.” British Medical Journal 315(7116): 1138-41.
Burns, T., J. Catty, S. Clement, K. Harvey, I. Rees Jones, S. McLaren et al. 2007.
Experiences of Continuity of Care and Health and Social Outcomes: The ECHO Study. London,
UK: National Coordinating Centre for the Service Delivery and Organisation (NCCSDO)
Research Programme.
Cockerill, R., S. Jaglal, L. Lemieux Charles, L. Chambers, K. Brazil and C. Cohen. 2006.
“Components of Coordinated Care: A New Instrument to Assess Caregivers’ and Care
Recipients’ Experiences with Networks of Dementia Care.” Dementia 5: 51-66.
Coleman, E.A., E. Mahoney and C. Parry. 2005. “Assessing the Quality of Preparation for
Posthospital Care from the Patient's Perspective: The Care Transitions Measure.” Medical Care
43: 246-55.
Dolovich, L.R., K.M. Nair, D.K. Ciliska, H.N. Lee, S. Birch, A. Gafni and D.L. Hunt. 2004.
“The Diabetes Continuity of Care Scale: The Development and Initial Evaluation of a
Questionnaire That Measures Continuity of Care from the Patient Perspective.” Health & Social
Care in the Community 12: 475-87.
Du Toit, M. 2003. IRT from SSI: Bilog-mg, Multilog, Parscale, Testfact. Lincolnwood, IL:
Scientific Software International, Inc.
Durbin, J., P. Goering, D.L. Streiner and G. Pink. 2004. “Continuity of Care: Validation of a
New Self-report Measure for Individuals Using Mental Health Services.” Journal of Behavioral
Health Services & Research 31: 279-96.
Flocke, S. 1997. “Measuring Attributes of Primary Care: Development of a New Instrument.”
Journal of Family Practice 45(1): 64-74.
Flora, D.B. and P.J. Curran. 2004. “An Empirical Evaluation of Alternative Methods of
Estimation for Confirmatory Factor Analysis With Ordinal Data.” Psychological Methods 9(4):
466-91.
Gallagher, E.M. and G. Hodge. 1999. “The Forgotten Stakeholders: Seniors’ Values Concerning
Their Health Care.” International Journal of Health Care Quality Assurance 12(3): 79-87.
Gerteis, M., S. Edgman-Levitan, J. Daley and T.L. Delbanco. 1993. Through the Patient's Eyes:
Understanding and Promoting Patient-Centered Care. San Francisco, CA: Jossey-Bass
Publishers.
Gulliford, M.C., S. Naithani and M. Morgan. 2006. “Measuring Continuity of Care in Diabetes
Mellitus: An Experience-based Measure.” Annals of Family Medicine 4: 548-55.
Hadjistavropoulos, H., H. Biem, D. Sharpe, M. Bourgault-Fagnou and J. Janzen. 2008. “Patient
Perceptions of Hospital Discharge: Reliability and Validity of a Patient Continuity of Care
Questionnaire.” International Journal for Quality in Health Care 20: 314-23.
Haggerty, J., F. Burge, M.-D. Beaulieu, R. Pineault, C. Beaulieu, J.-F. Lévesque and D. Santor. 2011. “Validation of Instruments to Evaluate Primary Health Care from the Patient Perspective: Overview of the Method.” Healthcare Policy 7 (Special Issue): 31-46.
Haggerty, J., F. Burge, J.-F. Lévesque, D. Gass, R. Pineault, M.-D. Beaulieu, et al. 2007.
“Operational Definitions of Attributes of Primary Health Care: Consensus Among Canadian
Experts.” Annals of Family Medicine 5: 336-44.
Haggerty, J.L., R.J. Reid, G.K. Freeman, B.H. Starfield, C.E. Adair and R. McKendry. 2003.
“Continuity of Care: A Multidisciplinary Review.” British Medical Journal 327: 1219-21.
Jöreskog, K.G. and D. Sörbom. 1996. LISREL 8: User's Reference Guide. Chicago, IL: Scientific
Software International, Inc.
Kai, J. and A. Crosland. 2001. “Perspectives of People with Enduring Mental Ill Health from a
Community-based Qualitative Study.” British Journal of General Practice, 51: 730-6.
King, M., L. Jones and I. Nazareth. 2006. Concern and Continuity in the Care of Cancer Patients and Their Carers: A Multi-method Approach to Enlightened Management. London, UK: National Coordinating Centre for the Service Delivery and Organisation (NCCSDO) Research Programme.
Kowalyk, K.M., H.D. Hadjistavropoulos and H.J. Biem. 2004. “Measuring Continuity of Care
for Cardiac Patients: Development of a Patient Self-report Questionnaire.” Canadian Journal of
Cardiology, 20: 205-12.
Kroll, T. and M.T. Neri. 2003. “Experiences with Care Co-ordination Among People with
Cerebral Palsy, Multiple Sclerosis, or Spinal Cord Injury.” Disability and Rehabilitation 25:
1106-14.
Lévesque, J.-F., J. Haggerty, F. Burge, G. Béninguissé, D. Gass, M.-D. Beaulieu, R. Pineault, D. Santor and C. Beaulieu. 2006. “Evaluating Quality of Primary Care: Mapping the Coverage of Attributes Among Validated Instruments.” Paper presented at the 34th Annual Meeting of the North American Primary Care Research Group (NAPCRG), Tucson, Arizona, October 15-18.
Reid, R., J. Haggerty and R. McKendry. 2002. Defusing the Confusion: Concepts and Measures
of Continuity of Care. Ottawa, ON: Canadian Health Services Research Foundation.
Safran, D.G., M. Kosinski, A.R. Tarlov, W.H. Rogers, D.A. Taira, N. Lieberman and J.E. Ware.
1998. “The Primary Care Assessment Survey: Tests of Data Quality and Measurement
Performance.” Medical Care 36(5): 728-39.
SAS Institute. 2003. SAS User's Guide: Statistics. (Version 9.1 ed.) Cary, NC: SAS Institute, Inc.
Shi, L., B. Starfield and J. Xu. 2001. “Validating the Adult Primary Care Assessment Tool.”
Journal of Family Practice 50(2): 161-71.
Shortell, S.M., R.R. Gillies, D.A. Anderson, K.M. Erickson and J.B. Mitchell. 1996. Remaking Health Care in America: Building Organized Delivery Systems. San Francisco, CA: Jossey-Bass
Publishers.
Ware, N.C., B. Dickey, T. Tugenberg and C.A. McHorney. 2003. “CONNECT: A Measure of
Continuity of Care in Mental Health Services.” Mental Health Services Research 5: 209-21.
Wells, K.J., T.A. Battaglia, D.J. Dudley, R. Garcia, A. Greene, E. Calhoun, et al. 2008. “Patient
Navigation: State of the Art or Is It Science?” Cancer 113: 1999-2010.
Williams B., J. Coyle and D. Healy. 1998. “The Meaning of Patient Satisfaction: An Explanation
of High Reported Levels.” Social Science and Medicine 47(9): 1351-59.
Woodward, C.A., J. Abelson, S. Tedford and B. Hutchison. 2004. “What Is Important to
Continuity in Home Care? Perspectives of Key Stakeholders.” Social Science and Medicine 58:
177-92.