
Page 1: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Going Beyond The Count:

Modifying Service Delivery Strategies Based on the Findings of the

Coconino County Continuum of Care Bi-Annual Homeless Survey

October 26th 2009

Welcome!

Page 2: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Initial Introductions:

• I prefer a relaxed atmosphere: please feel free to get up as needed; let me know if you get lost, can’t hear me, or need clarification. All questions are good questions.

• Interpretive dancing is permitted, as long as it doesn’t become disruptive…

• Brief Group Introductions: your name and organization/position.

• Show of Hands: how many of you are familiar with HUD-mandated street counting or actually participated in a street count effort?

Page 3: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Your Presenter: Michael Van Ness

• The coordinator of the survey-based homeless count effort we’ll be discussing, and the primary author of the evolving survey instrument.

• 3 years of experience doing homeless outreach in Northern Arizona with the Catholic Charities PATH Program (and its current program supervisor).

• A graduate student at Northern Arizona University in Applied Sociology.

• Vice Chair of the Flagstaff Shelter Services Board of Directors.

• Educational Orientation: B.A. in Anthropology (minor in Astronomy), with an A.A. in Mathematics.

Page 4: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

The Goal:

• Promote our survey-based homeless counting methodology in other regions of the State.

• Show how the results of this methodology can be used to improve service delivery.

• Discuss other benefits of our survey-based approach (programmatic benefits, data collection benefits, continuum-strengthening benefits).

• Provide an in-depth examination of our methodology so it can be easily replicated.

Page 5: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Overview:

• Brief History of Homeless Counting in Coconino County: Why did the survey-based homeless count evolve? What were the issues with the point-in-time method in our area?

• Our survey form: the evolution of the instrument over 5 homeless count efforts. We’ll examine the major changes.

• How we implement our methodology: the system and high points.

• Intermission: group activity – complete the survey.

• The advantages of our methodology: a contrast of point-in-time counting to surveying, and why anyone would want to coordinate this large an effort.

• The limitations of our methodology: challenges and potential issues with our survey instrument.

• The 2009 Winter Count of Homeless Families and Individuals: Results.

• How these results actually modified our service delivery, and how they should have.

• Conclusion: question and answer session (the best part) blended with brainstorming on the issue “How would the survey-based method function in my service area? Useful or Not? How So?”

Page 6: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 1: A Brief History of Homeless Counting in Coconino County

• Homeless count efforts in Coconino County are coordinated by the Continuum of Care, a Flagstaff-based homeless planning group.

• In 2005 the Flagstaff Catholic Charities regional office took on the responsibility of coordinating the Continuum.

• Historically, street counting of homeless populations in Coconino County has occurred in Flagstaff only (leaving out Williams, Page, N. Sedona, etc.).

• HUD mandates street count efforts at least every two years (usually the last week of January), but some communities (such as Flagstaff) conduct yearly count efforts.

Page 7: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

The 2007 Winter Street Count in Flagstaff: Preparations

• Took place the last week of January (1/30/07), as mandated by HUD.

• Coordinated by a Coconino County Continuum of Care subcommittee.

• Committee Meetings: only 2, a brainstorming session and an implementation meeting just before the actual count began.

• Issues with the meetings: brainstorming session ideas overlooked, low attendance, insufficient volunteers, poor preparation, little training for count participants, homeless definitions misunderstood, no attempt to innovate.

Page 8: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

The 2007 Winter Street Count in Flagstaff: Execution

• 6 Teams of 2-3 set out to drive assigned areas of Flagstaff. Most stayed in their cars.

• Weather = Blizzard! This impacted the count by limiting… [1] visibility, [2] the number of homeless people out on the streets, [3] the length of the count (from 3 hours to ~1 hour in most cases). The volunteers were amazing – it was dangerous even to try.

• Not all teams were qualified to identify chronically homeless individuals.

• Each team made check-mark tallies on a “count form” while driving their areas – there were many assumptions made and not all forms were collected.

• Results: 86 homeless men, 13 homeless women (99 single persons, of these 32 were assumed to be chronically homeless), and 1 family with 1 child + 2 adults for a grand total of 102 unsheltered persons.

Page 9: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

The 2007 Winter Street Count in Flagstaff: Issues

• Too small a window of time.

• Weather was too much of a factor, and it was dangerous.

• Inadequate training [in identifying cases, in definitions].

• Too many assumptions.

• While an attempt to prevent duplication was made, there were likely errors.

• Verification, accuracy, and replication issues.

• Volunteers had little investment in the process; there was a “just get it done” approach.

• Therefore, another methodology was needed!

Page 10: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 2: Our survey form

• I’ll be referring to pages 1-16 of your handout packet.

• There are 8 survey forms in your packet, representing 5 count periods:

[Count]              [Dates]      [Pages]   [Count effort notes]
2007 Summer          6/21-6/26    1-2       First survey-based count
2008 Winter          1/26-2/1     3-4       Second count effort
2008 Summer          8/4-8/10     5-6       Third effort – big changes
2008 Summer (jail)   8/4-8/10     7-8       Third effort – 1st with jail form
2009 Winter          1/27-2/2     9-10      Fourth count effort
2009 Winter (jail)   1/27-2/2     11-12     Fourth count effort
2009 Summer          8/11-8/17    13-14     Fifth count effort
2009 Summer (jail)   8/11-8/17    15-16     Fifth count effort

We’ll discuss their evolution in the next slide…

5 total 1-week count spans: that’s 35 days of counting!

Adapted from a simpler HUD doc.

Page 11: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Changes over time in the survey instrument

• During each count, I made notes on which questions were clumsy to ask, difficult to answer – or just didn’t work. For example, question #9 from the 2007 Summer Count [page 1 in your packet] tended to offend people, and the family questions 18-20 were confusing. These were edited in the next edition of the form.

• Over time, more questions were squeezed into the form, and space became an issue.

• Certain useful questions, such as #3 on the 2007 Summer Count form (used for unduplication of results) and the chronically homeless questions, persisted relatively unchanged.

• Many instrument issues were realized during data reduction: when open-ended responses were many and similar, new categories were added; when responses were confused or illogical, questions were re-tooled (a small sketch of this categorization step appears below).

• The 2008 Summer Count form (a major innovation) used checkboxes for the first time, which opened up space – homeless respondents seemed to prefer checking boxes in this edition to making Xs or checks on underscore-constructed lines (in previous editions). Here I expanded the “Why did you become homeless?”, “Which…are barriers to receiving services?”, and “From what sources do you get income?” questions.
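To make that data-reduction step concrete, here is a minimal Python sketch of collapsing similar open-ended answers into categories. The mapping and answers below are purely illustrative – they are not the coding scheme we actually used.

```python
# Minimal data-reduction sketch: collapse similar open-ended answers into
# categories. The mapping below is illustrative only, not our actual codes.
OPEN_ENDED_CATEGORIES = {
    "lost my job": "Job Loss",
    "laid off": "Job Loss",
    "evicted": "Eviction",
    "kicked out by my landlord": "Eviction",
}

def categorize(answer):
    """Map a free-text answer to a category, or flag it for manual review."""
    return OPEN_ENDED_CATEGORIES.get(answer.strip().lower(), "Other / review")

print(categorize("Laid off"))        # -> Job Loss
print(categorize("car broke down"))  # -> Other / review
```

Anything landing in the review bucket is a candidate for a new checkbox category on the next edition of the form.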

Page 12: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Changes over time in the survey instrument 2

• Major paradigm shift: Consider question #4 on the 2007 Summer Count form [handout page 1]. Back then, we were trying to get survey forms only to homeless respondents – we didn’t want to complete the data entry for “useless” forms, and question #4 asks non-homeless respondents to stop filling out the form.

• Quite by accident, this turned out to be a “rich point” [see Michael Agar’s Language Shock, 1994: 256] which led to several innovations:

1. We realized that “homeless” is too semantically charged to get an accurate Yes or No response [elaborate]. Some people who feel like they are not homeless actually do fall under the HUD definition, and we wanted their data.

2. People continued to fill out the form, in some cases, when they didn’t seem to be homeless given their response to question #1. However, they were reporting answers to questions 6, 15, and 16 that suggested they were homeless. It turned out they were reporting on past episodes of homelessness.

3. Furthermore, we realized that we could compare homeless to non-homeless data sets, as they were using the same services (at the agency distributing our forms)

4. Thus, we eliminated the “stop here” element, and by the 2008 Summer Count we had added an “over the last 90 days…” question.

Page 13: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Changes over time in the survey instrument 3

• Realization: Going into the 2007 count process, I had an overly rigid definition (and understanding) of homelessness. Eventually, I let the respondent’s answer to question 1, “Where did you sleep (or stay) last night?”, define their homeless status, rather than stressing over any kind of “Are you homeless?” response. Over time, more categories have been added to Q1 as more living situations are realized.

• Initial attempts to record (and code) the race/ethnicity question were failures: we found it was best (and more aligned with our respondents’ thinking) to include “Hispanic” in the racial background section rather than separating it out. We also eliminated racial/ethnic combinations, like “Asian & White”, and just let respondents choose more than one category.

• By the 2008 Summer count form [handout page 5] we expanded the military service question to include “When?” and “In Combat?” as we suspected these were relevant to ask…

• The “unduplication question” [#3 from 2007, #7 by Summer 2008] drifted down the page – a deliberate strategy to invest the client in the form before hitting them with the most (?) sensitive question. (A rough sketch of how such a question supports unduplication follows below.)
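As a rough illustration of how the unduplication question feeds the matching process, here is a minimal Python sketch. The field names (“initials”, “birth_year”) are hypothetical stand-ins for whatever the question actually collects; real matching still ends with a human comparing the flagged forms.

```python
# Minimal unduplication sketch. The field names ("initials", "birth_year")
# are hypothetical stand-ins for whatever the unduplication question collects.
from collections import defaultdict

def unduplication_key(form):
    """Build a rough matching key from a form's unduplication answers."""
    return (form.get("initials", "").strip().upper(), form.get("birth_year"))

def possible_duplicates(forms):
    """Group forms that share a key so a reviewer can compare them by hand
    (handwriting, other answers, collection site) before final tallying."""
    groups = defaultdict(list)
    for form in forms:
        groups[unduplication_key(form)].append(form)
    return {key: group for key, group in groups.items() if len(group) > 1}

# Made-up example forms:
forms = [
    {"initials": "mv", "birth_year": 1975, "site": "Flagstaff"},
    {"initials": "MV", "birth_year": 1975, "site": "Williams"},
    {"initials": "JS", "birth_year": 1982, "site": "Page"},
]
print(possible_duplicates(forms))  # the two "MV"/1975 forms need human review
```

Grouping by a loose key only narrows the field; handwriting and the other answers decide whether two forms really came from the same person.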

Page 14: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Changes over time in the survey instrument 4

• The advent of the “jail edition” during the 2008 Summer Count was the result of a strong partnership with the Coconino County Exodus Program: they were willing to offer the forms to prisoners in the CCJ, and using the usual form created issues. We saw an opportunity to ask whether or not being in jail forced them into homelessness, and where they would have stayed if they weren’t in jail (and with whom). This introduced a hypothetical element into the dataset that must be considered.

• The 2009 Winter Count introduced another significant element [see page 9 of the handout packet]: the “Is this location substandard, significantly dilapidated, or lacking in necessities (plumbing, electricity, etc.)?” question. This was an idea from HUD, but also addressed issues involving local conditions [rt. 66 motel living, conditions on local reservations].

• The next 19 slides go into deeper detail on the evolution of our forms…

• Just Kidding!!! Let’s move on to how we actually conduct the survey every 6 months, which is more fun..

Page 15: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 3: How we implement our methodology

[Slide: a flowchart of the count planning cycle. The cycle begins and ends with the Coconino County Continuum of Care and moves through these stages:]

• Update instrument, resource lists, invitation letter.
• C.o.C. Subcommittee & Planning Meetings: Adjustments?
• Promote the Count: Agency Contacts [e-mail & visit].
• Record partnerships, tasks, climates + gatekeepers.
• Recruit volunteers to staff locations as needed.
• Gather & distribute materials: forms, etc.
• Assign or record agency-monitoring assignments.
• 1 Week Count Period.
• Review Challenges: the “check back” stage.

Callouts: The count process resists flowcharting – this is a basic, dynamic planning wheel. Record everything!!! See handout page #17 for the invitation letter…

Page 16: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Implementation Highlights

• The Catholic Charities PATH (Projects for Assistance in Transition from Homelessness) Outreach Team Supervisor coordinates the count effort each Winter and Summer (accounting for seasonal movements).

• We cover as much of the county as possible, focusing on Flagstaff, Williams, and Page.

• Strategy: distribute survey forms during outreach and at social service agency locations.

• We avoid: [1] Singling homeless clients out when distributing surveys in agency areas, [2] putting surveys in a box that says “homeless survey, please complete!”, [3] Outreach dissemination in overtly public areas, like public transit, [4] Denying PATH services/resources to non-participants.

• We embrace: [1] Handing out updated resource lists to respondents, [2] Treating survey encounters like outreach contacts, [3] Upfront discussions of why we conduct the homeless count and why anyone would want to fill out a survey, [4] Getting the results back to the agency partners and offering them to the respondents, [5] a willingness to help respondents complete (or read, understand, etc.) our surveys.

Page 17: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Additional Implementation Highlights

• We recruit agencies serving potentially homeless respondents by offering to share the count results with them: we argue that our findings are most useful for grant writing purposes, particularly the “describe the population you serve” section. Recruitment always involves an invitation to the C.o.C. [and we help it grow].

• The timing of the agency invitation and follow-up is crucial: some agencies need weeks’ notice, while in other cases allowing too much time to pass between the invite and implementation leads to non-participation (as people forget, go on vacation, etc.).

• Many agencies want to participate, but have inadequate staff – in these cases we use volunteers to distribute forms, usually from set-up tables. Volunteers typically accompany at least one PATH team member (this cuts down on the need for advanced training). Examples: Williams Food Bank, Think Jesus Project.

• In some cases, agencies participate by filling out survey forms for the clients they serve using their service records and notes. We emphasize that agents only complete the forms based on information they know for sure. Examples: WUSD Homeless Liaison, Flagstaff Salvation Army.

Page 18: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 4: Group Activity

• At this point, we’re going to familiarize ourselves with the current survey instrument. Please complete the green 2009 Summer Count survey form as either yourself, a made-up character, a client you’ve worked with, etc. We’re NOT going to collect these, and please don’t record sensitive information – this is for your familiarity and fun only!

• Form-completion questions are welcome!

• Reminder: interpretive dance.

Page 19: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 5: The advantages of our methodology

At this point, I’m going to compare the point-in-time method of homeless counting to our survey-based methodology. Note: I have nothing but the utmost respect for the many volunteers throughout America who do point-in-time counting.

First, our approach involves less direct assumption of homeless (and chronically homeless) status. This leads to less “researcher-bias.”

Our survey, packed with questions, can uncover a WEALTH of information. Contrast this with the standard homeless count form.

Our count effort spans 1 week of time, allowing complex cycles of homelessness to be captured. If someone isn’t “visible” during the point-in-time method, or isn’t quite “homeless” (yet), they won’t be counted.

Our methodology is FAR less affected by weather and other systemic events (homeless “sweeps” by law enforcement for example).

Page 20: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Additional advantages of our methodology

Our results are more verifiably unduplicated than the “grid-coverage” approach of point-in-time methods. We can spend time reviewing survey forms, examining handwriting – using multiple lines of evidence while unduplicating results.

Contrasted with point-in-time methods like the earlier example – where people drive around in cars making check marks on forms without interviewing – our methodology is more interactive and more potentially beneficial to the respondent (they can gain services, referrals, information, and ultimately the results).

Our methodology is more transparent to the participants.

We can cover larger geographic areas with week-long surveying than with point-in-time methods. In Coconino County (with its 18,661 square miles, making it larger than each of the nine smallest states – thank you, Wikipedia) this matters!

The results of our methodology better support grant writing and deeper understandings of homelessness (such as through multi-dimensional analyses seeking correlations)…

Page 21: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 6: The limitations of our methodology

We’re not randomly sampling, and we can’t make valid estimates of “the total homeless population” – our results are “at least” totals.

Typically, respondents in “homeless” living situations tend to report more than one answer to Q1: “Where did you sleep (or stay) last night?” On one hand, this reflects their behavior, but these multiple-responses can be hard to interpret.

People tend to answer Q14: “Did you become homeless in Arizona?” (as well as Q16) from the “ever in their lives” perspective. While this is interesting, it can be confusing.

Respondents tend to gloss over and estimate spans of time, such as “How long have you been staying there?” – this adds bias, particularly to the chronically homeless category.

Despite our efforts, agency participants will…[1] just put survey forms out on their front desk with a sign, [2] fail to help clients complete surveys, etc. This over-represents the data from agencies who use best-practices.

Page 22: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Additional limitations of our methodology

Whereas the point-in-time method is quick to implement and produces rapid results, our surveys need to be entered into databases and analyzed – this takes a lot of time and effort.

The point-in-time method is potentially more ethically sound, simply because our surveys ask for and record sensitive information (like disability status, deviant behaviors, and so forth).

From the traditional sociological perspective, our surveys have a variety of potential flaws: I’ll leave this as an exercise for the audience to consider.

Thus far, our survey forms have only appeared in English.

There are inherent issues with “self reporting” that add bias to our results: To what extent do clients report ideals and perform for the surveyor? To what extent do they avoid reporting behaviors or situations?

I challenge you to think of additional limitations to discuss at the end of the presentation (which we are quickly moving toward).

Page 23: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 7: The 2009 Winter Count Results

27 Count Participants Reported Data:

• Azpire House
• Catholic Charities (2 offices)
• Circle of Page (and St. David’s Church)
• Community Behavioral Health Services
• CCCS (Flagstaff, Williams, Page)
• CCJ Exodus Program
• DNA People’s Legal Services
• Flagstaff Family Food Center
• Flagstaff Medical Center
• Flagstaff Police Department
• Flagstaff Shelter Services
• Goodwill Industries
• Hope Cottage
• North Country Health Center
• Page Regional Domestic Violence Shelter (PRDVS)
• Page United Methodist Church
• Royal Inn and Inn Transitions
• St. Mary’s Food Bank
• St. Vincent de Paul Society
• Sunshine Rescue Mission
• Think Jesus Project
• WUSD Homeless Liaison

Page 24: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

2009 Winter Count Basic Results

We collected 526 survey forms. One set of full results appears at the end of your handout packet [after page #17]. As this lists out the response totals by question, it is a large set of information. I’ll summarize the significant findings…

Unsheltered Individuals: 72 men + 23 women = 95 total. Of these, 34 identified as chronically homeless, 23 reported being SMI, 43 reported substance abuse, 13 reported DV histories, and 17 reported veteran status.

Unsheltered Families: There were 17 without children (total of 37 people); 16 with children (47 adults + 31 kids = 78 additional people).

Additionally, there were 233 individuals “Doubled Up” and 150 more people living in “substandard” conditions.

Thus, counting all situations of homelessness mentioned above, we counted 593 individuals.
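As a quick arithmetic check, the 593 figure is simply the sum of the categories reported above; here is a minimal sketch using only the numbers on this slide.

```python
# Quick arithmetic check of the 2009 Winter Count totals reported above.
unsheltered_individuals = 72 + 23        # men + women = 95
family_members_no_children = 37          # people in the 17 families without children
family_members_with_children = 47 + 31   # adults + kids in the 16 families = 78
doubled_up = 233
substandard_conditions = 150

total_counted = (unsheltered_individuals
                 + family_members_no_children
                 + family_members_with_children
                 + doubled_up
                 + substandard_conditions)
print(total_counted)  # -> 593 individuals across all situations of homelessness
```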

Page 25: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Unsheltered Respondents’ Answers to Q18:

Question 18 "Barriers to Service": 2009 Winter Count

Citizenship Issues2%

Lack of ID5% Lack of

Information6%

Lack of Transportation

13%

The hours I w ork2%

Alcohol Use9%

Legal Trouble9%

Religious Orientation1%

Sexual Orientation0%

Child Care Issues1%

Mental Health Issues5%

Hygiene Issues3%

Physical Disability5%

Bad Previous Experiences w ith Social Service

Agencies3%

Language or Education4%

Drug Use3%

Racial/Ethnic Discrimination4%

Lack of Available Services6%

Agencies Help Once per Year6%

Ashamed, Embarrassed, or Afraid

6%

I Do Not Encounter Barriers3%

Other4%
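For reference, here is a minimal Python sketch of how checked “barrier” boxes get tallied into a percentage breakdown like the one above. The example responses are invented; only the category labels come from the form, and it assumes (as the percentages summing to 100% suggest) that shares are computed over all checked boxes rather than over respondents.

```python
# Tally checked "barrier" boxes into a percentage breakdown. Example
# responses are invented; only the category labels come from the form.
from collections import Counter

responses = [
    ["Lack of Transportation", "Lack of Information"],
    ["Alcohol Use", "Legal Trouble", "Lack of Transportation"],
    ["I Do Not Encounter Barriers"],
]

tally = Counter(barrier for checked in responses for barrier in checked)
total_checks = sum(tally.values())  # each respondent may check several boxes

for barrier, count in tally.most_common():
    print(f"{barrier}: {100 * count / total_checks:.0f}%")
```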

Page 26: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 8: Effects of results on PATH service delivery

Given that a lack of transportation among unsheltered homeless individuals (a significant pool of PATH clients) was the leading “barrier to services,” we took the following actions: [1] The previous count effort – the 2008 Summer Count – had also indicated that “lack of transportation” was the main barrier to service for respondents, which caused us to increase spending in our “client travel” line item (bus tickets and monthly bus passes); as this was still an issue, we decided to continue an aggressive level of funding for travel assistance. [2] We made an effort to transport clients in our program vehicles more than we had in previous quarters.

Since another leading “barrier” was lack of information: [1] PATH intensified efforts to distribute resource lists, [2] We promoted the distribution of resource lists during the next homeless count effort (and printed them for participating agencies), [3] PATH aggressively rooted-out “bad” resource listings wherever we found them.

Page 27: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Part 8+: How could we have modified service delivery?

Considering the level of detail captured in Q1: “Where did you sleep (or stay) last night?”, we could have reexamined our outreach coverage to verify that we cover the most frequently stated locations: “Drifting from place to place” was 2nd only to “On the street,” for example.

Since our PATH clients are necessarily SMI and homeless, we should have searched our database for respondents who reported mental illness and looked for correlations between reporting mental illness and reporting anything else. This might have led us to readdress priorities in regard to service delivery. (A rough sketch of such a cross-tabulation appears after these examples.)

While the unsheltered population reported 17 veterans, 9 of whom saw combat, far fewer reported receiving VA benefits. Thus, we should have focused more on connecting vets with benefits – possibly through new interagency partnerships.

These are but a few examples…
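As noted above, here is a minimal sketch of the kind of cross-tabulation we could have run, assuming the survey database were loaded into a pandas DataFrame with one boolean column per reported condition. The column names and values are hypothetical.

```python
# Cross-tabulation sketch, assuming the survey database is in a pandas
# DataFrame with one boolean column per reported condition (hypothetical
# column names and made-up values).
import pandas as pd

df = pd.DataFrame({
    "reports_mental_illness":  [True, True, False, True, False],
    "reports_substance_abuse": [True, False, False, True, True],
    "lacks_transportation":    [True, True, True, False, False],
})

# Compare rates of other reported items for respondents who do and do not
# report mental illness.
for column in ["reports_substance_abuse", "lacks_transportation"]:
    rates = pd.crosstab(df["reports_mental_illness"], df[column],
                        normalize="index")
    print(f"\n{column} by reported mental illness:\n{rates}")
```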

Page 28: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Conclusions:

Our survey-based methodology is difficult to implement, but well worth the effort.

We demonstrated that our methodology can improve service delivery.

Coordinating (or participating in) a survey-based count yields a wide range of program-strengthening benefits.

You now have the tools to replicate a survey-based count if you decide it would be of benefit.

Page 29: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Thank You!

And to our funding partners!

www.flagstafffoundation.org

http://uesaz.com/

www.narbha.org/

Page 30: Going Beyond The Count: Modifying Service Delivery Strategies  Based on the Findings of the

Question/Answer and Brainstorming:

“How would the survey-based method function in my service area? Useful or Not? How So?”