February 2014
Alternatives to Detention in Orleans Parish
An Evaluation Study
Deborah Feldman, Toucan Research
Table of Contents

Section  Page
I. Introduction 1
II. Intake and Assignment to an ATD 3
III. Alternatives to Detention: An Overview of Programs 7
IV. Electronic Monitoring 10
V. Evening Reporting Center 13
VI. Orleans Detention Alternatives Program 15
VII. AnyTrax 17
VIII. Overarching Themes 18
IX. Systems Recommendations 23
Appendices A-C 26
An Evaluation of Alternatives to Detention in Orleans Parish

I. Introduction

This report outlines findings from a recent evaluation of the Orleans Parish Juvenile Detention Alternatives Initiative (JDAI), an initiative of the Annie E. Casey Foundation (AECF) adopted by the Parish in 2008. The evaluation was conducted by an independent research consultant who worked with Parish staff to obtain a broad range of data on the program in November and December 2013. The main focus of the evaluation was the Parish's several Alternatives to Detention (ATD) programs, which are a central component of Orleans Parish JDAI. In addition, because intake procedures play such a pivotal role in determining who enters an ATD, the intake process is also discussed.
A. The JDAI Management Context

The Orleans Parish Juvenile Court is a key player within JDAI, operating the intake unit and three of the four ATDs subsumed under JDAI and discussed in the evaluation. The Juvenile Court also houses the JDAI Coordinator, who is responsible for monitoring and coordinating multiple aspects of JDAI, including evaluating and reporting on ATD programs.
Another important aspect of OP’s JDAI program is its collaborative structure, which many stakeholders interviewed for the evaluation saw as having been reinvigorated in 2013. Although the Juvenile Court must approve major policy and programmatic changes, the larger JDAI Collaborative also plays a role in raising issues, shaping discussions and promoting policy initiatives. The Collaborative is an advisory body that meets once a month and includes a number of community representatives, as well as representatives from the courts, law enforcement, detention and the offices of the district attorney and indigent defense. Smaller topical work groups tackle specific issues and report to the full Collaborative, which then votes on matters raised by the work groups and makes relevant recommendations to the Court on JDAI policy and program matters. It is beyond the scope of the evaluation to examine the efficiency and effectiveness of these organizational structures and how they impact ATD programming. However, since the Collaborative and the ATD work group are referenced several times in this report, these terms are introduced here.
B. Study Methods and Limitations
The evaluation utilized four main types of data:

• Stakeholder Interviews: An evaluator conducted 13 confidential telephone interviews with a broad range of stakeholders, who were selected by the evaluation team. Interviews ranged in length from 45 to 70 minutes.
• Published and unpublished reports: The evaluation utilized five reports from 2011 through early 2013, made available to the evaluator by the JDAI coordinator or the Annie E. Casey Foundation. The reports provided activity summaries, issues under consideration and program data in various formats.
• Internal quarterly reports and program enrollment and termination logs: The evaluator examined quarterly "reports" available for three quarters in 2012 and the first quarter of 2013; these provided quarterly enrollment, termination, utilization and FTA/re-arrest numbers for
each ATD. The evaluator also was provided with some sample internal enrollment/termination logs for each ATD and internal monthly summaries of enrollment and termination.
• Internal program documents: These included policies and procedures, brochures and memos made available to the evaluator.
A separate Validation Study of the JDAI detention risk assessment tool was completed just prior to the evaluation. This study examined JDAI case processing and calculated failure rates for each ATD, using 2011 data. Findings from this companion study provide some additional context for this evaluation and are referred to in several sections.

Study Limitations

There were a number of important limitations to this study, including:
• Lack of access to reliable, current or recent data: Ideally, the evaluation would use a year's worth of recent data on individual ATD programs as the best reflection of the programs' current operations. However, because the JDAI analyst position had been vacant for some months, no data beyond the first quarter of 2013 were available. The evaluation therefore had to rely primarily on secondary sources that reported enrollment, termination, length of stay and other program operational and outcome measures.
• Not all stakeholder perspectives captured: Due to time and resource constraints, the evaluation team had to confine interviews to a manageable number of the stakeholders who appeared to be most directly involved with or knowledgeable about the ATDs. As a result, some stakeholder perspectives on the ATDs may not have been captured during data collection. Two key stakeholders (one directly involved in an ATD and one outside the Juvenile Court) declined to participate in an interview.
• Staff turnover: In the last two years, the JDAI effort has faced staff turnover in key positions. The data analyst left several months prior to the evaluation and the position was not filled. There was also turnover in the ERC Supervisor position in 2012. In addition, the critical position of JDAI Coordinator has been filled by at least three different people in recent years. The current JDAI Coordinator had been in her position only a few months and was not yet familiar with key program documents, personnel, and operations.
• Rapidly changing program and policy environment: During the evaluation, one of the three main programs ceased operations and a new program started up. The evaluator was unable to schedule an interview with the lead staff for either program, due to time constraints (these changes happened at the very end of the data collection period) and the unavailability of one key representative. Little written documentation on either program was available. In addition, many policies affecting intake and ATD operations appeared to be in flux. The lack of documentation meant the evaluator had to rely solely on what various stakeholders said about current policy and procedures, and their reports sometimes conflicted.
The evaluation attempted to meet these challenges by a) using multiple sources of data, b) noting conflicting sets of information where they occurred, c) identifying key themes that were most prominent across multiple stakeholder interviews, d) identifying stakeholder-based information that triangulated well with report or other document-based data and e) examining report data in a historical context, identifying patterns that held over time.
C. Report Organization

The remainder of this report consists of eight additional sections. The first six sections are devoted to describing and presenting findings on key ATD components within the Parish:

II. Intake and Assignment to an ATD
III. Alternatives to Detention: An Overview of Programs
IV. Electronic Monitoring
V. Evening Reporting Center
VI. Orleans Detention Alternatives Program
VII. AnyTrax

The final two sections move from specific findings to a broader view of the program. Section VIII pulls together several major cross-cutting themes that speak to larger, systemic issues affecting all ATDs; Section IX offers recommendations primarily derived from the preceding section.

II. Intake and Assignment to an ATD
A. The RAI
The OP Juvenile Court operates an Intake Center for juveniles who have been arrested; the Center does an initial assessment to determine whether a youth apprehended by law enforcement will be detained, released without condition or released to an Alternative to Detention (ATD) program. Intake is currently co-located at the New Orleans Police Department (NOPD) Juvenile Bureau, where youth are first booked and then taken to the Intake Specialist on duty. (The Intake Center is operated 24/7.) The Specialist administers a Risk Assessment Instrument (RAI), which consists of eight questions about the youth's current, pending and past offenses and related criminal history items.

RAI Cut-Off Scores

Policy regarding RAI cut-off scores has remained stable over time, according to various stakeholders interviewed for this evaluation. The cut-off scores on the RAI are as follows:

0-9 points  Release to a responsible adult without conditions
10-14  Eligible for placement in an ATD
15+ Detain pending judicial hearing
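For clarity, the three-way cut-off rule can be expressed as a short decision function. The sketch below is illustrative only; the function name and decision labels are hypothetical, and only the score bands come from the policy described above.

```python
# Illustrative sketch of the RAI cut-off rule; the function name and the
# decision labels are hypothetical; only the score bands come from policy.
def rai_decision(score):
    """Map a total RAI score to the intake decision it implies."""
    if score <= 9:
        return "release"   # 0-9: release to a responsible adult without conditions
    elif score <= 14:
        return "atd"       # 10-14: eligible for placement in an ATD
    else:
        return "detain"    # 15+: detain pending judicial hearing
```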
RAI Results

RAI outcomes for the two most recent program years are shown below. As would be expected, the largest portion of RAI outcomes falls into the "released" category (50-55 percent). Since ATDs are designed to serve medium-risk youth, one might expect ATD assignments to comprise the next largest group; however, only 10-12 percent of youth were scored as eligible for an alternative, with the large remainder (35-38 percent) assigned to detention.
Source: Louisiana Final JDAI Report, May 1, 2011-April 30, 2013.

RAI Overrides

As reported in the Validation Report that is a companion to this report, RAI overrides "up" during 2011 constituted 21.9 percent of all RAI outcomes; youth who were assessed as low or medium risk were moved up either to an alternative or to detention. This pattern of overriding the RAI score continued through 2012 and into 2013, contributing to higher percentages of youth being sent to detention instead of released or placed on an alternative. The JDAI Data Work Group did an analysis of overrides from January through August 2013, based on data extracted from the juvenile intake log. Their analysis found an override-up rate of 16 percent during this more recent time period.

Validity of the RAI

A second companion report on findings from the RAI Validation Study found the RAI, as currently constructed, to have weak predictive power in determining risk. That report recommended specific item and scoring changes to strengthen the tool. Please refer to it for more information regarding validity issues with the RAI.
B. Assignment to an ATD Program
Current Assignment Policy

At the time of the evaluation, assignment to an ATD was typically based on a youth's RAI score. Youth who score between 10 and 14 on the RAI are eligible for referral to an alternative. According to policy outlined in the program handbook provided to the evaluation team in December 2013 (Orleans Parish Juvenile Court Intake Policies and Procedures, undated), placement decisions made at intake are to be based on the youth's RAI score, as follows:
Lower range score (10-11)   Evening Reporting Center (ERC)
Middle range score (12-13)  Orleans Detention Alternative Program (ODAP)
Upper range score (14)      Either ERC or ODAP, plus electronic monitoring (EM)
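The score-based placement tiers can likewise be summarized as a simple mapping. As with the earlier sketch, this is a hypothetical restatement of the handbook's tiers, not an actual Parish system; the function name is invented.

```python
# Hypothetical restatement of the handbook's score-based ATD placement tiers;
# the function name is invented, and scores outside 10-14 are not ATD-eligible.
def atd_placement(score):
    if score in (10, 11):
        return "ERC"               # lower range: Evening Reporting Center
    if score in (12, 13):
        return "ODAP"              # middle range: Orleans Detention Alternative Program
    if score == 14:
        return "ERC or ODAP + EM"  # upper range: either program, plus electronic monitoring
    raise ValueError("score outside the ATD-eligible range (10-14)")
```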
RAI Outcomes by Period

Period     Detained   ATD   Released
5/11-4/12    38%      12%     50%
5/12-4/13    35%      10%     55%
At the end of December 2013 the evening reporting program was discontinued due to lack of funding and was replaced with a telephone-based self-reporting system called AnyTrax, which requires youth to call into an automated system. The above three programs and the new AnyTrax program are described in more detail in Section III. Note: Court staff have indicated that the above scoring and placement system has been in effect since August 2013 but is still pending formal judicial approval. However, other stakeholders, including intake staff, treat these policies as fully in force.
Stakeholder Perceptions of Intake

There is general agreement among stakeholders interviewed that the current system needs to be changed. When stakeholders familiar with the current intake system were asked to assess it, most rated the current system a "3" on a scale of 1-5; only one stakeholder felt the system was fine as is. At the same time, stakeholders who were aware of proposed changes to the assignment system (described below) agreed that the proposed changes were "moving in the right direction" to strengthen the assignment system. However, it is also important to note that not all informants, including some who were centrally involved in the JDAI Collaborative, were aware of or fully understood the significant changes being proposed to assignment at intake.

Recent Changes to Intake

Assignment practices have substantially changed within recent months, and additional changes are in the process of being approved and implemented. In early or mid-2013 the assignment system was revamped1 to the current approach. Previously, youth with a low RAI score were assigned to EM rather than to an activity-based alternative. A newly re-constituted ATD work group argued that this approach unnecessarily put far too many lower-risk youth on EM, conflicting with widely accepted JDAI practices in other jurisdictions regarding the limited use of EM. The assignment practices were subsequently reversed, as reflected in the current assignment scheme described earlier.

Proposed New Assignment Approach

As a result of exploring best practices in other jurisdictions, the ATD work group has recommended that the score-based system currently in use be replaced with a "needs-based" approach that takes into account the youth's social history and current circumstances. The work group has developed a draft "ATD Needs-Based Placement Guide," which outlines proposed rules for assignment to the various ATDs.
In support of this new approach, the Intake Unit, in conjunction with members of the work group, is in the process of drafting a brief social history interview instrument to be used at intake to inform placement decisions. At the time of the evaluation, this proposal was scheduled to be presented to the JDAI Collaborative (formerly called the Advisory Committee) and to the juvenile court judges en banc. The proposal to shift to needs-based assignment enjoyed support among a broad spectrum of stakeholders. As mentioned above, however, some key stakeholders were completely unaware of this major proposed change to assignment practices.

C. Re-Assignment to an ATD
A few stakeholders described another recent change to the ATD system involving the ability to re-assign youth to a new ATD. Previously, if a youth was assigned to an ATD, he/she generally stayed in that assignment until terminated by the court at adjudication (unless terminated for other reasons, such as re-arrest, FTA, EM violations, or jurisdictional transfer prior to adjudication). Under the new policy,
1 An internal undated memo cites the most recent changes as “interim” as of August 2013; this is the only documentation of the changed policy received by the evaluation team.
program directors have the ability to reassign a youth who is doing well to an ATD that is considered lower on a continuum of services, or can recommend release. In practice, since only two alternative programs, plus EM, are currently available, this means only ODAP has the ability to refer a youth to telephone self-reporting, which is considered a less intense service. As described in Section III, both court and community stakeholders are interested in expanding the continuum of services through inclusion of more community-based providers. An undated program document entitled New Orleans Juvenile Detention Alternatives Continuum (file name: JDAI ATD Continuum-Interim, located in Appendix A) illustrates the continuum as it existed prior to the closure of the ERC and describes the reassignment process. It is unclear whether the document represents current policy and practice regarding re-assignment to a new ATD. In conjunction with re-assignment protocols, the document describes a 30-day assessment process for all youth on an ATD and still awaiting adjudication. This assessment then forms the basis for a recommendation or decision to re-assign a youth. As described later in Section VIII, the systemic concern of how best to monitor the progress of youth on ATDs and avoid lengthy stays in an alternative program is currently being examined, but a final policy is not in place.
D. Additional Assignment Practices
Not all case processing and assignment to an ATD occurs in the manner outlined above. As mentioned in Section A above and discussed in the Validation Study, judges can "override up" the intake decision, often resulting in a more restrictive placement to EM or detention. It also appears, from examination of ATD program data and from interview data, that some portion of youth may be assigned to an ATD directly from the bench rather than passing through the intake process. This might occur when a youth who has initially been released or detained appears before the judge, who then orders the youth to participate in an ATD. It also may occur when a youth fails to appear (FTA), is subsequently brought in on a warrant and is then placed on EM or detained. It was beyond the scope of this evaluation to determine how widespread bench assignment to an ATD, without the benefit of formal intake and generation of a RAI score, has been.

E. Issues and Observations
• The JDAI Collaborative is making positive changes to strengthen the assignment process. The RAI is designed for a single purpose: to guide decisions as to whether to release, detain or release to an ATD. As outlined in an earlier assessment of the Parish's JDAI program,2 the score has no bearing on the specific type of alternative that should be employed. Allowing assignment to flow logically from youth circumstances eliminates an unwarranted reliance on the RAI score for placement decisions and should improve the ability to match the right intensity of services to youth while conserving Parish resources. Of particular concern is the current use of EM based solely on a RAI score when the youth's circumstances may not warrant such a restrictive alternative.
• While the proposed changes make sense and align with recommended practices, they come on the heels of other significant changes to the intake and assignment process (reversing the score-based assignments, replacing evening reporting with a call-in system), placing additional demands on intake staff. If they are to be effective, these major changes in policy and practice
2 Juvenile Detention Alternatives Initiative, Orleans Parish System Assessment, October 2012, prepared by the Annie E. Casey Foundation.
must be supported by adequate training and preparation of intake staff, including development of up-to-date written policies and procedures. Intake staff are not only being asked to learn new procedures, but also must develop new interviewing skills and exercise more individual judgment. Previously, intake could apply very clear-cut assignment rules based on a youth's score and feel fairly confident in those decisions. Several stakeholders mentioned that at least some intake staff were "nervous" about moving to a new system that required more judgment on their part in determining placement.
• The AnyTrax call-in system was implemented before intake staff were fully trained in its application; at the time of the evaluation, intake had made no referrals to this new ATD, although it had been in place for over a month.
• The intake unit appears to be stretched thin, with more than one stakeholder reporting coverage issues. One shift has not been filled for some time.
• Stakeholders report that intake has no budget for training; all training regarding the intake process is carried out by the supervisor. It appears that intake line staff may not have had an opportunity to participate in basic JDAI training over the past two years. The Casey Foundation has previously recommended that the JDAI Collaborative seek ways to provide such training to line staff as well as supervisors, in order to fully integrate all stakeholders and strengthen ATD operations.3 Given the coverage issues cited above, such training of intake staff, though highly desirable, may prove difficult to provide.
• The intake setting is an important consideration. Intake is co-located with a police booking center and does not have a designated, private interview space. According to knowledgeable interviewees, the intake staff and police officers work in close proximity to each other. While this arrangement may offer some benefits in terms of information sharing, it may also have drawbacks: youth and family may feel less comfortable sharing information with intake staff in a non-private setting. Ideally, intake should occur in a setting that offers privacy during the intake interview process.
III. Alternatives to Detention: An Overview

This section introduces the alternatives-to-detention programming in Orleans Parish and discusses issues that appear to pertain to all ATDs in general. Subsequent sections examine each of the alternatives individually.

A. Description of ATD Programs

At the time of the evaluation, the Parish's JDAI program was in the midst of significant programmatic changes. For several years there have been three main alternatives to detention, two of which were managed by the Juvenile Court:
• Evening Reporting Center (ERC): An after-school program designed to monitor youth in the critical out-of-school hours and to provide homework support, youth development programming and recreational opportunities. The ERC was a program within the Juvenile Court, but ceased operations at the end of 2013 due to lack of funding.
• Orleans Detention Alternative Program (ODAP): Provides intensive supervision of and support to participating youth through frequent face-to-face meetings in the home. Each youth is
3 The 2012 AECF assessment cited above in note 2.
assigned to a Youth Advocate who assesses youth and family needs and coordinates wrap-around community-based services. The program is also run by the Juvenile Court.
• Electronic Monitoring (EM): Provides intensive monitoring of the youth through an electronic bracelet that continuously transmits geographic location information on the youth. The program is currently operated by the Orleans Parish Sheriff’s Office.
In November 2013 a fourth pilot program was initiated:
• AnyTrax: An automated tracking system with two components: 1) a self-report component in which youth call into an automated system and answer interview questions that are forwarded to the court, and 2) a curfew-monitoring component that verifies compliance with curfew and other restrictions through voiceprint technology.
B. Utilization of ATDs

As described in Section II.A (The RAI), ATDs in 2012-2013 constituted a small share of total intakes, around 10 percent. According to the Orleans Parish Detention Risk Assessment Instrument Validation Study, multiple explanations for this pattern of low utilization are possible, including the extensive use of overrides for first-time firearm possession, assignments to an ATD from the bench that are not captured by intake, and the RAI scoring system, which gives substantial weight to the nature of the current offense. This percentage may be somewhat larger when assignments from the bench are also taken into account. Raw utilization numbers and proportions for each ATD during 2011 and January-June of 2012, as reported by the Juvenile Court in September 2012, are shown below.
Source: Orleans Parish Results Report 9-26-12
OP Use of ATDs, 2011 and January-June 2012

ATD     2011 N   2011 %   2012 (Jan-Jun) N   2012 (Jan-Jun) %
EM        273      67%          158                 68%
ODAP       66      16%           53                 23%
ERC        70      17%           21                  9%
These data confirm that historically an overwhelming proportion of assignments has gone to EM, representing over two-thirds of all assignments. A review of first quarter 2013 data reported to the Foundation suggests that this basic pattern of assignments has held, although ERC's portion continued to shrink through 2012 and into 2013, while ODAP's portion increased.4

At the time of the evaluation, the data analyst responsible for extracting these data had been gone for some months and the position unfilled. Therefore, information on enrollments, length of stay and outcomes in all of the programs presented here is derived from a number of secondary program data sources, such as quarterly and annual reports and program logs. As will be discussed later, the accuracy of some of these sources could not be verified; several stakeholders indicated that they did not have total confidence in the data presented in more recent reports.

C. The ATD Continuum

As described in Section II, the Parish's concept of an ATD continuum has been evolving; stakeholders expected to put in place new policy defining the entrance criteria for each existing ATD, as well as guidelines for moving a youth within the continuum. At the time of the evaluation, ATD work group members were drafting these criteria and planning to seek approval for them from the full Collaborative. (See Appendix B for a draft description of criteria for placement in each ATD.) Numerous stakeholders expressed a strong desire to expand the number and types of alternatives available, through use of community-based providers of after-school tutoring, recreational and youth development services. Community stakeholders, in particular, expressed strong concerns that the current continuum consisted of two technology-based tracking systems and only one small program (ODAP) that involved true human interaction and supervision.
The general feeling among this group of stakeholders was that programs in which adults interacted with youth in a positive, non-punitive setting were more likely to be effective in keeping the youth out of trouble. At the time of the evaluation, no specific plans for adding to the existing continuum were in place.

D. General Observations Relevant to All ATDs
• The move towards defining a continuum of ATD programs, with clear protocols for assigning youth to a particular service, is an important first step towards creating a program that is aligned with best practices. There is widespread support for expanding the continuum to include community-based services, and it would be important to add these services in as deliberate a way as possible. As the ATD work group and other stakeholders begin considering adding more community-based ATDs to flesh out the continuum, it will be important to have good data on the youth who are currently being served through JDAI programs, including the degree to which programs are or are not meeting their needs. This kind of data will help determine the most critical service gaps to be filled on the continuum of ATD programs.
• Although there is broad appreciation of the JDAI goals among stakeholders, there is also a sense that, counter to the central purpose of JDAI, stakeholders continue to see value in addressing need as well as risk through ATD programming. For example, a number of stakeholders mentioned having alternatives that would "help the youth" or "address the youth's needs."
4 January 2013 Quarterly Report, submitted by the Juvenile Court to the Casey Foundation.
• An active ATD work group is taking on multiple issues and tasks related to ATD policies and practices. The group appears to be a real strength of the Collaborative in terms of bringing to the forefront ideas for improving the ATDs and bringing them into better alignment with best practices.
• Assessing utilization and comparing outcomes across the ATDs is difficult because the data collection process is decentralized and non-standardized. The programs could benefit from uniform data collection and reporting systems across the various ATDs. Successful and unsuccessful program outcomes are not clearly defined in internal tracking and reporting documents, nor is it clear whether reported outcomes are based on all youth on an ATD or only those youth who pass through initial intake.
• Currently, although program representatives "report out" on their programs at the monthly meetings, there appears to be no written documentation on program activity presented to the Coordinator or to the Collaborative as a whole. In addition to reporting on enrollments and terminations, programs might also consider a uniform way to present length of stay, physical attendance and violations data. Data issues are discussed more fully in Section IV.
• Length of stay appears to be a continuing issue across all ATDs. All programs have enrolled a significant percentage of youth who have ended up staying for more than 50 days. None of the programs has clear-cut protocols for identifying a reasonable length of stay, flagging youth who have exceeded it, and expediting those youths' release from the ATD.
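A flagging protocol of the kind just described could be quite lightweight. The sketch below is a hypothetical illustration only: the 50-day figure echoes the long stays noted in this report, but the data structure, function and youth identifiers are assumed, not drawn from any Parish system.

```python
from datetime import date

# Hypothetical LOS-flagging sketch: given each youth's ATD enrollment date,
# list those whose stay exceeds a threshold (50 days here, echoing the long
# stays noted in this report). Names and structure are illustrative only.
def flag_long_stays(enrollments, as_of, threshold=50):
    """Return youth IDs whose length of stay exceeds `threshold` days."""
    return [youth_id for youth_id, start_date in enrollments.items()
            if (as_of - start_date).days > threshold]

roster = {"youth_A": date(2013, 5, 1), "youth_B": date(2013, 9, 20)}
print(flag_long_stays(roster, as_of=date(2013, 10, 15)))  # ['youth_A']
```

A routine like this could run against the weekly roster sheets that, per Section IV, already record each youth's LOS to date.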
IV. Electronic Monitoring
A. Program Overview
Currently the Orleans Parish Sheriff's Office runs the electronic monitoring (EM) program for both juveniles and adults. It continues to be the most frequently utilized ATD under the Parish's JDAI program. As described earlier in the program overview section, EM constituted between 64 and 68 percent of all intakes assigned to an ATD in 2011 and 2012. Similarly high usage of EM continued into 2013.5 The program is considered by most stakeholders to be the most restrictive of the four ATDs that have been or currently are in use. Unlike the other ATDs, participation in EM requires a court order, as does release from monitoring. Youth typically stay on EM until they are adjudicated and released or remanded. The program's policies and procedures are outlined in a 29-page manual, which provides detailed instructions to program staff for enrolling and exiting a client from the program and contains client acceptance forms. The acceptance form for juveniles details the client's responsibilities, activities that violate the terms of the program and consequences for damaging or removing the monitoring device. The program requires all participants to sign the acceptance form as a condition of enrollment.

B. Enrollment and ADP

While complete enrollment data for 2012 and 2013 were not available to the evaluator, partial data for 2012 and 2013 prepared by the program indicate that enrollments in EM have remained fairly steady, despite changes in assignment protocols (described earlier), which one might have expected to have
5 The March 2013 JDAI quarterly report indicated that EM served about 64 percent of all youth enrolled during the quarter.
reduced the number of EM assignments. Data from quarterly reports in 2012 and 2013 suggest that the program continued to serve at least 275 youth annually and that the average daily population served was between 37 and 47 youth.

C. LOS Information

Average length of stay (ALOS) on ATDs does not seem to be routinely calculated and reported for any of the ATDs; none of the annual or quarterly reports provided to the evaluation contains this information. However, a special assessment conducted by the AECF in late 2012 examined ALOS issues.6 At that time a data analyst was available to extract this information, and the Foundation reported that ALOS for EM was around 60 days. Although EM does not regularly track and report average length of stay, it appears to be the only ATD that regularly tracks each youth's individual length of stay, using weekly population roster sheets that contain each youth's LOS to date. A review of several recent internal population sheets for EM suggests that in late 2013 a portion of youth continued to be held on EM for substantial periods of time. For example, an October 2013 weekly population report indicated that nearly half the youth on EM had been enrolled for over 50 days; one youth had been on EM for 168 days. Although this appears to be an extreme case, other fall quarter population counts included numerous cases of youth enrolled for longer than 50 days.

D. Youth Outcomes

Two key sets of outcome measures used by the JDAI project to assess program effectiveness are FTAs and re-arrests of youth who have either been released or assigned to ATDs. Annual FTA/re-arrest rates for each ATD are not available on a regular basis; that is, they are not contained in the required annual report made to the Casey Foundation. A September 2012 PowerPoint report7 cited an 86 percent "success rate" for EM, but did not define precisely how this rate was calculated.
To supplement this information, we also examined the ratio of FTAs/re-arrests to all terminations, as presented in several quarterly reports available at the time of the evaluation. While this approach does not provide a precise measure of the percentage of successful/unsuccessful outcomes, it offers an alternative way of assessing success in the absence of more traditional measures. The data below indicate that the proportion of unsuccessful youth (youth with an FTA or re-arrest) typically equaled or exceeded 25 percent of total terminations each quarter. (Note: The second quarter 2012 ratio of 1 recidivism case out of 75 terminations is out of line with the other quarters and likely reflects a programmatic anomaly or a data entry error.)
Quarter FTAs & Re-Arrests All Terminations Percent of Terminations

Q1 2012 20 80 25%
Q2 2012 1 75 1%
Q3 2012 22 65 34%
Q4 2012 no data no data no data
Q1 2013 28 69 41%
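The quarterly ratios above are simple division; the sketch below, a hypothetical illustration rather than part of the Parish's actual reporting process, shows how they could be computed and how an out-of-line quarter (such as Q2 2012) could be flagged automatically. The 5 percent anomaly threshold is an assumption chosen for illustration only.

```python
# Counts taken from the quarterly table above: (FTAs & re-arrests, all terminations).
quarters = {
    "Q1 2012": (20, 80),
    "Q2 2012": (1, 75),
    "Q3 2012": (22, 65),
    "Q1 2013": (28, 69),
}

for quarter, (failures, terminations) in quarters.items():
    rate = failures / terminations
    # Flag implausibly low rates (assumed threshold: under 5 percent)
    # as candidates for data-entry review.
    flag = "  <-- possible anomaly" if rate < 0.05 else ""
    print(f"{quarter}: {failures}/{terminations} = {rate:.0%}{flag}")
```

Routinizing even a small check like this would surface entries such as the 1-of-75 quarter before they reach the Collaborative.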
6 Juvenile Detention Alternatives Initiative, Orleans Parish System Assessment, October 2012, cited above.
7 Orleans Parish Results Report 9-26-12, created by Bridgette Butler, former JDAI Coordinator.
To obtain a more accurate read of the "success" rate, we would need to trace the outcomes of every enrolled youth either until release from the ATD or at 30 days post-enrollment. The latter approach would standardize rates across individuals with widely varying lengths of stay on an ATD.

Violations and Remands

EM program policy calls for referring a youth back to court upon a third violation of program terms, and violations appear to be carefully tracked by the program. Common reasons for violations, according to stakeholders, included failure to adequately charge the electronic device, and curfew or other out-of-area violations. About one-third of youth assigned to EM are referred back to court due to such violations and may be remanded to detention for a period of time, according to knowledgeable stakeholders. However, a youth might then be allowed to return to the EM program and still be included in the program's final success rate. The program may internally track these remands and returns, but this information does not appear to be aggregated, analyzed or reported regularly to the Collaborative as a whole.

E. Stakeholder Perceptions
A majority of stakeholders interviewed expressed concerns about two aspects of the EM program: 1) the perceived excessive use of EM as a catch-all for a variety of youth, including those who did not need to be placed in such a restrictive ATD, and 2) the length of stay of many youth beyond the program's effective "shelf life" of 30 days. These are essentially systems concerns, as opposed to concerns about the program's internal operations.

F. EM Issues and Observations
• The EM program is well documented. Its policies and procedures manual appears to be up to date and provides detailed guidance to deputies charged with enrolling, monitoring and terminating youth. It also provides detailed information to parents and youth about program requirements and the consequences for violating them.
• The EM internal population tracking system appears to be complete, detailed and accurate. It is structured to support ongoing monitoring of length of stay, an important support to case processing and expediting. With some minor additions to the current tracking system, the program could readily calculate ALOS as well. However, no one in the JDAI system appears to be assigned to follow up on this tracking information; the sheriff's department does not see this as its role, and no one from the juvenile court is assigned to identify youth who are approaching or have exceeded a 30-day stay.
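To make the "minor additions" concrete, the sketch below shows how ALOS and an over-30-day flag could be derived from a weekly roster that already records each youth's LOS to date. The youth identifiers and day counts are hypothetical, not actual EM data; only the 30-day threshold comes from the program's stated "shelf life."

```python
from statistics import mean

# Hypothetical weekly roster: youth ID -> length of stay in days to date.
# In practice these values would come from the EM weekly population sheets.
roster = {"Y-101": 12, "Y-102": 55, "Y-103": 31, "Y-104": 8, "Y-105": 168}

# Average length of stay across the current roster.
alos = mean(roster.values())

# Youth past the program's effective 30-day shelf life, for court follow-up.
over_30 = sorted(yid for yid, days in roster.items() if days > 30)

print(f"ALOS: {alos:.1f} days")
print("Over 30 days:", ", ".join(over_30))
```

A one-page weekly output like this would give the court a standing list of youth whose cases need expediting, without any change to how the underlying rosters are kept.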
• The EM program dominates JDAI, comprising two-thirds or more of all assignments through intake. Yet although JDAI in the Parish is largely defined by EM, the program is not well integrated into the overall initiative. The Sheriff's staff does not appear to participate regularly in JDAI training, nor do there appear to be regular meetings between court program administrators and EM program management to discuss policy, staffing, training, reporting or other operational matters.
• EM has the highest failure rate of the ATDs, with as many as 41 percent of terminations reported (during the first quarter of 2013) as "unsuccessful" (FTAs and re-arrests). The data suggest that large numbers of youth routinely violate EM conditions and spend time in
detention as a result. Additional information on this finding is addressed in Section VIII in the context of broad systemic issues affecting overall ATD performance. While this important information appears to be carefully tracked, it is not routinely shared in an easy-to-understand format with the Collaborative.
• Significant numbers of youth are placed on EM for long periods of time, increasing the chances that a youth will either cut off the monitoring bracelet or violate conditions. Thus, this widely used alternative may actually be leading to "net-widening": greater entanglement with the courts, new arrests and increased days in detention.
• The ATD essentially duplicates the adult program. There appear to be no special adjustments for juveniles, such as increased "checking in" with additional face-to-face supervision and support. The forms used to explain the program are detailed but are likely above the reading comprehension level of many youth and adults. They could be supplemented with a plain-language handout for youth and parents.
V. Evening Reporting Center

Owing to the cessation of the ERC program in late 2013, the evaluation had limited access to document, interview and client data pertaining to the program. The evaluator was unable to obtain an interview with program staff despite multiple attempts to schedule one, nor was the evaluator able to obtain any descriptive material on the program, such as a brochure or the program's handbook, which is referenced in the Casey Foundation's 2012 assessment. Thus, most of the findings in this section are based on the 2012 PowerPoint report and quarterly reports referenced previously, as well as stakeholder perceptions of the program.

A. Program Overview
In 2013, the program capacity was 10 youth; at one point in 2012 the program had to close temporarily due to restructuring and staff turnover. The program's main purpose was to provide a safe, structured environment in which youth could be supervised pre-adjudication during after-school hours, when many parents were at work. Enrolled youth were picked up at school by a van and driven to a central location, where they remained until closing at 7:00 pm and were then driven home. The ERC location appears to have changed three times during its operation, which affected programming. At its last location, the Boys and Girls Club, the ERC was able to offer more recreation and leisure activities. Youth also received a hot meal and homework support. A program description from 2008 indicated that the ERC provided a range of activities, including health education, community service, drug and alcohol education, gender-specific programming, anger management, life skills, and recreation. However, as detailed under Issues and Observations below, this additional programming content appears to have been more aspirational than real, at least during the program's final two years of operation.

B. Enrollments and Attendance
The ERC was the smallest ATD, serving about nine percent of all ATD enrollees in 2012.8 In 2011 it served a total of 70 youth, but in subsequent quarters enrollments dropped to around 10 entrants per quarter. These enrollment patterns may have reflected staff turnover and problems finding a stable
8 From the Orleans Parish Results Report 9-26-12 PowerPoint report cited earlier.
home. In addition, the program's actual utilization rate appeared to be substantially lower than enrollments, most likely due to attendance issues. Quarterly reports from 2012 and 2013 show average daily population (ADP) rates consistently well below the original capacity of 10 youth. The exception is the third quarter of 2012, when the ADP was 9 youth. For much of 2012, however, ADP was closer to 6 youth, and by the first quarter of 2013 it had fallen to 4.7 youth. A review of a sample program log indicates that some enrollees had multiple absences and some did not attend for a period of time before being terminated from the program. Because the evaluation did not have access to the program handbook, attendance and termination policies could not be determined.

C. Average Length of Stay
Current length of stay data were not available; the AECF assessment cited earlier (footnote 2) reported that from May-October 2012 the average length of stay in the ERC program was 41 days and that a few youth had lengthy stays beyond 60 days. As with EM, there apparently were no provisions for flagging youth who had been on the ATD for excessive periods of time and expediting their cases.

D. Youth Outcomes for the ERC
The September 2012 PowerPoint report cited earlier lists the ERC's "success rate" as 94 percent, with six percent of terminations presumably resulting from FTAs and re-arrests. However, the definition of success used for this program is unclear in this sense: the program's internal monthly and quarterly summaries of outcomes made a distinction between youth who had recidivated and were terminated from the program and youth who had recidivated but were allowed to remain in the program. This suggests that an unknown number of youth may have failed to appear or been re-arrested, yet were counted as program "successes." Since none of the reports defines exactly how success is measured, we cannot be certain what the published success rate of 94 percent means. A review of quarterly reports indicates that across the four available quarters (Q1-Q3 in 2012, Q1 in 2013) the program had 35 terminations and reported five FTAs/re-arrests, or about 14 percent of all terminations (as opposed to the six percent reported in the September 2012 report).

E. Stakeholder Perceptions
Most stakeholders deemed the ERC the weakest program component of JDAI. Their assessments were due in part to its design and limited capacity, and in part to circumstances beyond the program's control, such as being forced to move locations. Frequently mentioned structural problems included:

• Serving too disparate a geographic area
• Youth spending too much time in the transportation van and not enough time in actual activities
• Limited hours of the Center
• Difficulties coordinating with extended day school schedules
• Limited programming opportunities

However, many stakeholders supported the concept, in theory, of an after-school program and lamented the loss of a program that provided youth with personal contact. Once the ERC was closed, ODAP remained the only ATD program that offered face-to-face interaction with a supervising adult.
F. ERC Issues and Observations
• The program concept appeared to have several major flaws. If the program were to be restarted, serious consideration would need to be given to restructuring it in a way that does not reproduce the transportation and programming issues the previous program experienced.
• The program appeared to be consistently under-‐utilized with very low average daily population counts for much of the last two years for which data were available. Attendance records suggest that a number of enrolled youth missed multiple days.
• Internal monthly reports were poorly maintained: a sample report reviewed for the evaluation contained numerous arithmetic errors because auto-calculation functions were not used in the "totals" cells.
• As with the other programs, no program document explicitly defines how "success" was measured; it is possible that youth who recidivated but remained in the program were counted as "successful." Internal monthly reports also did not consistently mesh with quarterly reports submitted to the Foundation.
• Updated policies and procedures for the ERC either were not developed during 2013 or were not shared with the JDAI coordinator, who is responsible for overseeing and reporting on the alternatives.
VI. Orleans Detention Alternatives Program (ODAP)

A. Program Overview

The Orleans Detention Alternatives Program, or ODAP, is currently the only ATD that provides regular face-to-face contact between youth and program staff. Established and managed by the Orleans Parish Juvenile Court, the program has a current capacity of 25 and may serve both pre- and post-adjudication youth. ODAP is intensive in its approach: youth and families are immediately assigned to one of five case workers, called Youth Advocates (the maximum caseload for an advocate is five youth). The advocate meets with the family and youth three times a week to engage them, identify their strengths and needs, and coordinate wrap-around services when available. The ODAP brochure emphasizes that the program is there to support the youth and family through the pre-adjudication process and cites a "never give up" program philosophy, which includes focusing on and developing family strengths.

B. ODAP Enrollments and ADP
In 2011 ODAP had a total annual enrollment of 66 youth, which represented about 16 percent of all youth assigned to an ATD that year. Until 2013, the program's capacity was set at 18 rather than 25, due to caseload assignments: program policy at that time was to maintain a caseload ratio of three to one. The policy was later revised upward to the current five-to-one ratio, expanding the program's capacity without increasing staff size. Program enrollments rose in 2012; by mid-year the program had served over 50 youth.

ADP

In 2012 ADP hovered between 13 and 14 youth; at the beginning of 2013, with the expansion of capacity, utilization rose to an ADP of 20. Program staff indicated that in late 2013 the total caseload was 19 youth.
ALOS

As with the other programs, average length of stay data were not regularly tracked or reported. The most recent ALOS data available for ODAP were 2012 data presented in the Casey Foundation assessment cited previously, which reported that for January through September 2012 the program had an ALOS of 53 days. This average, however, masks a pattern of very short stays for some youth and very long stays for others. For example, 13 youth were reported to have stayed over 90 days.

C. Youth Outcomes
The program reported an 83 percent "success rate" for the first three quarters of 2012. Interestingly, this is the only program for which quarterly exit data indicate that the rate may have been substantially higher by the end of the year and the beginning of 2013. The exit data shown below suggest a program trend towards fewer FTAs and re-arrests. Recidivist cases constituted close to 11 percent of all terminations for the quarters shown below.
Quarter FTAs & Re-Arrests All Terminations Percent of Terminations

Q1 2012 7 23 30%
Q2 2012 1 26 4%
Q3 2012 1 17 6%
Q4 2012 no data no data no data
Q1 2013 1 23 4%
TOTAL 10 89 11%

As with the ERC, ODAP's monthly and quarterly enrollment/termination summaries make a distinction between recidivist youth who were terminated and those who were not. Again, this raises the question of whether all FTAs/re-arrests were included in the reported totals for these categories.

D. Stakeholder Perceptions
Most stakeholders considered ODAP to be the strongest ATD program because of the personal contact provided to youth by caring and dedicated staff. Most felt the programming was solid and that the program did a good job of engaging youth and keeping them from re-offending. However, a minority of stakeholders questioned using an ATD to provide costly and intensive services oriented more towards the youth's needs during the pre-adjudication phase of case processing.

E. ODAP Issues and Observations
• Program records and stakeholder accounts suggest that ODAP is a well-managed program. Its numbers have steadily increased over time while the percentage of FTAs and re-arrests has decreased, according to quarterly reports. As with the other ATDs, however, program "success" needs to be more clearly and uniformly defined.
• While stakeholders generally agree that ODAP staff work well with the youth, some have raised legitimate concerns about the use of such an intensive supervision and support model in a pre-adjudication setting. Moreover, the program seems oriented more towards addressing the youth's needs than their risks. This need-based approach runs counter to the basic JDAI
model, which focuses on reducing the risk of an FTA or re-arrest during the (in most cases) short period between intake and adjudication. Ideally, youth should be in an ATD only for a short period of time—not long enough to work seriously with needs. These intensive services might be more appropriately directed at youth post-adjudication.
• The ATD work group has recommended confining ODAP assignment to pre-adjudicated youth with significant mental health or other needs, for whom a case management model and more intensive supervision may be required to reduce risk and protect public safety. However, if assignment policy were to move in this direction, ODAP enrollment might drop and the program would then face underutilization of staff resources. If ODAP is to remain a viable part of the ATD continuum, its service delivery model may need to be reconfigured so that some staff serve greater numbers of youth less intensively, focusing on reducing risk.
VII. AnyTrax

A. Program Overview

At the time of the evaluation, the AnyTrax program, which is managed by the Juvenile Court, had been up and running for only a few weeks and was being treated as a pilot program to be evaluated in January 2014. The promotional materials for this proprietary monitoring system describe two components: a self-reporting system and a curfew monitoring system. The self-reporting system requires the youth to call into a central monitoring system, which conducts an automated interview with the youth; the results of the interview are forwarded to the court. The curfew monitoring system is also telephone-based; it uses biometric technology to monitor the youth's compliance with curfew or house arrest and is marketed as a less intrusive alternative to traditional electronic monitoring. Court stakeholders described the program as something that could be available to lower-risk youth in lieu of the ERC, now that the ERC was closing. At the time of the evaluation, some basic guidelines for the AnyTrax program had been drawn up, but the program did not appear to have descriptive materials for parents and youth or written policies and procedures for staff. As mentioned in Section II, intake staff had not yet referred any youth to the program and no participant data were yet available.

B. Issues and Observations
• Stakeholders have differing notions as to where on the ATD continuum the AnyTrax program fits. Some believe it to be at the low end of the continuum, while others see it as nearly equivalent to EM and at the most restrictive end. Part of the issue may be that stakeholders are focusing on different components of AnyTrax: the telephone self-report appears minimally intrusive, but the geographic monitoring component is presented in the company's promotional materials as a less costly substitute for EM. The nature and intended use of these two components need to be clarified for the Collaborative, and especially for those developing criteria for assignment along the continuum.
• Stakeholders raised concerns about transparency and collaboration, due to the manner in which AnyTrax was introduced. The program, according to stakeholders, was introduced suddenly, with no general discussion of its pros and cons with members of the broader Collaborative. Many stakeholders were unsure of the value of the program and felt that the Court's unilateral decision-making set back their trust in the Collaborative as a viable entity.
• AnyTrax appears to have been introduced without a detailed implementation plan, clear guidelines for its operation or a policies and procedures manual for staff and stakeholders. Many stakeholders were unclear how the call-in portion of the program would operate when it required a landline in the home, given that almost all families in the program would have only a cell phone.
• The same "net-widening" issues that occur with EM are possible with the geographic monitoring component of AnyTrax. Youth are likely to forget to keep their phones adequately charged, and they may break curfew and other geographic conditions imposed, as they do with EM. The program will need to develop strategies for mitigating these potential problems, including plans for careful tracking of violations, length of stay, and remands by the court.
VIII. Overarching Themes

This section pulls together overarching themes, emphasizing four major sets of systemic issues that affect all ATD programming. These systemic areas encompass:
• Planning and program implementation
• Management
• Data systems
• Case Processing
A. Planning and Program Development
Agreeing on the Fundamental Purpose of ATDs

While stakeholders generally subscribe to the JDAI goals of reducing the risk of an FTA or a new offense prior to case disposition, there is a contradictory tendency to also focus on meeting youth needs through ATD programming. As the Collaborative considers expansion of the continuum of ATD services, it will become especially important to reach a clear consensus on the fundamental purpose of these programs.

Using Data to Inform Programs

JDAI is a data-driven initiative: programs and policies are to be developed based on the larger national context of best-practices research and the more local context of the youth population being served and their observed outcomes. However, it appears that intake policies and ATD programs have been developed without adequate consideration of either of these contexts. For example, EM is the dominant ATD within the Parish, yet research clearly indicates that intensive supervision programs like EM are generally less suitable and less successful for youth and can lead to net-widening.9 Over the years, local EM program data seem to have borne this out: the program has relatively high failure rates and appears to lead to increased, rather than decreased, involvement with the justice system. While many stakeholders have a general awareness of these issues, the data have not driven program decision-making about EM.
9 For more information, see Gendreau et al. (2000), The Effects of Community Sanctions and Incarceration on Recidivism.
Assessing Program Performance

During 2013, JDAI workgroups played a more active role within the Collaborative in requesting, analyzing and reporting on program data, a positive development for the Collaborative. However, these decentralized analysis efforts might be enhanced by a more centralized framework for assessing key program structures, policies, processes and outcomes. Such a framework might also establish priority information needs, as well as define areas of assessment responsibility for the JDAI Coordinator, program staff and other court staff, so that assessment efforts are consistent, targeted and well coordinated.

On paper, the JDAI Coordinator is responsible for evaluating ATD programs, but such evaluations have not occurred, or at least have not been reported beyond collating the limited enrollment and termination data provided by each program. Assigning this task to the Coordinator appears unrealistic, both because the Coordinator already has many other duties and because it requires specialized knowledge and technical skills that the Coordinator is not likely to possess, at least initially.

Implementing New Programs

The Parish JDAI does not appear to have well-established criteria and protocols for implementing new programs. For example, the AnyTrax program was initiated without a clear set of a priori criteria measuring the need for the program or its suitability. No wider discussion of the program's pros and cons occurred, and the program was implemented rapidly, before guidelines for participation were established or all relevant staff were fully trained.

Developing Coherent Policies

The Orleans Parish JDAI operates within a complex organizational structure in which authority and influence appear to be dispersed across numerous entities beyond the Juvenile Court.
Not all of these entities seem equally committed to JDAI principles regarding reducing unnecessary detention of low- and moderate-risk youth. While it was beyond the scope of the evaluation to examine this organizational structure, it is important to note that it poses inherent challenges to developing coherent, best-practices policies. A good example is the substantial use of RAI overrides, due in part to a lack of stakeholder agreement over the purpose of detention.
B. Data
Data Collection

There is no single, uniform data collection and reporting system that the different parts of the Collaborative share and trust. In 2012 JDAI received funds to incorporate ATD information into the state's IJJIS. The program reported in its 2012 annual report that these ATD modules had gone live and would support effective monitoring of ATD client case processing. To date, however, the IJJIS system has not been able to support ATD case monitoring or client outcome tracking. Instead, each program relies on internal case logs kept on individual program computers. These logs are then presumably used to create aggregate reports that could be made available to stakeholders beyond the individual programs. However, a number of issues are associated with these internal data collection systems, including:
• Non-uniform client tracking systems across the different ATDs10
• No standardized, consistent and documented set of data definitions across programs
10 The EM records are kept in a substantially different format from those of the court-run ATD programs, and no monthly or quarterly roll-ups are routinely produced for this ATD, which serves the largest number of clients.
• No documentation of criteria for defining program “success”
• Data errors and inconsistencies in some programs
• Limited tracking of length of stay data

Staff Expertise

The JDAI program continues to be hampered by insufficient staff expertise to support timely, accurate use of data for routine program monitoring and reporting. A data analyst position has remained unfilled for a number of months; there is currently no staff person within the court system able to support solid data collection and reporting. A number of stakeholders reported that access to good data remains a strong concern.

Data Reporting

The data collected by the ATDs are not regularly reported to the broader Collaborative in a standardized, easy-to-understand format. The court-run programs (ERC and ODAP) have been producing two sets of aggregate reports: a monthly and quarterly summary that appears to be for internal program use, and a quarterly summary made to the AEC Foundation. Neither type of aggregate summary is particularly "user-friendly." The monthly reports in particular are visually dense with multiple data elements and require the user to spend time picking out the common measures used in JDAI to monitor program processes and outcomes. In addition, the internal monthly/quarterly summaries do not easily map to the quarterly or yearly reports made to the Foundation.11

EM maintains an enrollment log and weekly population sheets, but data from these do not appear to be routinely aggregated and reported to an external audience. The (undefined) "recidivism" rates in internal reports do not match the re-arrests shown in the 2012 and 2013 annual reports: it appears that the rates used in the annual reports express arrests as a percentage of released youth and ATD youth combined, whereas the quarterly reports show "recidivism" as a percentage of ATD participants only. A specific definition of re-arrest/recidivism rates does not appear in any of the reports, which leaves users uncertain as to what is being measured.

C. Program Management
Due perhaps in part to turnover in key JDAI-related staff positions in the past two years, the program has lacked continuity in its structures and processes. In addition to the data management issues discussed above, there are several systemic management concerns that appear to cut across programs, as described below.

Program Policies and Procedures

Of the four programs plus intake reviewed here, only EM made available to the evaluation an updated, comprehensive policies and procedures manual. The other components either had an out-of-date manual, had not created a manual or did not provide one. (Two of the manuals were reportedly in
11 For example, the internal monthly/quarterly summaries capture neither FTAs nor length of stay information—critical JDAI measurements. By contrast, in the quarterly reports submitted to the Foundation, FTAs and re-arrests for the quarter are combined and there is no distinction between youth who recidivated and remained in the program and youth who recidivated and were terminated. Moreover, the reports use varying terminology in referring to "recidivism."
hard copy format only and were therefore difficult to update or distribute.) Important planning and policy-related documents lack basic information on when the document was created, who created it and whether it describes proposed or approved policy. Stakeholders were sometimes unclear about which policies were actually approved and in force, and did not appear to have ready access to either an electronic or hard copy library of policy and program documents.

Training

Ensuring adequate training of staff is a central management function. However, training resources in at least some of the ATD programs appear to be quite constrained, and the burden of training falls entirely on the program supervisor. Individual ATD program supervisors may be quite capable of providing training and refresher training to their staffs, but there does not appear to be a cross-program framework or set of guidelines for ensuring sufficient, quality training—when it should occur, what the curriculum should include, or how and when staff competencies related to the training should be assessed. Currently, new line staff involved in ATD program components do not necessarily receive basic training in JDAI.

Program Oversight and Quality Assurance

Specific oversight responsibilities related to JDAI in general, and the ATDs specifically, are not clearly spelled out. Given the many other concrete and specific duties the Coordinator must carry out, it appears that some program oversight functions have been dispersed to other court administrators. Lines of authority for ensuring program coherence and integration across the ATDs, and compliance with policy, are not entirely clear. Protocols for identifying strengths and troubleshooting problems across the ATDs do not appear to have been established or documented. For example, ongoing tracking and reporting of participants' length of stay is a basic monitoring responsibility important to JDAI goals.
Yet, as described further below, no person or group appears to have undertaken this responsibility. An important aspect of oversight is quality assurance: established quality assurance measures (which extend beyond participant outcomes and are customized to each program) support ongoing program development and improvement. Such measures are particularly useful and important when new policies, programs and practices are being introduced, so that implementation issues are rapidly identified and corrective measures developed and assessed early on. If the JDAI program expands beyond the current set of ATDs, quality assurance and other oversight functions will become even more critical.
The RAI

The Risk Assessment Instrument, or RAI, which is administered to youth post-arrest at intake, is an important initial step in the case processing sequence, as it determines the youth's pre-adjudication status. RAI issues have been touched on earlier in this report and are discussed extensively in the companion Validation Report, so they will only be briefly summarized here. These issues include:
• Limited ability to predict risk: The current instrument does not discriminate well between low-, medium-, and high-risk youth in terms of their propensity to reoffend. Multiple adjustments to items and scoring are suggested to improve the instrument's usefulness as an intake tool.
• Score-based assignment to an ATD being phased out: Up until the current evaluation, RAI results were inappropriately used to determine assignment to a particular ATD. The tool was never intended to be used in this manner, and it is unclear how the practice originally started. This score-based assignment likely contributed to the substantial overuse of EM when less restrictive options would have been possible. Decision-makers in the court, as well as many stakeholders outside the court, now appear to agree that this system of assignment should be replaced. The ATD Work Group, in conjunction with Intake staff, has been working to revise intake and assignment procedures in order to better match the youth's circumstances to an ATD.
• Significant use of RAI overrides: Since at least 2011, RAI overrides "up" have been applied to significant numbers of youth at intake; unless the policies underlying these overrides are changed, they will continue to be a potential source of extra days in detention for youth who are at lower risk of reoffending.
Average Length of Stay

In 2012-2013, the average length of stay (ALOS) for youth in detention was 14.2 days.12 By comparison, the ALOS for youth on an ATD was as follows:13
EM – 60 days (time period unspecified)
ERC – 41 days (May-October 2012)
ODAP – 53 days (Jan.-Sept. 2012)
We do not have more recent length of stay data for the ATDs and, as discussed in the earlier sections on the individual ATDs, that lack of information likely contributes to overly long stays for many youth on ATDs. As the above data indicate, youth are staying, on average, roughly three to four times longer on ATDs than they are in detention.

LOS and Electronic Monitoring

There is widespread recognition among stakeholders that length of stay is an issue across all ATD programs, but especially on EM, where the suspicion is that excessive time on EM leads to tampering and then results in re-arrest. The Override Report of 10/13 (cited earlier) indicates that EM warrants constituted the biggest portion of RAI overrides: 44 out of 115 overrides during the eight-month period from January to August 2013. While we do not know for certain whether it is longevity, per se, that leads to EM violations and subsequent new arrests and detention, the data suggest that this is a distinct possibility.

Case Monitoring and Expediting

There is no clearly defined system in place for tracking length of stay on ATDs, for flagging those youth who are on longer than a set amount of time, and for working to expedite their cases. A number of stakeholders mentioned 30 days as the maximum "shelf life" of ATD participation, which aligns with a large body of research on intensive supervision models. This research suggests that placing youth in intensive supervision settings increases the opportunities for non-compliance and thus increases the potential for greater, not lesser, involvement with the courts and with detention. Moreover, the more open-ended stays of many youth may create a program expectation of serving a portion of the youth in a more long-term fashion; such an expectation might naturally feed into a desire to create or maintain enhanced services and a focus on youth need, rather than risk.

An undated document entitled New Orleans Juvenile Detention Alternatives Continuum (Appendix A) discusses the use of a 30-day assessment for every youth on an ATD and the decision-makers involved in moving the youth to a more or less restrictive ATD, or releasing the youth, at or before the 30-day assessment. However, it appears that the proposals in this document never became formal policy consistently applied across all ATDs. One stakeholder involved in ATD programming indicated that there existed the possibility of moving youth to a less restrictive ATD, but none of the programs appeared to have set up mechanisms for routinely tracking youth or flagging and assessing them at 30 days.

Many stakeholders both inside and outside the court are aware of the length of stay issues and want to address them. Some stakeholders who represent the community or other entities external to the Juvenile Court have expressed an interest in instituting a policy that calls for releasing an ATD-assigned youth at 30 days if the youth had had no problems up to that point. However, at the time of the evaluation, such a policy proposal had not yet been developed and vetted by the Collaborative or the Court.

12 As reported in the JDAI Annual Results (2012-07-01 to 2013-06-30)
13 As reported in the AECF Assessment, Orleans Parish System Assessment, October 2012, Pg 10

IX. Systems Recommendations
A. Program Planning and Implementation
• Ensure that planners and decision-makers develop ATD programming focused primarily on reducing the risk of FTA/reoffending during the short period between intake and disposition of a case. Consider redefining ODAP program content and structure to better align it with this purpose.
• Develop and document a policy and procedures framework for assessing all ATD program components, detailing the assessment-related information to be collected, how often, and by whom. It is important to clarify which specific assessment and decision-making responsibilities within this framework fall to the Coordinator, to program supervisors, to Court administrators, or to others, such as members of the ATD work group. In addition, document how, and by whom, the assessment information will be used to improve programs.
• Create specific policy and protocols for adding programs to the ATD continuum. For example, develop specific criteria for assessing the need for, and suitability of, any program being considered as an ATD. Document these criteria and share them with the broader Collaborative. Clearly define the roles and responsibilities of the different entities involved in implementing an ATD (e.g., the responsibilities of the Court vs. those of a community-based provider).
• The current ATD continuum places too few resources into programs involving personal interaction with a supervising adult, relying instead largely on EM, with its propensity to widen the net and lead to increased punishment. Consider expanding ATD options that are less punitive in nature and provide opportunities for connecting with a supportive adult. An example would be a "tracker program," in which youth have the benefit of face-to-face supervision and some personal support. The tracker has sufficient personal contact to be able to identify youth who may be struggling and need additional support in order to meet program goals.
B. Data
• Define standardized data elements to be collected across all ATDs so that all programs collect uniform, comparable data. Document the definitions of all data elements in a data dictionary; for example, clearly define terms such as "reoffending," "FTA," and "successful/unsuccessful termination." Once a uniform system is created and documented, train all staff involved in collecting ATD-related data.
• Set up a data quality assurance system to measure and improve compliance with the new data collection system.
• Create uniform internal client tracking systems for each program; this will be especially important if the ATD continuum expands to include community-based programs. For example, create computer-based enrollment/termination log templates and report templates for all ATDs.
• Bring in the expertise needed to have the JDAI components within IJJIS working as intended, so that this comprehensive system can be used to assess and evaluate the ATDs in an efficient and effective manner.
• Ensure that program supervisors and others involved in maintaining program data receive adequate training not only in how to define data elements and enter data, but also in how to minimize data entry errors, store data securely, and report out in a uniform manner.
• Create a simple, uniform data reporting system that supports the generation of regular, user-friendly reports to the broader Collaborative, so that the work of the Collaborative is more truly data-informed.
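To illustrate the kind of shared data dictionary the first recommendation envisions, the sketch below defines a few ATD data elements in Python and uses them to validate a program's client record. All field names, definitions, and allowed values here are hypothetical; the actual elements would need to be agreed on by the Collaborative and documented for all programs.

```python
# Illustrative sketch of a shared ATD data dictionary (all names hypothetical).
# Each element gets a definition, a type, and (where relevant) allowed values,
# so that every program records the same fields the same way.

DATA_DICTIONARY = {
    "fta": {
        "definition": "Youth failed to appear for a scheduled court date "
                      "while enrolled in the ATD.",
        "type": "bool",
    },
    "reoffense": {
        "definition": "New arrest on a new charge while enrolled in the ATD "
                      "(technical violations are recorded separately).",
        "type": "bool",
    },
    "termination_status": {
        "definition": "How the youth's ATD enrollment ended.",
        "type": "category",
        "allowed_values": ["successful", "unsuccessful", "transferred", "pending"],
    },
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in a program's client record."""
    problems = []
    for field, spec in DATA_DICTIONARY.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif spec["type"] == "category" and record[field] not in spec["allowed_values"]:
            problems.append(f"invalid value for {field}: {record[field]!r}")
    return problems

# A misspelled category value is caught before it pollutes cross-program reports.
print(validate_record({"fta": False, "reoffense": False,
                       "termination_status": "succesful"}))
```

Even a simple validation step like this, run when each program submits its data, would make the quality assurance system described in the next recommendation far easier to operate.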
C. Program Management
• Require up-to-date policies and procedures manuals for all ATDs and make these available, along with other key program documents, through a readily accessible electronic file library. Consider posting links to key documents on a dedicated JDAI Facebook or web page so that a broad array of users has ready access to updated documents and can easily be notified of important policy or program changes. Given the decentralized nature of recent policy and program planning through work groups, such access is particularly important.
• Establish and document protocols for naming, dating, electronically filing, and distributing key JDAI policy and other documents, so that anyone examining a document knows who created it, when it was created, and whether it has been superseded by another document.
• Develop a framework to guide both program-specific and more general JDAI training of ATD staff. Ensure that the Coordinator has received the necessary training to support the more technical requirements of the position, such as interpreting, summarizing, and reporting on ATD program data. Consider expanding opportunities for line staff to participate in JDAI trainings, and explore possibilities for greater involvement of line staff in ATD work group activities.
• Define and document specific, ongoing oversight and quality assurance responsibilities for the ATDs. Determine which oversight responsibilities should fall to program supervisors, to the JDAI Coordinator, or to others. Oversight mechanisms, as well as data collection and reporting protocols, need to be in place before any further expansion of the ATD continuum occurs. Obtain systematic feedback from participants and parents; an example of an exit questionnaire that might be adapted to obtain such feedback is contained in Appendix C.
D. Case Processing
• Revise the RAI per the recommendations of the Validation Study to make it a more effective tool.
• Continue to work collaboratively to educate key stakeholders about the detrimental impacts of detaining lower-risk youth; JDAI leadership needs to speak with one voice and promote detention and ATD policies that are better aligned with best-practices research.
• Ensure that all programs track and report on length of stay on a regular basis.
• Create and implement a uniform policy regarding length of stay on an ATD, which includes defining the appropriate maximum "shelf life" of an ATD, releasing a youth earlier than this maximum where appropriate, and expediting cases that extend beyond this shelf life.
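The tracking-and-flagging step in the last recommendation can be sketched very simply. The example below assumes a hypothetical 30-day maximum and an invented roster format (case ID, program, enrollment date); the actual threshold and data fields would be set by the Court and the Collaborative.

```python
# Minimal sketch of routine length-of-stay flagging for ATD participants.
# The 30-day "shelf life" and all roster fields are hypothetical.
from datetime import date

MAX_ATD_DAYS = 30  # assumed maximum; actual policy to be set by the Court

def flag_overdue(roster: list[dict], today: date) -> list[dict]:
    """Return youth whose ATD stay has reached or passed the maximum,
    sorted longest-stay first, so their cases can be expedited."""
    flagged = []
    for youth in roster:
        days = (today - youth["enrolled"]).days
        if days >= MAX_ATD_DAYS:
            flagged.append({**youth, "days_on_atd": days})
    return sorted(flagged, key=lambda y: y["days_on_atd"], reverse=True)

roster = [
    {"case_id": "A-101", "program": "EM",   "enrolled": date(2013, 10, 1)},
    {"case_id": "A-102", "program": "ERC",  "enrolled": date(2013, 11, 20)},
    {"case_id": "A-103", "program": "ODAP", "enrolled": date(2013, 9, 15)},
]

for y in flag_overdue(roster, today=date(2013, 12, 1)):
    print(y["case_id"], y["program"], y["days_on_atd"])
```

Run weekly against each program's enrollment log, a report like this would give the Coordinator and program supervisors a routine trigger for the 30-day assessment discussed earlier, rather than relying on ad hoc recollection of who has been on an ATD too long.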
Appendix B: Draft Proposal for Needs-Based Assignment
ATD NEEDS-BASED PLACEMENT GUIDE

ERC
- Youth with no supervision in the evenings.
- Youth with no after-school activities.
- Youth who have issues with curfew.
- Youth who have known gang activity.
- Youth who have no assistance with homework / compliance issues at school.
- Youth with family issues who need time away from home.
ODAP
- Youth with family issues at home.
- Youth who need intensive support services.
- Youth with mental health issues.
Electronic Monitoring
- Youth who need strict physical monitoring or who have been ordered to a specific geographical location for public safety.
- Youth who are rearrested while on a current ATD.
- Youth who carry firearms (EM along with another ATD, and subpoenaed to court the next day).

AnyTrax Curfew Check
1. Youth who are considered for AnyTrax placement must score a 10-14 on the Risk Assessment Instrument (RAI) for ATD placement.
2. Parents of youth considered for AnyTrax must have a landline phone in place at their home for curfew check-in. While the preference is for a landline, under special circumstances a judge might allow use of a cell phone in lieu of a landline; criteria for such exceptions must be developed.
3. Youth considered for AnyTrax must be determined not to need the more restrictive ERC or ODAP services to reduce chances of re-offense and ensure their appearance at court.
4. Youth considered for AnyTrax must have adequate supervision in place at home.
5. For youth who have been detained and bonded out on pending charges, we would recommend that AnyTrax be considered as an alternative to EM.
6. The ATD work group would recommend conducting an assessment of the AnyTrax program during January 2014 to study the effectiveness of the program.
Appendix C: Sample Alternative Programs Exit Questionnaire (hard copy option)
You participated in an Orleans Parish Juvenile Court Alternative to Detention program, and we are interested in your feedback about your experiences with this program. Please take a few minutes to complete this brief, anonymous questionnaire. All the questions refer to the alternative program in which you recently participated, as shown below in #1. (If you have also participated in another alternative program, you may receive a separate questionnaire about that program.) When you are finished, please seal your completed questionnaire in the envelope provided to you and return it to _________.i Thank you very much for your feedback!
1. The alternative to detention program you participated in was: (staff to complete)ii

☐ Electronic Monitoring
☐ ODAP
☐ AnyTrax – self-report only
☐ AnyTrax – curfew monitor only
☐ AnyTrax – both self-report and curfew monitor
☐ Evening Reporting
☐ Other _______
2. Below are statements about the alternative program you participated in. For each statement, decide how much you agree or disagree with that statement and then circle the best choice. If you neither agree nor disagree or feel a statement doesn’t apply to you, circle No Opinion.
a. Staff clearly explained the program to me when I started.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree

b. From the start, I understood what I needed to do to complete the program successfully.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree

c. It was difficult for me to follow all the program rules.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree

d. The program helped me to know when to show up for court.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree

e. Program staff treated me with respect.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree

f. I felt like program staff wanted me to be successful.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree

g. The program helped me stay out of trouble.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree

h. Program staff seemed more interested in punishing me rather than helping me.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree

i. I improved my attendance at school while I was in the program.
Strongly Disagree    Disagree    No Opinion    Agree    Strongly Agree
Please continue on the next page →
3. What did you most like about the program?
_______________________________________________________________________________________
________________________________________________________________________________________
_________________________________________________________________________________________
4. What did you least like about the program?
_______________________________________________________________________________________
________________________________________________________________________________________
_________________________________________________________________________________________
5. What changes should the program make in order to better serve youth?
_________________________________________________________________________
________________________________________________________________________________________
________________________________________________________________________________________
Thank you for taking the time to complete and return this survey. We appreciate your feedback!
Please return the survey in the enclosed envelope to _____________________.
i To ensure confidentiality and encourage candid responses, the survey should be returned to a third party outside the program, such as the Coordinator.
ii To ensure accuracy, staff, not youth, should check the appropriate box on this item before the youth starts the questionnaire.