
Miami-Dade County Public Schools
Office of Evaluation and Research

1500 Biscayne Boulevard
Miami, Florida 33132

AN INTERIM EVALUATION OF OPERATION SAFETY NET

A FIVE-YEAR PROJECT 1996-97 - 2000-01

THREE YEAR REPORT 1996-97 - 1998-99

April 2000

Principal Evaluator/Author
Jerome L. Levitt, Ph.D.



Table of Contents

Executive Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vii

Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1

Description of the Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
    Student Achievement Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
    Surveys . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

Evaluation Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
    Third through Sixth Grade . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
    Computer-assisted Instructional Component . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
    OSN/Roots and Wings Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
    Selected Climate Survey Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

        Staff Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
        Parent Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

    Student Surveys . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
        Feelings about School . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
        Attitude Toward Reading Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

    Medical Screening . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
    Shared Accountability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

Discussion and Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
    Reading Comprehension Subtest Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
    Survey Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
    Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

Recommendations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26

Appendix A . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

Appendix B . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

Appendix C . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41

Bureau Response/Plans of Action to Address Evaluation Findings . . . . . . . . . . . . . . . . . . . . . . . . . . 59


List of Tables and Figures

Table 1 Number of Schools and Groups for Attitude and Climate Surveys . . . . . . . . . . . . . . . . . 4

Table 2 Title I Students with Test Scores 1994-95 to 1998-99, Repeated Measures
    Analysis of Variance of Stanford Achievement Test Reading Comprehension
    Scaled Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

Figure 1 Overall Grades, Includes All Participants . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

Figure 2 Grade 3 Students . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Figure 3 Grade 4 Students . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

Figure 4 Grade 5 Students . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

Figure 5 Grade 6 Students . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

Table 3 Computer Curriculum Corporation and Jostens Learning Corporation,
    Repeated Measures Analysis of Variance of Stanford Achievement
    Test Reading Comprehension Scaled Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

Figure 6 Grade 3 Students Comparing CCC to JLC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

Figure 7 Grade 4 Students Comparing CCC to JLC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

Figure 8 Grade 5 Students Comparing CCC to JLC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

Figure 9 Grade 6 Students Comparing CCC to JLC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

Table 3 Staff Climate Survey Means for Selected Items . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

Table 4 Parent Climate Survey Means for Selected Items . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

Table 5 Means and Standard Deviations of OSN Study Groups on the Feelings About School Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

Table 6 Means and Standard Deviations of OSN Study Groups on the
    Elementary Reading Attitude Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

Table 7 Kindergarten and 1st Grade OSN Students, Results of Medical Screening,
    1996-97, Project R.E.A.D.I.N.E.S.S., Miami-Dade County
    Department of Health/School Health . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

Figure 10 Medical Screening Findings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21


Executive Summary

Operation Safety Net (OSN) is an ambitious project intended to raise reading performance at 44 Title I schools whose consistently low performance over a two-year period had led the state to identify them as “Critically Low” schools. These schools, which currently serve approximately 18,000 students, are among the poorest in Miami-Dade County; on average, 93% of their students are eligible for free or reduced-price lunch.

OSN redirected Title I funds into two instructional interventions. The first, Success for All (SFA), was a highly structured and scripted reading curriculum developed at Johns Hopkins University by Robert E. Slavin and Nancy A. Madden. The second involved the introduction of computer-assisted instruction. SFA was implemented in all OSN schools. The computer-assisted instruction was provided by two companies: Jostens Learning Corporation (JLC) served schools mainly in the northern half of the county, and Computer Curriculum Corporation (CCC) served schools mainly in the southern half. The companies provided both the hardware and the software to equip each of these schools with five computers for each classroom in grades 1 to 4 and a computer lab. In addition to the two instructional interventions, efforts were made to provide medical screening to the students in these schools through existing community resources.

This evaluation examined the first three years of implementation of OSN (1996-97 to 1998-99); the project was expected to run two additional years (through 2000-01). The evaluation focused on reading achievement (the primary goal of the project), as well as teachers’ attitudes toward Success for All, staff and parent perceptions of school climate, and students’ attitudes toward reading and school. The evaluation consisted of a longitudinal analysis of results on the Reading Comprehension subtest of the Stanford Achievement Test and the administration of several surveys.

Curriculum and Instructional Software Results

•  Reading comprehension achievement over the period of the study showed an upward trend for both OSN schools and the other (non-OSN) Title I schools used for comparison. However, the initial gap in performance (favoring the non-OSN schools) did not appear to be closing in subsequent years.

•  An analysis of achievement for schools using CCC software vs. that for schools using the Jostens software showed greater achievement gains for the CCC schools.

Survey Results

•  Teacher survey results, obtained after the first year of program implementation, showed negative and positive reactions to the SFA curriculum in about equal proportions.

•  Teacher climate survey results (selected items), which were obtained to illustrate the perceived environment in which program teachers worked, showed that OSN teachers were experiencing far greater challenges than staff at non-OSN schools.


•  Parents (from both OSN and non-OSN schools) indicated a positive outlook toward their children’s schools.

•  Parents with children in OSN schools had more favorable attitudes toward their children’s computer training and critical thinking/reasoning instruction than did parents from the comparison schools (Title I, non-OSN).

•  Students at OSN schools had the lowest scores regarding their feelings about school and reading in comparison to students at the non-OSN schools.

Comments on the Medical Screening Component

As noted, medical screening of OSN students was the third component of the program. Initial screening, performed during the first year of the program, was implemented through the Miami-Dade County Department of Health’s Project R.E.A.D.I.N.E.S.S. This screening revealed that 10% of the kindergarten and first grade students at these schools were nutritionally at risk, based on height/weight proportions. This volunteer screening effort was not maintained in subsequent years.

Summary

Overall, this evaluation has found that Operation Safety Net has not raised the performance of these lowest performing schools to the level of other Title I schools. In fact, based on their 1998-99 performance on the Florida Comprehensive Assessment Test (FCAT) and the Florida Writing Assessment, 13 of the Operation Safety Net schools received a grade of “F” in the state accountability system, while the rest received a grade of “D.”

The initial conception of OSN involved a unified reading curriculum/school reform effort paired with a computer-assisted instructional component and some form of medical screening. Early staff dissatisfaction with SFA and complications arising from the involvement of two computer companies left the intervention without a clear identity.

The district has since turned to a more diverse approach, the Comprehensive Reading Plan, which enables greater choice at the school level. Although Operation Safety Net served to focus attention on these poorly performing schools and provided funding and initial direction for reform, it no longer seems to exist as originally conceived and might better be subsumed under the district’s current efforts to support the schools graded “D” and “F.”


Recommendations

1. It is recommended that, following the 2000-01 project year, Success for All be discontinued at Operation Safety Net schools unless there is full staff support and the financial and staff resources necessary to implement the curriculum exactly as required by the Success for All Foundation. Full implementation would make it possible to determine whether the investment in this curriculum yields the dramatic effects predicted by its developers.

2. It is recommended that the contracts with each of the computer-assisted instructional providers be reviewed in light of this evaluation and the difficulty of actualizing the shared accountability portion.

3. It is also recommended that the role of computer-assisted instruction, in some form, be expanded in Operation Safety Net schools. The Stanford Achievement Test results in this evaluation show promise for at least one of the companies, even though students were exposed to the computer software for only 20 minutes per day. In addition, parents of students in Operation Safety Net schools recognize the opportunity their children are getting by being exposed to computer skills in whatever form; most of these parents may not have the resources to purchase computers for their homes, and they rate the schools’ provision of computer education very positively.

4. It is recommended that community involvement in the provision of medical screening and follow-up services be further encouraged for all elementary students at poorly performing schools.

5. Finally, it is recommended that the concept of Operation Safety Net be reviewed in light of the changes in Success for All implementation, the limited medical screening, the variable results of computer-assisted instruction, and the introduction of the Comprehensive Reading Plan.


Introduction

The re-authorization of Title I in 1995 required each state to develop proficiency standards for Title I schools that would ensure Title I students were performing at a level comparable to all students in the state. This prompted the Florida Department of Education (FDOE) to establish statewide proficiency standards for all schools, including Title I schools. These standards shifted the focus from yearly improvement of the whole school to a standard set of performance criteria: a school’s competence was to be judged by the percentage of students meeting those criteria in reading, mathematics, and writing at designated grade levels — the fourth grade for elementary schools, the eighth for middle schools, and the 10th and 11th for senior high schools. If a school failed to reach the established percentage of students meeting the criteria in each of the three areas for two consecutive years, the state designated that school “Critically Low.” (See Appendix A for the complete criteria.)

In November 1995, the FDOE published a list of “Critically Low” schools based on school performance in the 1993-94 and 1994-95 school years. Fifty-one schools in Miami-Dade County were on that list. Thirty-eight elementary schools on the list, plus several schools whose grade configurations were K-3 and whose students usually went on to one of the identified schools, became the focus of the district’s attention. In response, Operation Safety Net (OSN) was conceived as a massive, three-pronged approach to improving reading achievement in this group of schools. During the 1995-96 school year, a decision was made to reallocate Title I resources in order to provide two major instructional innovations.

The first of these innovations was the Success for All (SFA) reading curriculum developed at Johns Hopkins University by Robert Slavin and Nancy Madden. It was expected that this curriculum would dramatically improve student performance in reading, as evidenced by increases in Stanford Achievement Test Reading Comprehension scores. The program uses a formalized, scripted teaching approach, supplemented by tutoring, and includes regular assessment, parental support, and a school-based facilitator. It is viewed as a school restructuring model and requires an eighty percent vote of confidence by school staff prior to implementation. OSN schools were called upon to embrace this curriculum because of its positive impact on high-risk students, as reported by its developers.

The second of these instructional innovations was a computer-assisted tutorial component for reading and mathematics. In October 1996, contracts were signed with the Computer Curriculum Corporation (CCC) and the Jostens Learning Corporation (JLC) for the hardware and software to provide each classroom with computers and each school with a computer lab. Each classroom in grades 1 to 4 received five networked workstations, which in turn were linked with all other classrooms so that a uniform technology was available throughout the school. The OSN schools were divided geographically into two groups: CCC was given the schools in the south of Miami-Dade, while JLC was given the schools in the north. The computers were to be phased in over the fall and winter of the 1996-97 school year. The contracts included a performance clause called Shared Accountability, whereby the companies guaranteed a level of student improvement given a minimum time on the computers.

In addition to the two instructional components, OSN was to include a medical screening component, a non-funded redirection of local medical resources to the OSN schools. To accomplish this objective, Project R.E.A.D.I.N.E.S.S. (Every child’s Right to Early Assessment, Detection, Intervention, and Networking for Early Super Success), developed in 1993 by the School Health Program, a project of the Dade County Public Health Unit, agreed to participate at the request of former Superintendent Octavio Visiedo. This effort consisted of a nursing assessment of each kindergarten and first grade child that screened for dental status, nutritional status (height and weight), pediculosis (head lice), and skin conditions (infections, fungi, inflammations, or other lesions or abnormalities) (Project R.E.A.D.I.N.E.S.S. Results 1996-97 School Year, Dade County Department of Health/School Health). In addition to these screenings, students are normally screened for vision and hearing difficulties.

Description of the Evaluation

An evaluation of the Success for All component was conducted by the Office of Evaluation and Research (Urdegar, 1998) using a sample of six OSN schools compared to three schools using Success for All but not part of OSN, three schools using SRA/Reading Mastery, and six Title I schools that were not part of Operation Safety Net. The 12 comparison schools were carefully selected to match the OSN schools demographically. That evaluation did not find that the Success for All curriculum offered any advantage over the other curricula in place during that time period.

The present evaluation examines the totality of OSN in its effort to improve students’ reading levels as measured by Reading Comprehension scores on the Stanford Achievement Test. The target population of the OSN intervention was the students at the 38 elementary schools identified as “Critically Low” by the state on the basis of 1993-94 and 1994-95 fourth grade achievement data, plus the primary schools that fed into those schools. The initialization of the Success for All curriculum and the installation of the hardware and software for the computer instructional component occurred during the 1996-97 school year; the first full year of implementation was 1997-98. All students who were enrolled in a Title I school at the time of the 1997-98 Stanford Achievement Test administration, who had a valid 1997-98 Reading Comprehension subtest score, and who took the Stanford Achievement Test at the same school for the three or four years prior to 1998-99 were identified. This enabled us to examine the pattern of achievement of students who were in attendance before and during the initial implementation of most of the OSN components. Students at Title I schools that were not OSN schools were used as a comparison group. In addition to reading performance, student, staff, and parent survey results were examined.


Student Achievement Analysis

The essential question of this evaluation was whether a combination of major interventions in these schools would result in an increase in the Stanford Achievement Test scores of OSN schools. The introduction of the Success for All curriculum, a major computer-assisted instructional component, and medical screening were to provide the extra help required to raise the SAT scores of the students at these schools.

Since the impetus for OSN was the very poor achievement test performance at these schools in 1993-94 and 1994-95, the 1994-95 test score results were chosen as the appropriate starting point for determining whether the pattern of performance was influenced by the OSN intervention. To that end, Stanford Achievement Test Reading Comprehension subtest results were obtained for all students who met the criteria described above, from the years prior to the initialization of the project through 1998-99. The number of years of available data for each student depended on their grade level.

The focus of the data analysis was to compare the reading achievement of all non-OSN Title I schools with that of all OSN schools. Specifically, the analysis asked whether students who had been attending OSN schools showed any improvement in test scores over the last three or four years when compared to the rest of the Title I population. Using repeated measures analysis of variance, we examined growth in test scores over three- and four-year periods for students who had attended an OSN school and for students who had attended non-OSN Title I schools, in each case requiring that the student had been tested in the same school for three or four consecutive years.

We expected a significant result for growth over the years because of the way the Stanford Achievement Test scaled scores are constructed: the average scaled score for a grade (the 50th percentile) represents a higher level of performance (a year’s growth) at each consecutive grade level. Thus, even a child who stayed at the same relative position on the test (scoring at the 50th percentile each year) would have a “significantly” higher scaled score each year. Given this expected scaled score growth, the question asked here is: are there any differences in the pattern of performance of the OSN group when compared to the non-OSN group?

This analysis assumes that the OSN students’ Reading Comprehension subtest scores began significantly below those of the other Title I students, since the OSN students attended “Critically Low” schools that were defined in part by their extremely low Stanford Achievement Test scores. If the intervention had an effect, the difference between the two groups’ scaled score means would diminish over time, so that in the last period examined (1998-99) there would be no significant difference between the groups.
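The pattern this analysis looks for — both groups growing, with the gap shrinking only if the intervention works — can be sketched with a short numpy example. The trajectories below are invented purely for illustration; they are not the study’s data:

```python
import numpy as np

# Invented five-year scaled-score trajectories (illustration only).
years      = ["1994-95", "1995-96", "1996-97", "1997-98", "1998-99"]
comparison = np.array([540.0, 580.0, 602.0, 622.0, 638.0])

# An effective intervention: the treated group grows faster, so the
# gap shrinks toward zero (a time-by-group interaction).
effective  = np.array([520.0, 566.0, 594.0, 618.0, 636.0])

# No differential effect: both groups grow, but in parallel, so the
# gap stays constant (no interaction beyond the baseline deficit).
no_effect  = comparison - 20.0

print("gap, effective intervention:", comparison - effective)
print("gap, no differential effect:", comparison - no_effect)
```

Both simulated groups show the built-in year-to-year scaled-score growth; only the first pair of trajectories converges, which is the signature the repeated measures analysis tests for.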

Surveys

In addition to the analysis of Stanford Achievement Test data, several surveys were used to determine the affective response of teachers, students, and parents to this effort. An initial survey was administered to teachers at the end of the first year of implementation (April 1997). In the following year (1997-98), a set of two surveys was administered to a small sample of students in OSN and several comparison groups. In addition, climate survey data from teachers and parents were examined for the second year of the intervention, 1997-98.

In April 1997, in response to concerns about the acceptance of OSN in the schools, teacher surveys were conducted at all OSN schools. These surveys were directed primarily at the Success for All curriculum, in particular its Reading Roots and Reading Wings instructional components. There were also several questions regarding teacher expectations for the technology and medical components of OSN. The purpose of the survey was to determine the general state of acceptance of the Roots and Wings components by school personnel.

To determine whether OSN had any attitudinal effects on students, an attitude survey was administered to students at a selected sample of 31 schools. It actually consisted of two surveys: a general survey called “Feelings About School,” constructed from items developed by Estrada (1996) and Ford & Harris (1996), and the “Elementary Reading Attitude Survey” (McKenna & Kear, 1990). A sample of grades from six groups of OSN and non-OSN schools was selected to respond to these surveys. Twelve of the 31 schools made up the first three groups: six were OSN schools (three CCC schools and three JLC schools), three used the Success for All curriculum but were not OSN schools and therefore did not have the technology component, and three used the Science Research Associates (SRA) direct instruction curriculum. Of the remaining 19 schools, seven were Title I funded schools using a variety of other reading curricula and 12 were non-Title I schools with demographic characteristics approximating the Title I population. Table 1 presents the distribution of the 31 schools among the six groups.

As part of this study, the school-level Parent and Staff Climate Survey results from the 1997-98 annual spring administration were examined (Blueprint 2000 School Climate Survey Report for the 1997-98 Administration). This is a 35-item survey focused on school satisfaction issues. For this study, a select group of items was examined for the same schools used for the student surveys.

Table 1
Number of Schools and Groups for Attitude and Climate Surveys

            Operation Safety Net
            (all schools use SFA and
            computer-assisted instruction)    Comparison Groups
                                                           Title I,        Non-Title I,
Grades      CCC         JLC                   SFA   SRA    Other Reading   Other Reading   Totals
                                                           Curricula       Curricula
3            1           1                     1     1      3               4               11
4            1           1                     1     1      2               4               10
5            1           1                     1     1      2               4               10
Totals       3           3                     3     3      7              12               31

Evaluation Results

An initial analysis of the Reading Comprehension scaled scores was performed combining all grades. This repeated measures analysis compares the results of the last five years of testing (where available) for the OSN vs. non-OSN (Title I) students. There was a significant time-by-group interaction: in addition to test scores rising significantly each successive year, the two groups’ patterns of growth differed over the five years. The grade-to-grade growth confirms that students in both groups were learning, but they were not learning at the same rate over the five years.

An examination of the means, as seen in Table 2, indicates that the two groups were significantly different prior to the onset of OSN and remained significantly different in each of the following years. In this analysis, as in all subsequent analyses, the non-OSN group had higher scores than the OSN participants. The mean difference was largest in 1994-95, the year the OSN schools exhibited their lowest performance levels and were identified as critically low. Figure 1 depicts the relationship between the groups; the difference between the group means for each year is displayed in the inset horizontal bar graph.

For each of the following four years, during which OSN was implemented, the mean differences were practically the same. It is most likely the reduction from this initial difference in the subsequent years that yielded the statistically significant result for this analysis (see Table 2 and Figure 1).

•  An analysis over all grades for the five-year period from 1994-95 through 1998-99 showed an initial reduction (prior to the onset of the project) in the mean difference between OSN and non-OSN schools, but no subsequent reduction in the following four years.
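The flat-gap finding can be checked directly from the overall-grades means reported in Table 2; a brief numpy sketch recovers the yearly OSN vs. non-OSN differences:

```python
import numpy as np

# Overall-grades mean scaled scores from Table 2.
years   = ["1994-95", "1995-96", "1996-97", "1997-98", "1998-99"]
osn     = np.array([522.51, 564.92, 586.17, 606.14, 620.64])
non_osn = np.array([542.02, 582.75, 603.96, 624.37, 638.22])

gap = non_osn - osn  # per-year difference favoring non-OSN schools
for y, g in zip(years, gap):
    print(f"{y}: mean difference = {g:.2f}")
```

The largest difference falls in 1994-95 (before the project began); from 1995-96 onward the gap hovers near 18 points, consistent with the conclusion that the initial deficit was never closed.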


[Figure 1. Overall Grades, Includes All Participants: mean Reading Comprehension scaled scores (vertical axis, 520-640) for OSN and non-OSN students, 1994-95 through 1998-99, with an inset horizontal bar graph of the yearly mean differences.]

Table 2
Title I Students With Test Scores 1994-95 to 1998-99
Repeated Measures Analysis of Variance of
Stanford Achievement Test Reading Comprehension Scaled Scores

                                        96-97                                  Time by Program
Grades      Group      94-95    95-96   (first year   97-98    98-99    N      Participation F
                                        of OSN)
Overall     OSN        522.51   564.92  586.17        606.14   620.64   4802    3.65 **
Grades      non-OSN    542.02   582.75  603.96        624.37   638.22   7322
3rd Grade   OSN        —        —       491.29        550.94   581.64   1743   13.80 ***
            non-OSN    —        —       513.62        568.69   599.40   2791
4th Grade   OSN        —        493.77  548.51        581.27   594.54   1149     .65 ns
            non-OSN    —        514.48  568.51        601.23   616.02   1919
5th Grade   OSN        488.96   550.52  579.44        595.92   617.92    934    2.97 **
            non-OSN    511.94   569.86  598.88        615.33   637.71   1668
6th Grade   OSN        546.00   577.91  592.65        616.48   624.90   1172     .06 ns
            non-OSN    565.07   597.24  611.69        635.81   644.24   2304

**p<.05   ***p<.001   ns = not significant


[Figure 2. Stanford Achievement Test Reading Comprehension scaled score means (480-600) for Grade 3 students, OSN and non-OSN, 1996-97 through 1998-99, with an inset bar graph of the yearly mean differences.]

Third through Sixth Grade

The previous overall analysis combined all the grades to provide a general overview of performance, but these results somewhat masked the precision of the Stanford Achievement Test, which is normed by grade level. Reading Comprehension scores for students in the third, fourth, fifth, and sixth grades in 1998-99, who had been tested in the same school for three or four consecutive years, were included in this next series of analyses.

As Table 2 indicates, students in the third grade in 1998-99 had three years of data available, starting with first grade scores in 1996-97. The third grade students showed expected significant growth over the three years of OSN, especially between the first and second grades. The OSN students made a slightly greater gain than did those from the other Title I schools from 1996-97 to 1997-98, but the following year gains were similar, as indicated by the parallel lines. As in the overall data, the significant result is most likely attributable to that first year of change, which did coincide with the initial implementation of OSN. However, as seen in Figure 2, results for the following two years did not continue to exhibit this convergence.


[Figure 3. Stanford Achievement Test Reading Comprehension scaled score means (480-620) for Grade 4 students, OSN and non-OSN, 1995-96 through 1998-99, with an inset bar graph of the yearly mean differences.]

Students who were fourth graders in 1998-99 were examined starting with first grade in 1995-96. The repeated measures analysis for these students did not reveal a relationship between the change in scores over the four grade levels and whether the students were in an OSN or non-OSN school (see Figure 3 and Table 2). As these students continued through the second and third grades, the differences between the OSN and non-OSN schools remained the same, but by the fourth grade there was a slight divergence, with the non-OSN participants making a slightly greater gain than the OSN students.


[Figure 4. Stanford Achievement Test Reading Comprehension scaled score means (480-640) for Grade 5 students, OSN and non-OSN, 1994-95 through 1998-99, with an inset bar graph of the yearly mean differences.]

Fifth grade results showed a significant test by group interaction; that is, there was a significant change over the five-year period that related to attendance in OSN or non-OSN schools. An examination of the means shows a pattern similar to grade three, with the scores of first graders showing the most divergence between the two groups. This again represents the year these schools were designated as “Critically Low Performing.” Following this year, however, third, fourth, and fifth grade scores remained almost the same distance apart (Figure 4 and Table 2).


[Figure 5. Stanford Achievement Test Reading Comprehension scaled score means (540-660) for Grade 6 students, OSN and non-OSN, 1994-95 through 1998-99, with an inset bar graph of the yearly mean differences.]

Sixth graders in this analysis represent both elementary and middle school students. Most of the sixth graders who had been identified in earlier grades were attending middle school in 1998-99. In order to include these students in our analysis, we identified those students who had been tested at either an OSN or non-OSN school for the previous four years (second through fifth grades), regardless of their sixth grade enrollment. We then conducted a repeated measures analysis on those students for whom we had five consecutive years of test data. No significant relationship was seen between growth over the test administrations and enrollment in OSN vs. non-OSN schools. An examination of Figure 5 and Table 2 reveals a parallel performance pattern for the two groups for all five years.

•	Overall, the Reading Comprehension data presented do not reveal any real change in the relationship between the performance of OSN schools and non-OSN schools. Although there were some significant effects pointing to differences between the OSN and non-OSN schools, these differences occurred in the first year of implementation, and the strength of these differences is very weak; in statistical parlance, they have very small effect sizes. Simply put, the OSN intervention has not reduced the difference between these critically low schools and the other Title I schools.


Computer-assisted Instructional Component

In addition to examining the overall data, we hoped to determine whether the two computer-assisted instructional components (CCC or JLC) differed in the extent to which they benefited students. Repeated measures ANOVAs were performed for the third, fourth, fifth, and sixth grades of OSN schools, comparing the two computer components over the three years during which CCC and JLC were implemented.

An examination of Table 3 reveals a significant relationship between the particular computerized instructional component and time; that is, over the three years the relationship between the two groups changed. In the first time period (1996-97), the year in which full implementation began, students receiving computerized instruction using the CCC curriculum scored below students receiving instruction from the JLC curriculum. However, in subsequent years, at least for students who were in the third, fourth, and fifth grades in 1998-99, the CCC participants appeared to be improving at a slightly higher rate. The sixth graders, although having narrowed the gap from 1996-97, when they were fourth graders, remained below the performance of the JLC participants.

Table 3
Computer Curriculum Corporation and Jostens Learning Corporation
Repeated Measures Analysis of Variance of
Stanford Achievement Test Reading Comprehension Scaled Scores

1998-99 Grade    Group     96-97    97-98    98-99      N   Time by Program Participation F
3rd Grade        CCC      485.50   549.35   580.31    799   21.99 ***
                 JLC      496.42   552.22   582.39    959
4th Grade        CCC      542.13   577.58   593.50    995   25.00 ***
                 JLC      547.78   579.54   591.39   1029
5th Grade        CCC      574.44   593.73   616.24   1022   10.24 ***
                 JLC      576.39   591.33   615.25   1004
6th Grade        CCC      588.12   615.10   622.02    789    5.40 ***
                 JLC      591.55   615.17   623.64    896

Note: 1996-97 was the first year of OSN.
***p<.001


[Figure 6. Stanford Achievement Test Reading Comprehension scaled score means (480-600) for Grade 3 students, CCC vs. JLC, 1996-97 through 1998-99, with an inset bar graph of the yearly mean differences.]

Overall, the difference between the two groups was very slight; however, the trend in the third and fourth grades is notable. Those students who were in the third and fourth grades in 1998-99 had maximum exposure to these computerized instructional curricula, as they were in the first and second grades, respectively, in 1996-97. Since these curricula were phased in during the 1995-96 school year, the fourth graders have received the most instruction on these systems.

Focusing on the third graders first, we note an initial difference of almost 11 scaled score points when they were first graders in 1996-97 (Figure 6). This difference narrowed in each subsequent year. If this trend were to continue, we would expect the CCC participants to exceed their JLC counterparts on the fourth grade exams of 1999-2000. This reversal is exactly what is seen in the scores of those students who were fourth graders in 1998-99.


[Figure 7. Stanford Achievement Test Reading Comprehension scaled score means (540-600) for Grade 4 students, CCC vs. JLC, 1996-97 through 1998-99, with an inset bar graph of the yearly mean differences.]

These fourth grade students exhibited the same pattern as the third graders. In 1996-97, when they were second graders, the students using the JLC curriculum exceeded the performance of the students using CCC by 5.7 scaled score points. The following year that difference was reduced to only about 2 points, and in 1998-99 the CCC participants exceeded the JLC participants by 2 points. These differences are very slight, but they do indicate a trend.

Fifth and sixth graders show similar initial differences, when they were third and fourth graders respectively, and a similar narrowing of the difference or reversal the following year. However, their 1998-99 scores do not show the same trend as did the third and fourth graders'. For the fifth graders there is a reversal in the fourth grade, with CCC students exceeding the JLC students by two points, which narrows to 1 point in the fifth grade (see Figures 8 and 9).

For the sixth graders, there is no difference in fifth grade (1997-98) scores, and the students who had been using the JLC curriculum scored almost 2 points higher than the CCC students in the sixth grade. It is important to note that most of the sixth grade population moved to middle school in the 1998-99 school year and no longer had access to these two instructional components.


[Figure 8. Stanford Achievement Test Reading Comprehension scaled score means (570-620) for Grade 5 students, CCC vs. JLC, 1996-97 through 1998-99, with an inset bar graph of the yearly mean differences.]

[Figure 9. Stanford Achievement Test Reading Comprehension scaled score means (580-630) for Grade 6 students, CCC vs. JLC, 1996-97 through 1998-99, with an inset bar graph of the yearly mean differences.]


These data indicate that for the students who were exposed to these computer-assisted instructional models early in their educational careers, the CCC curriculum seemed to have some advantage. This effect is mitigated by the small statistical effect size associated with the difference between the two computer instructional models. In other words, although there is a significant difference, it is very small in practical terms. However, any difference is remarkable given that students are exposed to the program for only 20 minutes per day.

OSN/Roots and Wings Survey

In response to concerns about the acceptance of OSN in the schools, a survey was conducted at all OSN schools in the spring of 1997. This survey was primarily directed at the Success for All curriculum, although there were several questions regarding teacher expectations for the technology and medical components of OSN. The purpose of the survey was to determine the general state of acceptance of the Roots and Wings component of OSN by school personnel. Questions covering everything from training and program operation to more general perceptions were presented using a four-point Likert-type response scale (Strongly Agree, Agree, Disagree, and Strongly Disagree). This survey is presented in Appendix B.

A statistical procedure called factor analysis was performed to determine if there were groups of questions whose responses appeared to “cluster together” and therefore addressed the same issues. The analysis revealed that only one factor was meaningful. This factor was named the Index of Satisfaction and is a combination of 17 of the 35 items. Since the response scale consisted of 4 ratings (1=Strongly Disagree to 4=Strongly Agree), a mean exceeding 2.5 on this index would indicate general satisfaction with the OSN/Roots and Wings component.
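As a sketch of how such an index works, the following computes a respondent-level Index of Satisfaction as the mean of the 17 item ratings and compares it to the 2.5 scale midpoint. The ratings shown are hypothetical, not actual survey responses.

```python
# Illustrative sketch (not the actual survey data): the Index of Satisfaction
# is the mean of a respondent's ratings on 17 of the 35 items, each rated
# 1 (Strongly Disagree) to 4 (Strongly Agree). A mean above the scale
# midpoint of 2.5 indicates general satisfaction.

def index_of_satisfaction(ratings):
    if len(ratings) != 17:
        raise ValueError("expected ratings for all 17 index items")
    return sum(ratings) / len(ratings)

# Hypothetical respondent
sample = [3, 2, 3, 3, 2, 4, 3, 2, 2, 3, 3, 2, 3, 2, 3, 3, 2]
score = index_of_satisfaction(sample)
print(round(score, 2), "satisfied" if score > 2.5 else "not satisfied")
```

School-level indexes would then be means of these respondent scores, which is the quantity used for the median split described below.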

The median Index of Satisfaction was 2.53. The schools were ranked from the highest to the lowest Index of Satisfaction and were split into two groups: one set of schools whose Indexes of Satisfaction were above the median of 2.53 and another group whose indexes were below the median. An examination of these two groups indicated that three times as many of the more satisfied schools had been Success for All schools prior to this year than was the case for the less satisfied schools. The one school that had previously been a Success for All school and was in the less satisfied group has since dropped Success for All. The less satisfied schools had lower return rates and more non-standard grade organizations. It appeared that Success for All was not consistently supported among the OSN schools.

When asked to make an overall judgment about the OSN components, respondents agreed that Roots and Wings, the technology component, and medical screening would have a positive impact on student performance. Those respondents who had begun using the technology component believed it would support reading and mathematics and provide precise information about their students' progress.


Selected Climate Survey Data

Parent and Staff Climate Survey data for 1997-98 were examined for the same schools at which the Student Attitude Surveys were administered (see Table 1). Both the Parent and Staff Surveys consist of statements which are rated by the respondents on a 5-point Likert-type scale (Strongly Agree, Agree, Unsure/Unknown, Disagree, and Strongly Disagree). School-level means for each of these points were calculated, and then the average percent of Agree and Strongly Agree was calculated. These data represent school-level data; school means, rather than individual staff and parent responses, were used to calculate group means. Table 3 displays the means for selected items from the Staff Survey and Table 4 displays the Parent Survey means. Although this is a limited sample, there are some indications of differences in staff and parent perceptions.

Staff Survey  A sample of four negative statements and three positive statements was selected from the Staff Climate Survey to reflect the academic environment at each of the sampled schools at the end of the first full year of OSN implementation. The selected statements were:

Negative Statements
My ability to do the best possible job at this school is limited by:
	too many students in each class.
	student deficiencies in basic academic skills.
	insufficient resources (e.g., funds, books, equipment, supplies, etc.).
I frequently feel overloaded and overwhelmed while working at my school.

Positive Statements
Students generally come to my class at the beginning of the term prepared for the grade level or courses I teach.
I believe children attending my school are receiving a good education.
The overall climate or atmosphere at my school is positive and helps students learn.

The italicized words in each statement are presented in Table 3 below.

Individual item means, and the means of the negative and positive statements combined, reveal that OSN schools in this sample had the highest agreement with the negative items, followed by the Success for All only schools. The lowest total mean of agreement with the negative items was for those schools implementing the SRA Direct Instruction curriculum (23.05%). Looking at the positive statements, staff agreement at the Success for All only schools was lowest (28.33%), followed by the OSN school staff (29.22%). The highest total mean for the positive statements was for the SRA Direct Instruction schools (36.06%).

Table 3
Staff Climate Survey Means for Selected Items

                                                  Study Groups
                                           Success for  SRA Direct   Sample of      Sample of
                                   OSN     All Only     Instruction  Other Title 1  Non-Title 1
Staff Climate Survey Items         (6)*    (3)          (3)          Schools (7)    Schools (12)
Too Many Students                 30.08      33.00         28.17        32.29          31.75
Students Come with Deficiencies   36.67      39.67         29.67        34.50          28.71
Insufficient Resources            25.08      24.00         15.17        17.00          22.21
Overloaded and Overwhelmed        28.75      21.83         19.17        24.57          21.67
Negative Statements Means         30.15      29.63         23.05        27.09          26.09
Students Prepared                 12.92      12.50         21.00        15.07          21.54
Students Get Good Education       38.83      36.33         44.17        43.29          42.38
Positive Climate                  35.92      36.17         43.00        42.50          40.79
Positive Statements Means         29.22      28.33         36.06        33.62          34.90

*Number of schools

Parent Survey  The Parent Climate Survey consisted of almost all positive statements, which parents rated. Parents showed more agreement with positive statements (see following box) across all the study groups. The total means for parents ranged from 40.27 percent agreement to 43.97 percent agreement (Table 4). The lowest total mean was from parents whose children attended schools implementing Success for All only, and the highest parental mean was for those schools implementing SRA Direct Instruction. Parents rated the OSN schools just below the SRA schools, at 43.31 percent agreement with positive statements.

My child’s school:
	maintains high academic standards.
	makes available textbooks, equipment, and supplies needed for learning.
My child’s school is effectively teaching students:
	the basic academic skills in reading.
	the basic academic skills in mathematics.
	to speak and write correctly in English.
	to use computers.
	to think critically and reason out problems.
	to develop good study and work habits.
My child is getting a good education at this school.
The overall climate or atmosphere at my child's school is positive and helps my child learn.

The italicized words in each statement are presented in Table 4 below.

Table 4
Parent Climate Survey Means for Selected Items

                                                  Study Groups
                                           Success for  SRA Direct   Sample of      Sample of
                                   OSN     All Only     Instruction  Other Title 1  Non-Title 1
Parent Climate Survey Items        (6)     (3)          (3)          Schools (7)    Schools (12)
High Academic Standards           36.92      36.17         39.33        37.00          38.33
Enough Textbooks, Equip.          44.42      40.83         45.33        43.64          43.13
School Teaches Reading            44.67      43.17         47.00        44.93          44.79
School Teaches Math               45.08      43.67         46.67        46.29          45.33
School Teaches to Speak and
  Write English                   44.58      43.67         46.67        45.07          44.92
School Teaches Computers          43.42      31.33         37.50        36.29          35.00
School Teaches to Think
  Critically and Reason           41.33      38.00         40.83        39.86          38.63
School Teaches Good Study Habits  44.00      41.67         43.67        43.64          43.17
My Child Gets a Good Education    44.83      42.50         46.67        46.00          44.79
Overall Positive Climate          43.83      41.67         46.00        45.36          44.67
Total Means                       43.31      40.27         43.97        42.81          42.28

Of particular interest in the parent survey was the recognition by parents from OSN schools that the school “Teaches Computers.” Forty-three percent of these parents agreed with this item, considerably higher than all the other groups. Across the two respondent groups, there appeared to be greater variability among the staff responses than among the parent responses for each of the study groups.

Student Surveys

Student Surveys were sent out to 31 schools, which comprised a sample of five groups. Twelve of these schools were part of the Success for All evaluation study (Urdegar, 1998) and were all receiving Title 1 funds (see Table 1). Of the remaining 19 schools, seven were Title 1 funded schools using a variety of other reading curricula and 12 were a group of non-Title 1 schools with demographic characteristics approximating the Title 1 population.

Each school’s entire third, fourth, or fifth grade was surveyed. The specific grade was randomly selected such that each group of schools had all the grades represented in the survey. Approximately 4000 surveys were distributed to these schools and 2818 surveys were returned. Unfortunately, the number of fifth grade responses was so low that these data could not be used in the analysis. Therefore, the analysis discussed below is limited to the third and fourth grades, which were fairly well distributed across the groups.

For survey data to be useful they must be relatively consistent; that is, the relationship between the way individuals respond to each of the items and to the overall survey should be similar. To check the internal consistency, or reliability, of a survey, a statistic known as Cronbach’s alpha is calculated. This statistic represents the degree (from 0 to 1) to which the survey is consistent in its measurement. Since the survey was composed of two different surveys, an alpha was calculated for each survey as well as for both surveys together. The Feelings About School survey had an alpha of .83, the Attitude Toward Reading survey had an alpha of .91, and the alpha for both surveys combined was .91. These are relatively high reliabilities and they support the usefulness of these surveys.
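For readers unfamiliar with the statistic, a minimal sketch of Cronbach's alpha follows: alpha = k/(k-1) × (1 − Σ item variances / variance of total scores). The response matrix below is made up purely for illustration and has no connection to the actual survey data.

```python
# Sketch of Cronbach's alpha, the internal-consistency statistic described
# above. The response matrix is illustrative, not the actual survey data.

def cronbach_alpha(responses):
    """responses: list of respondents, each a list of item ratings."""
    k = len(responses[0])  # number of items

    def variance(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([r[i] for r in responses]) for i in range(k)]
    total_var = variance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [4, 4, 3, 4],
    [3, 3, 3, 2],
    [2, 2, 1, 2],
    [4, 3, 4, 4],
    [1, 2, 2, 1],
]
print(round(cronbach_alpha(responses), 2))
```

When respondents who rate one item highly tend to rate the others highly as well (as in this toy matrix), the total-score variance dominates the item variances and alpha approaches 1.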

The first of the two scales was the Feelings About School Survey. This survey is a general attitude survey which asks students to indicate how they feel about various school-related areas (see Appendix C).

An analysis of variance was performed on the survey results to determine if the various groups differed in their feelings about school or their attitude toward reading. The Feelings About School survey used a 5-point scale, with 5 being the most positive and 1 being the least positive. The analysis revealed a significant effect of group membership (F=5.394, p<.05) on the Feelings About School survey. The means and standard deviations are displayed in Table 5 below. An examination of the means indicates that students attending the schools in this sample implementing the SRA curriculum had the most positive attitude about school. A statistical examination of the mean differences between the groups indicates a significant difference between the students at schools implementing SRA and the students at the OSN schools in this sample (mean difference=.19, p<.001) and students at non-OSN schools implementing Success for All (mean difference=.17, p<.002). There were no other significant differences between the means of the other groups.

Table 5
Means and Standard Deviations of OSN Study
Groups on the Feelings About School Survey

Groups (F=5.394**)                Mean   Std. Deviation      N
OSN                               3.31        0.46         205
Success for All Only              3.33        0.52         247
Sample of Other Title 1 Schools   3.41        0.47         459
Sample of Non-Title 1 Schools     3.41        0.46         689
SRA Direct Instruction            3.50        0.42         169
Total                             3.39        0.47        1769

**p<.05

Attitude Toward Reading Survey

The Attitude Toward Reading survey (McKenna & Kear, 1990) consists of twenty items which ask students to indicate how they feel about various reading activities. The student responds by circling a picture of the cartoon character “Garfield” whose facial expression and body posture match the way the student feels about that item. The characters are drawn to represent a range from happy to very upset. This is a 4-point forced-choice Likert-type scale. The survey is scored by assigning points to each of the “Garfields,” 4 points for the happiest to 1 point for very upset. These points are then added, for a maximum of 80 points. The twenty items also represent both academic and recreational reading, and the survey is scored for each of those categories. Each category of reading has a maximum of 40 points.
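The scoring rules above can be sketched as follows. The split into ten recreational and ten academic items and the sample ratings are hypothetical, since the survey form itself is not reproduced here.

```python
# Sketch of the Elementary Reading Attitude Survey scoring described above:
# 20 items rated 1 (very upset Garfield) to 4 (happiest Garfield), split
# into recreational and academic subscales of 10 items each (max 40 per
# subscale, 80 overall). The item assignment here is assumed for illustration.

def score_survey(recreational_items, academic_items):
    assert len(recreational_items) == 10 and len(academic_items) == 10
    assert all(1 <= r <= 4 for r in recreational_items + academic_items)
    rec = sum(recreational_items)
    acad = sum(academic_items)
    return {"recreational": rec, "academic": acad, "total": rec + acad}

# Hypothetical student response
scores = score_survey(
    recreational_items=[3, 4, 2, 3, 3, 4, 2, 3, 3, 2],
    academic_items=[2, 3, 3, 2, 4, 3, 2, 3, 3, 2],
)
print(scores)
```

A student circling the happiest Garfield on every item would score the maximum of 40 on each subscale and 80 overall.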

An analysis of variance was performed for each of the reading attitude subscales, Recreational Reading and Academic Reading, to determine if the various groups responded differently to them. An examination of Table 6 reveals that both the Recreational Reading scale (F=6.705, p<.05) and the Academic Reading scale (F=4.405, p<.05) had a significant effect for groups. An examination of the means indicated that students participating in OSN schools showed the lowest mean scores in the two categories (28.55 and 28.46, respectively). Post hoc tests of these means revealed that the OSN students had a significantly less positive response to items reflecting recreational reading when compared to students in schools implementing the SRA curriculum (mean difference=2.33, p<.01) and non-Title 1 students (mean difference=1.61, p<.05). Also, OSN students were significantly less positive on items reflecting academic reading than other Title 1 students (mean difference=1.75, p<.05). The other groups did not have any significant differences between them.


Table 6
Means and Standard Deviations of OSN Study
Groups on the Elementary Reading Attitude Survey

Survey                   F        Groups                            Mean   Std. Deviation      N
Recreational Reading   6.705**    OSN                               28.55       6.86         205
                                  Sample of Other Title 1 Schools   29.38       7.07         459
                                  Success for All Only              29.84       6.52         247
                                  Sample of Non-Title 1 Schools     30.16       6.02         689
                                  SRA Direct Instruction            30.88       6.35         169
                                  Total                             29.80       6.53        1769
Academic Reading       4.405**    OSN                               28.46       6.89         205
                                  Sample of Non-Title 1 Schools     29.40       6.27         689
                                  Success for All Only              29.82       6.83         247
                                  SRA Direct Instruction            29.99       6.36         169
                                  Sample of Other Title 1 Schools   30.21       7.09         459
                                  Total                             29.62       6.66        1769

**p<.05


[Figure 10. Results of Project R.E.A.D.I.N.E.S.S. medical screenings of Kindergarten and 1st grade OSN students, 1996-97.]

Medical Screening

Project R.E.A.D.I.N.E.S.S. was established by the Miami-Dade County Department of Health’s School Health Program in 1993 to screen Kindergarten and 1st grade students for dental status, nutritional status (height and weight), pediculosis (head lice), and skin conditions (infections, fungi, inflammations, or other lesions or abnormalities). At the request of former Superintendent Octavio Visiedo, the project redirected its resources to serve OSN schools. At the end of the 1996-97 school year, it submitted a report which delineated the results of those screenings. Table 7 and Figure 10 summarize the results for the 32 OSN schools that were served by Project R.E.A.D.I.N.E.S.S.

Table 7
Kindergarten and 1st Grade OSN Students
Results of Medical Screening 1996-97 Project R.E.A.D.I.N.E.S.S.
Miami-Dade County Department of Health/School Health

                              Dental   Hts/Wts   Pediculosis    Skin   Total Number of Screenings
Number of Students Screened     7884      7884          7884    7884        31536
Number of Failures              2518       774           168     231         3691
Percent of Failures           31.94%     9.82%         2.13%   2.93%       11.70%

Almost 10 percent of the students were identified as at risk for nutritional deficiency and nearly 32 percent were identified as having dental problems. Follow-up services were to be provided to identified students by the Project R.E.A.D.I.N.E.S.S. staff. However, the initiative was not continued the following year, although school nurses do rotate through the neediest schools. In addition to Project R.E.A.D.I.N.E.S.S., hearing and vision screening was provided to all schools in the district, including the OSN schools.
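The failure rates in Table 7 follow directly from the counts: each percentage is the number of failures divided by the 7,884 students screened in that category. A quick arithmetic check, using only the numbers reported in the table:

```python
# Check of the screening rates in Table 7: 7,884 students were each screened
# in four categories (31,536 total screenings).
screened = 7884
failures = {"dental": 2518, "heights/weights": 774,
            "pediculosis": 168, "skin": 231}

rates = {k: round(100 * v / screened, 2) for k, v in failures.items()}
total_rate = round(100 * sum(failures.values()) / (screened * 4), 2)
print(rates, total_rate)
```

The computed rates match the table: 31.94% dental, 9.82% heights/weights, 2.13% pediculosis, 2.93% skin, and 11.70% of all screenings overall.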

Shared Accountability

Shared Accountability was meant to be a performance contract between Miami-Dade County Public Schools and each of the companies providing the computerized instructional component. The school district was to ensure that each student was exposed to the computer for at least 50 hours, distributed over the school year. Students were to spend approximately 20 minutes per day at the computer. In turn, the companies guaranteed that each student (who met the usual district standards for inclusion in its Stanford Achievement Test pool) would gain 1.33 grade levels using software provided by Jostens Learning Corporation (JLC) or 1.6 grade levels using software provided by the Computer Curriculum Corporation (CCC). A percentage of the cost of the contract was retained by the school system, to be paid upon demonstration that students met the guaranteed performance levels.

There were several problems with this procedure. The first was that student performance was to be measured by the internal computerized management systems of the software provided by the respective companies. That is, time and student progress were measured by systems built into the instructional programming. There was no practical way to verify these data.

To examine the time variable, a small sample of schools was visited to pilot an observation instrument, but it was decided that a considerable amount of resources would have to be devoted to time verification. Instead, the district chose to rely on the companies’ reports of time on task. Variability in time on task was seen between the two companies and, within each company, across schools. In reality, the issue of time was not as salient as the fact that performance was measured internally through each system's instructional management program.

Student responses to the questions provided by the instructional program were handled differently by the two companies. JLC counted each correct answer as a completion, so that students accumulated correct answers until the goal of 120 correct responses was recorded. Unfortunately, students at some schools learned that there were no penalties for wrong answers, and they found that they could simply try each of the answers in turn until they “hit” the correct one. It was the teacher's responsibility to notice this in the cumulative reports available to them and to modify students' behavior. Due to this problem, schools implementing the JLC software quickly reached the goals and in fact “ran out” of software; that is, students went through the material too quickly.

The CCC software appeared to handle students’ random attempts with a little more sophistication. The programs adjusted to students' repeated wrong answers by returning to simpler questions, and students received more credit for choosing the correct answer on the first attempt than on subsequent attempts. Thus, this system may have been more accurate in gauging student performance.

The companies were required to provide quarterly reports and yearly summaries, which they did in formats that were very difficult to evaluate. On the one hand, it was a simple matter to see that the schools cumulatively met their goals; on the other, because the data were provided in hard copy, they were difficult to aggregate across schools. CCC has since provided electronic files in a cooperative exchange of information, which will allow some further analysis of student progress on its system. However, since the overall goal of the program was to increase reading performance as measured on the Stanford Achievement Test, that analysis was not completed for this report. In addition, electronic data in a usable format were not provided by JLC as required by the contract.

Discussion and Conclusions

OSN was an ambitious project to raise reading performance at 44 Title I schools whose consistently low performance over a two-year period had led to their identification by the state as “Critically Low” schools. These schools, serving approximately 18,000 students, are among the poorest in Miami-Dade County, with an average of 93% of their students eligible for free or reduced-price lunch.

OSN redirected Title I funds into two instructional interventions. The first was a highly structured and scripted reading curriculum developed at Johns Hopkins University by Robert E. Slavin and Nancy A. Madden. The second was the introduction of computer-assisted instruction through contracts with two companies: the Computer Curriculum Corporation (CCC) and the Jostens Learning Corporation (JLC) provided both the hardware and the software to equip each of these schools with five computers per classroom and a computer lab. In addition to these instructional interventions, efforts were made to provide medical screening to the students in these schools through existing community resources.

This evaluation examined the first three years of implementation of this intervention, which was projected to last for five years. Since the three independent entities, Success for All, the Computer Curriculum Corporation, and the Jostens Learning Corporation, were all initializing their efforts in the same time frame with the overall goal of increasing reading performance, it was decided to design this evaluation as an overview of the effort. The evaluation consisted of a longitudinal analysis of results on the Stanford Achievement Test Reading Comprehension subtest and an analysis of several surveys intended to provide information on staff, student, and parent attitudes as they might be influenced by the OSN intervention.

Reading Comprehension Subtest Results

Success for All was implemented in all OSN schools, while the computer-assisted instruction was provided by two companies: one served schools mainly in the northern half of the county (JLC) and the other served schools mainly in the southern half (CCC). All schools meeting the “Critically Low” criteria established by the state were included in OSN, as well as several primary schools whose students would go on to attend the “Critically Low” schools. This meant that any comparison schools would, by definition, be higher performing schools. To overcome this, a Repeated Measures Analysis of Variance was applied to all the achievement data. Using this procedure, we looked for differences in the pattern of performance over time between all OSN participants and all Title I non-OSN participants who had been tested at the same school in each year covered by the analysis.
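The logic of this repeated measures comparison can be illustrated with a simple sketch. This is not the district’s actual statistical procedure, and the student scores below are invented: the same students are tracked across years, group mean trajectories are computed, and the question is whether the gap between OSN and non-OSN groups changes over time (a group-by-time interaction). Parallel growth, a roughly constant gap, is what the evaluation in fact observed.

```python
def mean(xs):
    return sum(xs) / len(xs)

# Each record: (group, [year1, year2, year3] scores) for one student tested
# at the same school in every year of the analysis. Values are hypothetical.
students = [
    ("OSN",     [580, 600, 618]),
    ("OSN",     [575, 596, 615]),
    ("non-OSN", [610, 630, 648]),
    ("non-OSN", [605, 626, 645]),
]

def group_means(records, group):
    """Mean score for the named group in each year."""
    scores = [s for g, s in records if g == group]
    return [mean(year) for year in zip(*scores)]

osn = group_means(students, "OSN")
other = group_means(students, "non-OSN")

# Gap between groups in each year; a constant gap indicates parallel growth.
gaps = [o - s for o, s in zip(other, osn)]
print(gaps)  # → [30.0, 30.0, 30.0] — no narrowing over time
```

A formal repeated measures ANOVA additionally tests whether any change in the gap is larger than chance, but the underlying question is the one sketched here.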

In general, our results show no improvement in the Reading Comprehension scores of OSN participants as compared to all other Title I students over several years. Both groups showed growth as they moved from grade to grade, but the gap between the very low performing OSN school participants and their non-OSN Title I counterparts failed to narrow over time.

There was an initial narrowing, for all grades combined, between 1994-95, the year in which schools were identified as “Critically Low,” and the following year. However, this initial “improvement” represents students who were fifth graders in 1998-99, which means they were in the first grade in 1994-95 and in the second grade when they showed the improvement in 1995-96, before the full implementation of OSN.

This improvement in performance between the first and second grades, for these OSN students, did not continue, and it is not clear why the first grade scores at OSN schools would be that much lower than the scores of non-OSN first graders. The identification of OSN schools followed their classification as “Critically Low” schools, which was based on two successive years of fourth grade test results. First grade performance was not considered in this classification, and although it may be correlated with fourth grade performance, the movement toward higher performance in second grade did not continue.

The same narrowing was seen in students who were third graders in 1998-99; it occurred between their first and second grade scores, after which their performance remained parallel to that of the comparison group. Since OSN began to be implemented in schools during the 1995-96 school year and was in its first year of full implementation in 1996-97, when these third graders were in the first grade, it could be argued that the intervention had some effect on these students. However, it was in that year (1996-97) that they exhibited very low scores, which would mean that they did worse during the first year of full OSN implementation, improved the second year, and then showed no improvement the following year. This does not seem to be a plausible explanation. The pattern of results for these grades indicates a fairly stable expected growth rate that was not altered by the introduction of the intervention.

A similar parallel growth pattern was seen in the fourth grade and sixth grade results. With each school implementing a major research-based reading curriculum and a computer-assisted instructional component, some convergence with the higher performing Title I schools should have been evident.

Another independent evaluation of the Success for All program was conducted by the Office of Evaluation and Research (Urdegar, 1998). That evaluation used a different methodology from the longitudinal approach used here: a small sample of OSN schools implementing Success for All, along with several groups of comparison schools, was observed over a one-year period. In that study, Success for All participants performed as well as, but not better than, students in comparison schools. Taken together with the current evaluation, the improvement in Reading Comprehension scores that planners had hoped for, given the reputation of Success for All, did not occur.

Although we did not find an overall effect, there was still the question of the technology component. To explore any differences between the two computer-assisted instructional curricula, an analysis was performed comparing OSN schools using CCC software to those using JLC software. This analysis indicated a small trend favoring CCC schools: schools using CCC started below schools using JLC and narrowed the difference, or even exceeded them, in subsequent years. This is most obvious among fourth grade students. It is important to note that these are very minor effects; yet the trend, although small, does indicate that the CCC software had a somewhat more positive impact on Reading Comprehension scores than the JLC software, with a 20-minute daily exposure.

Survey Results

Surveys were administered to shed light on the teacher, student, and parent attitudes that may have accompanied OSN. The first survey was administered in the spring of 1997, the first full year of implementation; it focused on teachers’ responses to Success for All but also included some general questions about the other OSN components. In this first survey it was clear that teachers’ acceptance of Success for All diverged. Respondents from schools that had previously used Success for All were most positive about it, except for teachers at one school which later rejected Success for All when given the opportunity. An overall Index of Satisfaction was calculated, yielding an average score of 2.53 on a scale of 1 to 4. This indicates a fairly even split among the schools in their comfort with Success for All; even after the first year, a good number of schools were not happy with this aspect of the program. Other groups of individual items did yield more positive ratings, and there was high positive expectancy for the impact of the technological component and for a positive impact of the medical screening.
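As a purely hypothetical illustration of how such an index might be computed (the report does not specify its exact items or aggregation), an Index of Satisfaction of this kind is typically the mean of each respondent’s mean rating across the satisfaction items:

```python
def index_of_satisfaction(responses):
    """responses: list of per-respondent lists of 1-4 item ratings (illustrative)."""
    per_respondent = [sum(r) / len(r) for r in responses]
    return sum(per_respondent) / len(per_respondent)

# Invented ratings for three teachers on four satisfaction items each.
ratings = [
    [3, 2, 3, 2],
    [2, 2, 3, 3],
    [3, 3, 2, 2],
]
print(round(index_of_satisfaction(ratings), 2))  # → 2.5
```

On a 1-to-4 scale the scale midpoint is 2.5, so an observed index of 2.53 sits almost exactly at the boundary between dissatisfaction and satisfaction, which is why the report reads it as an even split.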

A further examination of staff attitudes was carried out using the School Climate Survey administered in 1997-98. This survey is administered to all schools yearly and consists of positive and negative statements about the school with which respondents agree or disagree. Staff attitudes were compared at a sample of OSN and non-OSN schools at which student attitude data were also collected. Staff at OSN schools, and at non-OSN schools that were using Success for All, were the least positive among the schools; staff at schools using the SRA Direct Instruction curriculum were the most positive about their schools.

Both these surveys indicate less than enthusiastic satisfaction among staff at OSN schools with both the intervention and their overall school climate. The OSN teacher survey results present a mixed picture: although there was some acceptance and positive expectancy at many schools, there was as much rejection and dissatisfaction. The School Climate Survey tells us that staff feel they face greater challenges than do staff at non-OSN schools. Given this state of staff discomfort, it seems it would be difficult to optimize the implementation of OSN without recognizing and compensating for this difficulty.

On the other hand, parents seem generally positive about the schools their children attend. Using the same sample of schools as the staff School Climate Survey, parent results were examined for selected items. Parents showed consistent positive agreement with statements about their children’s schools. Schools using the SRA Direct Instruction curriculum had the highest agreement rate of the sample, with OSN schools next. Two areas that parents at OSN schools recognized and rated more positively were computer training and critical thinking and reasoning; these may well be very important issues for parents of students at very low performing schools.

In addition to staff and parent attitudes, we sampled student attitudes; students are, after all, the recipients of the OSN intervention. Selected students from the same sample of schools used for the School Climate Survey were given a survey consisting of two parts. The first was a general scale examining their “feelings” about school; the other was the Attitude Toward Reading Survey, which gauges how students feel about academic and recreational reading. Students surveyed at OSN schools had the lowest scores of all the sample schools on the feelings-about-school scale and on both parts of the reading survey. Thus, the students and the staff at OSN schools seem to be the least happy about their educational situation. These data do not explain the low performance of these students, but they do present a picture of the main participants in the educational process, teachers and students, as more troubled with that process than those at other schools.

OSN was conceived as a response to a then-current situation, but it was in fact a response to a long-standing and difficult problem in Miami-Dade County. The targeted elementary schools are among the poorest and most needy in the district. The medical screening component of the project, Project R.E.A.D.I.N.E.S.S., found that 10% of the kindergarten and first grade students at these schools were nutritionally at risk based on height/weight proportions, a condition that would make learning difficult. This was a superficial screening, but it raises the question of what other unaddressed medical problems exist for these students.


Summary

In summary, the interventions attempted under Operation Safety Net had never been implemented on such a broad scale. Although Success for All had been implemented in several Miami-Dade County schools prior to the onset of Operation Safety Net, the attempt to bring this curriculum to so many new schools at once was apparently beyond the capabilities of the Success for All Foundation, which monitors and supports implementation of its curriculum.

The computer-assisted instructional component provided by the Jostens Learning Corporation did not meet its shared accountability reporting obligations. The Computer Curriculum Corporation met both its shared accountability reporting requirements and the shared accountability criteria. In addition, there was a significant positive trend for OSN students using the Computer Curriculum Corporation’s software compared to OSN students using the Jostens Learning Corporation’s software.

Overall, this evaluation has found that OSN has not brought these lowest performing schools closer to other Title I schools. In fact, for their 1998-99 performance on the Florida Comprehensive Assessment Test (FCAT) and the Florida Writing Assessment, 13 of the OSN schools received a grade of “F” in the state accountability system, while the rest received a grade of “D.”

The initial conception of OSN involved a unified reading curriculum and school reform paired with a computer-assisted instructional component and some form of medical screening. This was to be maintained for a five-year period to allow training to occur and to provide programmatic stability. Instead, early dissatisfaction with Success for All and complications arising from the involvement of two computer companies led to a lack of clarity for the intervention.

The district has since turned to a more diverse approach, the Comprehensive Reading Plan, which enables greater choice at the school level. Several Operation Safety Net schools have discontinued the Success for All curriculum, and there is no continuing medical screening related to OSN. Operation Safety Net served to focus attention on poorly performing schools and to provide funding and initial direction for reform; however, it no longer seems to exist as originally conceived.

Recommendations

1. It is recommended that, following the 2000-2001 project year, Success for All be discontinued at Operation Safety Net schools unless there is full staff support and the financial and staff resources necessary for the curriculum to be implemented exactly as required by the Success for All Foundation. This would help determine whether the investment in this curriculum would have the dramatic effects predicted by its developers.

2. It is recommended that the contracts with each of the computer-assisted instructional providers be reviewed in light of this evaluation and the difficulty of actualizing the shared accountability provisions.

3. It is also recommended that the role of computer-assisted instruction, in some form, be expanded in Operation Safety Net schools. The results of this evaluation show promise for at least one of the companies, based on Stanford Achievement Test results, even though students were exposed to the computer software for only 20 minutes per day. In addition, parents of students in Operation Safety Net schools recognize the opportunity their children are getting by being exposed to computer skills in whatever form. Most of these parents may not have the resources to purchase computers for their homes, and they rate the schools’ provision of computer education very positively.

4. It is recommended that community involvement in the provision of medical screening and follow-up services be further encouraged for all elementary students at poorly performing schools.

5. Finally, it is recommended that the concept of Operation Safety Net be reviewed in light of the changes in Success for All implementation, the limited medical screening, the variable results of computer-assisted instruction, and the introduction of the Comprehensive Reading Plan.


References

Dade County Department of Health/School Health (1997). Project R.E.A.D.I.N.E.S.S. results, 1996-97 school year. Miami, Florida.

Estrada, P. (1996, April). Teacher support during the transition to middle school and its relation to educational functioning in poor urban youth. Paper presented at the annual meeting of the American Educational Research Association, New York, New York.

Ford, D. Y., & Harris, J. J. (1996). Perception and attitudes of Black students toward school achievement and other educational variables. Child Development, 67, 1141-1152.

McKenna, M. C., & Kear, D. J. (1990). Measuring attitude toward reading: A new tool for teachers. The Reading Teacher, 43, 626-639.

Miami-Dade County Public Schools (1998). Blueprint 2000 Climate Survey report for the 1997-98 administration, parent form. Miami, Florida.

Miami-Dade County Public Schools (1998). Blueprint 2000 Climate Survey report for the 1997-98 administration, staff form. Miami, Florida.

Urdegar, S. (1998). Evaluation of the Success for All program, 1997-98. Miami-Dade County Public Schools, Miami, Florida.


Appendix A


Performance Criteria for School Accountability, 1993-94 and 1994-95

Elementary Schools (Grade 4 only)

• At least 33 percent of students scored above the 50th percentile in Reading Comprehension.

• At least 33 percent of students scored above the 50th percentile in Mathematics Applications.

• At least 33 percent of students scored at or above 3 on the Florida Writes! assessment.

Middle Schools (Grade 8 only)

• At least 40 percent of students scored above the 50th percentile in Reading Comprehension.

• At least 40 percent of students scored above the 50th percentile in Mathematics Applications.

• At least 50 percent of students scored at or above 3 on the Florida Writes!

Senior High Schools (Grades 10 and 11)

• At least 85 percent of students passed the High School Competency Test (HSCT) in Communications (grade 11).

• At least 80 percent of students passed the HSCT in Mathematics (grade 11).

• At least 67 percent of students scored at or above 3 on the Florida Writes! (grade 10).

If students at a given school failed to meet the above criteria for the applicable grade(s) in all three subject areas for two consecutive school years, that school was designated by the Florida Department of Education (DOE) as Critically Low Performing.


Appendix B


Appendix C


Bureau Response/Plans of Action to Address Evaluation Findings*

______________________________________________________________________________
* The following section was developed by program staff. It consists of a bureau/office response and plans of action which are to be (or have already been) initiated by the relevant bureau/office.
