



DOCUMENT RESUME

ED 378 364                                                      CE 067 972

AUTHOR          Stecher, Brian M.; And Others
TITLE           Improving Perkins II Performance Measures and Standards. Lessons Learned from Early Implementers in Four States.
INSTITUTION     National Center for Research in Vocational Education, Berkeley, CA.
SPONS AGENCY    Office of Vocational and Adult Education (ED), Washington, DC.
PUB DATE        Jan 95
CONTRACT        V051A30003-94A; V051A30004-94A
NOTE            88p.
AVAILABLE FROM  NCRVE Materials Distribution Service, Horrabin Hall 46, Western Illinois University, Macomb, IL 61455 (order no. MDS-732: $6).
PUB TYPE        Reports - Research/Technical (143)
EDRS PRICE      MF01/PC04 Plus Postage.
DESCRIPTORS     *Accountability; Educational Legislation; Educational Policy; *Federal Legislation; Federal State Relationship; Policy Formation; Postsecondary Education; *Program Improvement; *Public Policy; Secondary Education; *State Standards; Statewide Planning; *Vocational Education
IDENTIFIERS     *Carl D Perkins Voc and Appl Techn Educ Act 1990

ABSTRACT
A study examined implementation of the accountability and program improvement provisions of the Carl D. Perkins Vocational and Applied Technology Education Act of 1990 (Perkins II) in four unnamed states that were "early adopters" of the Perkins II measures and standards. In each state, staff in the state agencies administering secondary and postsecondary vocational education were interviewed along with administrators and instructors from secondary and postsecondary vocational institutions in two geographically separated regions. The interviews established that, although substantial progress has been made in implementing measures and standards, much work remains to be done to make the systems function as envisioned in Perkins II. Two sets of factors were identified as contributing to variations in states' responses to performance measures and standards. The first set were within the sphere of influence of federal policy, whereas the second set flowed from state/local contexts and are thus less directly subject to federal policy intervention. The following recommendations were directed toward federal policymakers: clarify the interrelationships among Perkins II mandates and coordination of Perkins II initiatives, create models for outcome-based program improvement, provide focused technical assistance regarding choices and resources, and address common measurement problems. (Appended are the state and local interview guides.) (MN)

***********************************************************************
* Reproductions supplied by EDRS are the best that can be made
* from the original document.
***********************************************************************


National Center for Research in Vocational Education
University of California, Berkeley

Improving Perkins II Performance Measures and Standards

Lessons Learned from Early Implementers in Four States

U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)
This document has been reproduced as received from the person or organization originating it.
Minor changes have been made to improve reproduction quality.
Points of view or opinions stated in this document do not necessarily represent official OERI position or policy.

Supported by
The Office of Vocational and Adult Education,
U.S. Department of Education


This publication is available from the:

National Center for Research in Vocational Education
Materials Distribution Service
Western Illinois University
46 Horrabin Hall
Macomb, IL 61455

800-637-7652 (Toll Free)


Improving Perkins II Performance Measures and Standards

Lessons Learned from Early Implementers in Four States

Brian M. Stecher, Lawrence M. Hanser, and Bryan Hallmark, RAND

Mikala L. Rahn, Karen Levesque, E. Gareth Hoachlander, David Emanuel, and Steven G. Klein,
MPR Associates, Inc.

National Center for Research in Vocational Education
University of California, Berkeley
2150 Shattuck Avenue, Suite 1250
Berkeley, CA 94704

Supported by
The Office of Vocational and Adult Education,
U.S. Department of Education

January, 1995

MDS-732


FUNDING INFORMATION

Project Title:          National Center for Research in Vocational Education

Grant Number:           V051A30004-93A/V051A30003-93A

Act Under Which
Funds Administered:     Carl D. Perkins Vocational Education Act, P.L. 98-524

Source of Grant:        Office of Vocational and Adult Education, U.S. Department of Education, Washington, D.C. 20202

Grantee:                The Regents of the University of California, c/o National Center for Research in Vocational Education, 2150 Shattuck Ave., Suite 1250, Berkeley, CA 94704

Director:               Charles S. Benson

Percent of Total Grant
Financed by Federal Money:   100%

Dollar Amount of
Federal Funds for Grant:     $6,000,000

Disclaimer:             This publication was prepared pursuant to a grant with the Office of Vocational and Adult Education, U.S. Department of Education. Grantees undertaking such projects under government sponsorship are encouraged to express freely their judgment in professional and technical matters. Points of view or opinions do not, therefore, necessarily represent official U.S. Department of Education position or policy.

Discrimination:         Title VI of the Civil Rights Act of 1964 states: "No person in the United States shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving federal financial assistance." Title IX of the Education Amendments of 1972 states: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving federal financial assistance." Therefore, the National Center for Research in Vocational Education project, like every program or activity receiving financial assistance from the U.S. Department of Education, must be operated in compliance with these laws.


PREFACE

The Carl D. Perkins Vocational and Applied Technology Education Act of 1990 (Perkins II) contained a number of significant changes from previous federal vocational education legislation. The act required states to develop outcome-based accountability systems built around statewide performance measures and standards. States were given considerable flexibility in identifying outcomes to be measured, selecting measurement tools, and setting standards. Local programs were given the principal responsibility for program improvement, with states intervening only when local programs were unable to demonstrate significant progress. Measures and standards were supposed to be adopted by the fall of 1992 and used thereafter as accountability tools and guides to program improvement.

This study is one of a series of investigations conducted by the National Center for Research in Vocational Education (NCRVE) relating to vocational education accountability and the implementation of these measures and standards. The present research focuses on the effects of the performance measures and standards on state vocational education agencies, local programs, and the relationships between them. The study examines four states that were among the first to adopt measures and standards and implement data collection and reporting systems. Information gathered from these states gives some indication of the impact of the legislation among "early adopters" and the factors that affect implementation. The findings should be of interest to federal policymakers considering the reauthorization of the legislation, as well as to state and local vocational educators looking for ways to improve the utility of their accountability systems.


CONTENTS

Preface iii

Tables vii

Executive Summary ix

Acknowledgments xv

Chapter One
INTRODUCTION 1
  Legislative Requirements 2
  Prior NCRVE Research 3
  Overview 4

Chapter Two
PROCEDURES 5
  Sampling 5
  Data Collection 6
  Data Analysis 7

Chapter Three
WHAT WE FOUND 9
  State Summaries 9
    Alcorn 10
    Columbia 11
    Erie 12
    Piedmont 12
  Implementation Progress to Date 14
  Variations in Implementation 17
  Room for Improvement 18

Chapter Four
WHAT EXPLAINS PROGRESS TO DATE? 21
  Factors Subject to Federal Influence 21
    Flexibility Provided to the States in the Law 22
    Unclear Integration and Coordination of Perkins II Provisions 23
    Lack of Models or Incentives for State and Local Implementation 24
    Limited Expertise and Resources at the State Level 26
    Mandated Measurement of Learning Outcomes 27
  Factors Subject to State and Local Influence 29
    Choice Between Central and Local Implementation Strategy 29
    Existence of Centralized Data Systems 32
    Availability of State-Level Assessment Tools 33
    Ongoing Educational Accountability and Program Review Mechanisms 35
    Historical Relationships Among Agencies 38
    Efforts to Reduce Burden on Local Administrators 39
    Influence of Key Administrators 40
  Future Concerns 42
    Skill Standards and Curriculum Standards 42
    Data Quality 43
    Integration of Academic and Vocational Education 43
    Consistency in Federal Vocational Education Policy 44

Chapter Five
RECOMMENDATIONS 45
  Clarify the Interrelationships Among Perkins II Mandates and the Coordination of Perkins II Initiatives 45
  Create Models for Outcome-Based Program Improvement 46
  Provide Focused Technical Assistance Regarding Choices and Resources 47
  Address Common Measurement Problems 49
  Conclusions 50

Appendix
A. STATE INTERVIEW GUIDE 53
B. LOCAL INTERVIEW GUIDE 61

References 75


TABLES

3.1. Secondary Measures in Four States 15
3.2. Postsecondary Measures in Four States 16


EXECUTIVE SUMMARY

States have had three years to implement the accountability and program improvement provisions of the Carl D. Perkins Vocational and Applied Technology Education Act of 1990 (Perkins II). This study examined progress in implementing statewide systems of performance measures and standards, the effects of these systems on state agencies and local vocational programs, and the factors that influenced state and local actions.1 We offer recommendations for changes in federal policy to promote the goals of local accountability and program improvement embodied in the original legislation.

PROCEDURES

We selected four states that were "early adopters" of measures and standards for study. In each state, we interviewed staff in the state agency (or agencies) that administered secondary and postsecondary vocational education and visited secondary and postsecondary vocational institutions in two geographically separated regions. Each institution was visited twice, once in the fall of 1993 and again in the spring of 1994. During the site visits, we interviewed administrators and instructors in each institution.

1 No formal attempt was made to determine whether states were in compliance with Perkins II or to judge the quality of the measures and standards states had chosen to implement.


WHAT WE FOUND

Substantial progress has been made in implementing measures and standards in the states that were visited, although much work remains to be done to make the systems function as envisioned in the law. By 1994, little attention had been paid to building state- or local-level capacity for translating the measures and standards data into actions at the local level for program improvement. These "leading edge" states were still largely engaged in developing and implementing their systems.

Furthermore, wide variation was found in the states' approaches to the development and implementation of measures and standards. This variation was evident in almost every aspect of program implementation, including how the process was managed, who participated, and the level of resources devoted to it. These differences appeared to be jointly a function of the states' individuality and the flexibility inherent in Perkins II.

WHAT EXPLAINS PROGRESS TO DATE?

We identified several factors that contributed to the variation in state responses to performance measures and standards. Some of these explanatory factors are within the sphere of influence of federal policy, including:

Flexibility provided to the states in the law itself. The flexibility of the Perkins II mandate for measures and standards had mixed effects. It permitted states to create systems that were responsive to local conditions, but it also increased the influence of state and local contextual factors on implementation, which slowed and limited the process in some cases.

Unclear integration and coordination of Perkins II provisions. In the states we visited, the Perkins II priorities (measures and standards, integration, tech-prep education,2 and service to special populations) were treated as separate activities. They were not seen as a coordinated system at either the state or local level, and performance measures and standards were not being used comprehensively to evaluate the other Perkins initiatives.

Lack of models or incentives for state and local implementation. Perkins II contains an explicit framework for structuring systems and federal agency checks for compliance at the adoption stage, but there are neither models nor incentives for ensuring that performance measures and standards are used to improve programs.

Limited expertise and resources at the state level. Perkins II created new responsibilities for state staff while reducing the set-aside for state administration. This presented a dilemma for some states that lacked both the expertise and the resources to address these new demands.

Mandated measurement of learning outcomes. The scarcity of valid assessment tools for measuring selected learning outcomes led states to adopt alternatives that were less than optimal. States are still struggling with how to measure important student outcomes, such as academic skill gains at the postsecondary level.

The second set of factors flows from state and local context, and hence, these factors are less directly subject to federal policy intervention. These included the choice between a centralized and a local implementation strategy, the existence of statewide data systems, the availability of state-level assessment tools, the nature of ongoing educational accountability and program review mechanisms, historical relationships among state and local educational and employment agencies, attempts to reduce the burden on local administrators, and the influence of key administrators.

2 "The term 'tech-prep education program' means a combined secondary and postsecondary program which leads to an associate degree or 2-year certificate; provides technical preparation in at least 1 field of engineering technology, applied science, mechanical, industrial, or practical art or trade, or agriculture, health, or business; builds student competence in mathematics, science, and communications (including through applied academics) through a sequential course of study; and leads to placement in employment." (The Carl D. Perkins Vocational and Applied Technology Education Act Amendments of 1990, P.L. 101-392; Part E, Sec. 347.)


Finally, there are issues on the horizon that will affect the implementation of measures and standards and should be considered in planning the reauthorization of Perkins:

Skill standards and curriculum standards. The Departments of Education and Labor are funding the development of industry skill standards in more than 20 occupational areas and curriculum standards in six educational fields. There should be some coordination between these efforts and the systems of measures and standards developed under Perkins II.

Data quality. States have not yet addressed the questions of reliability and validity of the measurement tools they have selected. Data quality questions will become more important as states begin the accountability and program improvement cycle, which will add high stakes to the performance measures and standards.

Integration of academic and vocational education. The integration of academic and vocational education raises a host of problems for defining, measuring, and attributing outcomes and therefore threatens the validity of existing systems of measures and standards.

Consistency in federal vocational education policy. Some state vocational educators believe new laws will supplant many of the initiatives contained in Perkins II; as a result, they make only halfhearted efforts at implementation. In this way, the volatility of federal vocational education policy discourages rapid and effective response to federal initiatives.

RECOMMENDATIONS

Federal policymakers may take several actions to enhance the future success of performance-based accountability in vocational education and promote the goals of Perkins II:

Clarify the interrelationships among Perkins II mandates and the coordination of Perkins II initiatives. Perkins II contains four major priorities: integrating vocational and academic education, providing service to special populations, creating tech-prep education programs, and establishing systems of measures and standards. Two of these are primarily curricular changes; one relates to the selection of students; and one relates to accountability and program improvement. In theory, these efforts should complement one another, and states' efforts to address the priorities should be coordinated. In fact, none of the four states coordinated their efforts to address these mandates, and there was wide variation in the relative priority assigned to these four critical initiatives. Policymakers should clarify the interrelationships among systems of measures and standards, the integration of academic and vocational education, tech-prep education programs, and service to special populations and offer additional guidance about coordinating states' efforts in these areas.

Create models for outcome-based program improvement. In 1994, most state action was still driven by the mandate to develop a structure for accountability, i.e., the system of measures and standards. Little had been done to use that structure to make programs better. State and local agencies need assistance in translating outcome deficiencies into action plans. One approach might be to require states to develop program improvement models that illustrate how outcome-based information can be used for local program reform. The alternative we suggest would be to commission an agency other than the states themselves to collect examples of effective outcome-based local improvement practices and disseminate them widely for states and local programs to use.

Provide focused technical assistance regarding choices and resources. The "flexible mandates" of Perkins II place greater demands on state agencies yet restrict the use of funds for state-level services. These fiscal restrictions come at a time when many state administrative budgets are also being reduced. Under these circumstances, federal actions that help states respond to their choices and make better use of resources might significantly improve implementation of the act. We also suggest increasing the funds available for state administration during the start-up phase, so states can meet initial demands and develop some of the expertise they will need to operate a reformed vocational education system.

Address common measurement problems. The technology to measure learning and occupational performance gains in reliable, valid, and efficient ways is not widely available. Most states are not equipped with either the resources or expertise to develop tools for measuring learning and occupational outcomes, and it is unfair to require them to accomplish this difficult task. The federal government needs to assume leadership in addressing these problems, since they are best solved nationally and are largely the result of the provisions of Perkins II.

Incorporating these changes into the reauthorization of federal vocational education legislation will increase the efficacy of statewide systems of measures and standards. Legislators also should anticipate difficulties and conflicts that may be created by pending reforms of education, training, and workforce preparedness programs and should work to coordinate the accountability requirements of all these initiatives.


ACKNOWLEDGMENTS

This study is based on interviews with state and local vocational educators in four states. Our promise of anonymity prevents us from naming the states and the individuals but not from recognizing the central role they played in this research. We want to thank the four state directors of vocational education and the staff in the four state vocational education agencies. We also want to acknowledge the assistance of the administrators, instructors, and other staff in the secondary schools and community colleges we visited.


Chapter One

INTRODUCTION

It has been three years since states began to implement the accountability and program improvement provisions of the Carl D. Perkins Vocational and Applied Technology Education Act of 1990 (Perkins II), and policymakers are asking whether these provisions are working. We know, from previous National Center for Research in Vocational Education (NCRVE) research, that states adopted systems of measures and standards as prescribed by federal law (Rahn et al., 1992), but we do not know whether, or how well, these systems were implemented, or whether they are leading to program improvement. These two questions are particularly relevant at present, because it is time for the reauthorization of federal vocational education legislation. Policymakers are looking for information about the success of these locally focused, outcome-based systems of program improvement and about ways to improve the federal rules that govern them.

This study examines the implementation of Perkins II performance measures and standards to date and, based on this interim review, recommends actions the federal government can take to improve these systems. Ultimately, Perkins II should be judged in terms of outcomes: Has the legislation established and/or strengthened local systems of outcome-based program review and improvement, and have these systems led to improved workforce outcomes at the local level?

However, it may be too soon to make these judgments. States are only now entering the first program improvement cycle, and widespread data on outcomes are at least one to two years away.


Consequently, we framed our research in terms of two more immediate questions:

Are states implementing their systems of measures and standards as envisioned in the federal legislation?

What factors have affected their progress, and can progress be enhanced through federal action?

Before presenting our results, we provide some background on the requirements of Perkins II and on relevant NCRVE research.

LEGISLATIVE REQUIREMENTS

Perkins II called for the creation of performance-based state accountability systems for improving vocational programs. One of the most dramatic changes in Perkins II was the addition of outcome-based performance measures and standards in these accountability systems, particularly the inclusion of learning outcomes that were seldom specified in previous evaluations of vocational programs.

Perkins II required each state to "develop and implement a statewide system of core standards and measures of performance" by September 1992 (Public Law 101-392, Section 115(a)). The mandate has three key distinguishing features: (1) it emphasizes the use of outcomes to monitor program success; (2) it gives states considerable flexibility in creating systems; and (3) it assigns primary responsibility for program improvement to local programs.

At minimum, each state was required to include in its system at least two sets of measures. The first must be a measure of "learning and competency gains, including student progress in achieving basic and more advanced academic skills." The other set must include any one of the following four measures: (1) competency attainment; (2) job or work skill attainment; (3) retention or completion in school; or (4) placement in further education, the military, or employment.

Although Perkins II prescribes the basic guidelines that states must follow in designing accountability systems, it also provides each state considerable discretion in developing and implementing a system that fits its individual situation and needs. For example, some states have taken a centralized approach to designing their measures, prescribing specific assessment instruments or data collection procedures to be used by local recipients, while others have taken a decentralized approach, charging local recipients to develop their own instruments or procedures. At the time of our research, states had not all reached the same stage of implementation of their selected systems of measures and standards.

PRIOR NCRVE RESEARCH

Since Perkins II was enacted, NCRVE has been actively involved in providing technical assistance to state-level administrators to assist them in developing and implementing systems of performance measures and standards. NCRVE researchers wrote a guidebook, Accountability for Vocational Education: A Practitioner's Guide (Hoachlander, Levesque, and Rahn, 1992), which was distributed at regional workshops and has been widely used by state-level administrators to explore key issues in designing an accountability system.

NCRVE researchers also examined several local vocational education accountability systems to gather information that might help improve the implementation of Perkins II. Beyond Vocational Education Standards and Measures: Strengthening Local Accountability Systems for Program Improvement (Stecher and Hanser, 1993) described local accountability systems that already existed for many vocational programs and showed how those systems could be used for program improvement. The authors suggested ways that states and local programs might improve the functioning of local accountability systems.

Continuing interest in obtaining information about state responses to the performance measures and standards requirements led NCRVE to compile State Systems for Accountability in Vocational Education (Rahn, Hoachlander, and Levesque, 1992). This report contains a summary of the performance measures and standards implemented by states in the fall of 1992 and provides examples of clearly defined measures and standards. The appendix briefly describes each state's system of performance measures and standards at both the secondary and postsecondary levels. The report is based not only on information gathered through workshops and telephone interviews but also on analysis of documentation submitted to NCRVE.


OVERVIEW

This study examines how some states are progressing in achieving the accountability goals outlined in Perkins II. It focuses on the process of implementing performance measures and standards for vocational education at the state and local levels. It is too soon to tell how the implementation will work out in these states, let alone to know whether the systems will have the desired effects of institutionalizing local outcome-based program improvement systems. Nevertheless, results to date can be used to judge the degree to which states are succeeding in achieving the vision embodied in Perkins II, what factors affected their progress, and how federal legislation might promote greater success in the future.

In this study, no attempt was made to judge whether the states were in compliance with the legislation. The Office of Vocational and Adult Education (OVAE) is the agency responsible for determining compliance, not NCRVE. Moreover, no formal assessment was made of the quality of the measures and standards themselves in these states. Perkins II requires the Secretary of Education to submit a report to Congress that assesses "the validity, predictiveness, and reliability" of standards and measures. The secretary's report may include some measures of quality.

Our report is organized as follows. Chapter Two briefly describes the research procedures. Chapter Three presents findings regarding state implementation and includes brief summaries of the evolution of performance measures and standards at the secondary and postsecondary levels in each state. Chapter Four discusses the factors that affected states' actions, including both conditions that may be subject to federal actions and local and state contextual factors that are within the domain of state policy. The discussion also identifies emerging conditions that are likely to affect the implementation of Perkins II in the near future. The final chapter recommends actions to enhance the implementation of these systems.


Chapter Two

PROCEDURES

SAMPLING

A small purposive sample of four states was chosen to develop a deeper understanding of the implementation of Perkins II measures and standards from states that were farthest along in the process as of January 1993 (Rahn, Hoachlander, and Levesque, 1992). This sampling strategy has two advantages. First, it permits us to extrapolate and draw inferences from states that have completed more of the required steps in establishing outcome-based accountability systems. That is, because these states are further along, they have greater experience with more aspects of the system and its implementation. Second, and as a direct result of their experiences, our strategy enables us to identify models that might inform other states and improve their systems. For example, state and local strategies that worked in our sample of states might be adopted by states struggling with implementation.

The major disadvantage of this strategy is that the four states are not necessarily representative of the country as a whole. Conditions in these four states might not be duplicated in others, and actions that worked here might be less successful elsewhere. As a result, the reader should be cautious about extrapolating these findings or interpreting them as representative of the country as a whole. While the findings from the four states in our sample may not be generalizable, we believe our recommendations, if implemented, would benefit many states in their implementations of Perkins II.


DATA COLLECTION

In each state, we visited the state department of education and selected secondary and postsecondary institutions in two different regions. The secondary and postsecondary institutions we visited in these regions were chosen by the state, in consultation with us. We asked the state departments of education to choose institutions typical for their state. We also asked for variety in terms of economic conditions and encouraged the states to choose one urban and one rural region.

A team of two researchers spent a total of approximately one and one-half weeks in each state over the course of two visits (Fall 1993, Spring 1994). At the school districts, we spoke with administrators and instructors and asked about the implementation of Perkins II and its effects. We developed a structured interview guide with tailored questions for chief administrators (including district superintendents, college chancellors, and college presidents), principals, directors of vocational education (including college vocational administrators), and instructors. After determining his or her familiarity with the state-adopted system of measures and standards, we questioned each interviewee on the following topics:

The importance of measures and standards vis-à-vis other vocational education initiatives (For example, which vocational education reforms or initiatives received the most emphasis in your institution in the past two years?)

The integration of vocational education accountability with other educational reforms (For example, is accountability an important aspect of general education reform? If so, how have vocational education measures and standards affected this process or been affected by it?)

Implementation of measures and standards (For example, what steps have been taken to implement performance measures and standards from the time they were adopted until the present?)

Effects of measures and standards (For example, what changes have occurred in program evaluation procedures as a result of measures and standards?)


Technical assistance and support for measures and standards (For example, what technical assistance has been provided by the state? By the district?).

Copies of our interview protocols can be found in the appendixes.

DATA ANALYSIS

We compiled extensive notes in the course of our interviews, which we then edited, summarized, and distributed to all members of the research team. Impressions gained in initial visits to the states were tested during subsequent visits, and respondent comments were compared across institutions. The findings documented in this report represent the results of many discussions and formal meetings during which we attempted to synthesize what we had collectively observed over the span of several weeks in the field.


Chapter Three

WHAT WE FOUND

We begin this chapter with brief summaries of the status and most salient elements of each state's implementation of Perkins II. It is impossible to convey the full complexity of each state's system in such short summaries; however, we have tried to provide enough information to help the reader distinguish among the states and to introduce distinct characteristics of each state's approach.

Following that, we summarize the overall progress these states made in implementing statewide systems of performance measures and standards and compare some of their choices. We also found considerable variation in the way the four states responded to the Perkins II mandates: they followed different procedures, selected different measures and standards, and implemented their systems in different ways. This variation is discussed next. The section concludes with observations about how much remains to be done, even in these four selected states.

STATE SUMMARIES

We refer to the four states by the pseudonyms Alcorn, Columbia, Erie, and Piedmont. This convention serves both to protect the anonymity of our sources and to preclude interpretations based on existing attitudes toward particular jurisdictions. Most states created two distinct systems, one at the secondary level and another at the postsecondary level, so summaries differentiate between secondary and postsecondary vocational institutions.


Alcorn

Alcorn is a predominantly urban state with a large agricultural sector. It has a major metropolitan area that dominates the economy of the state. At the secondary level, there is an extensive accountability system that includes academic skill exams, a performance-based quality review process on the academic side, and schoolwide report cards. The state hopes to incorporate performance measures and standards into this "academic" accountability system.

Alcorn is in the process of creating a workforce preparation system that will include ten measures and standards at both the secondary and postsecondary levels. This system includes piloting of workplace readiness instruments and the development of industry skill standards and certification. The primary goal of performance measures and standards in Alcorn is to produce high-quality data that can be used to compare the progress of programs across the state. The state made early strides in meeting this goal because of its existing computing capacity and centralized approach. There was evidence of interagency coordination to obtain access to new outcomes data and devotion of staff time to develop a strong data system. However, the complexity and centralized nature of the state's data system have resulted in the delayed dissemination of data to districts. Little attention has been paid to local implementation, particularly how instructors might use the data to improve programs.

At the postsecondary level, the regional accreditation body leads the main accountability initiative. This accreditation effort is moving toward requiring postsecondary institutions to include outcome-based measures. Several other initiatives are pushing community colleges in this direction. Alcorn proposes to establish an "institutional guarantee" in which postsecondary institutions guarantee employers that students have acquired certain occupational skills. The state would like to develop a system that will link accreditation, the state-led program review, and performance measures and standards while also using work-readiness assessment instruments and industry skill standards.

As with the secondary level, however, there has been little evidence that Perkins II performance data have been used by deans and instructors for program improvement purposes. Both deans and regional directors reported that the scope of the data they received was overwhelming and that a strategic targeting of the most relevant data would be necessary to make it useful for improving programs.

Columbia

Columbia is a historically agricultural state with a growing industrial base and expanding school system. Educational accountability has been a major concern of the state legislature for the past decade. Annual school report cards that focus on outcomes are issued for each district and postsecondary institution. Poor performance affects a small part of the pool of funds available for teacher salaries. This accountability system put the state a step ahead in terms of implementing Perkins II measures and standards.

At the secondary level, Columbia has a strong central educational bureaucracy. Schools report data to the state at the student level, which the state analyzes and uses to produce required reports. After reviewing existing data-collection efforts, Columbia adopted eight centralized measures; data for three of the measures were already being collected. The vocational education coordinator is an effective leader who involved staff across the state and created a vision for program improvement. Under her direction, the state moved aggressively to expand a computerized test bank and instructional management system to cover all its vocational courses. This system was developed in anticipation of changes in federal vocational education, and funds from Perkins II were used to transform it into a testing system to provide required occupational measures.

Postsecondary education in Columbia is more decentralized, with institutions retaining considerable autonomy. For example, although they have to produce report cards, each institution collects and analyzes its own data. The vocational education coordinator respects this autonomy and tries to shield the institutions by reducing the "burden" of federal requirements. The state relied on existing measures to form the backbone of its postsecondary Perkins II statewide system.


Erie

Erie is primarily a rural state with a growing student and minority population. Elementary and secondary enrollments increased over 10 percent from 1980 to 1992, and postsecondary enrollments grew over 25 percent from 1987 to 1991. Accountability is important in Erie; recent state legislation requires standardized assessment in several grades. At the same time, school districts have a long history of autonomy and retain a great deal of independence. Coordinating state and federal legislation, in conjunction with wide local discretion, has been difficult for the state in its efforts to implement the Perkins II measures and standards. Additionally, unclear communication at the local level has confused local agencies about performance measures and standards.

Having a single state department that is responsible for both secondary and postsecondary occupational education eased the process of selecting measures and standards for Erie. The state adopted a total of nine measures, six based on data that the state was already collecting. The other three were defined conceptually by the state but were passed to the local agencies to operationalize and implement. These three are to be measures of academic and occupational achievement, perhaps the most difficult to operationalize and implement with local resources.

Staffing changes at the state level also resulted in a short hiatus in contact between the state and local agencies. As a result, the local secondary and postsecondary agencies we interviewed have largely ignored the six measures and standards the state was already collecting and have been perplexed with the three that they were given to operationalize. The state and local agencies could be best described as "regrouping" in their efforts to implement the measures and standards. The state is reconsidering the set of measures and standards it selected, and local secondary and postsecondary agencies are beginning to struggle with operationalizing and implementing the three measures and standards that are their responsibility.

Piedmont

Piedmont is a predominantly urban state with a growing and increasingly diverse population. Its major metropolitan area has grown dramatically in the last decade and has become the regional hub for business and financial services. Perkins performance measures and standards represent the main accountability initiative for vocational education in Piedmont at the secondary level. Traditionally, local agencies have enjoyed autonomy, with the state's role primarily limited to compliance monitoring. However, with the implementation of performance measures and standards, the state's role has dramatically shifted from monitoring compliance to providing technical assistance to foster program improvement. The state adopted four outcome areas and standards; each district chose or developed its own assessment instruments to evaluate students in these areas. Vocational teachers have played an active role in developing pre- and post-tests to measure related academic gain and competency attainment. This has encouraged vocational teachers to think about the academic skills related to the occupational competencies they teach. The state developed an extensive technical assistance program for local districts, including a detailed guide to collecting Perkins-related data and a series of regional workshops for vocational education directors.

At the postsecondary level, Piedmont is even more decentralized, with each institution having considerable autonomy. The vocational education coordinator respects the autonomy of each institution and in general has tried to minimize the data burden on vocational deans and instructors. At the postsecondary level Piedmont adopted a scaled-down version of the secondary performance measures and standards, with most of the system to be phased in slowly. The main accountability initiative at this level has not come from the state but from the regional accreditation body. To minimize the data-gathering burden, the state originally decided to use the same cohort of students that the federal student right-to-know legislation requires.1 However, this cohort proved to be too limited and not useful for the purpose of improving programs. The vocational deans plan to expand the cohort this year. Because of the limited cohort and the importance placed on the accreditation system, there has been little evidence that performance measures and standards have been widely adopted and used for program improvement purposes. There is a perception among some faculty members that performance measures and standards represent another superfluous federal "add-on" program.

1 The Student Right-to-Know and Campus Security Act, P.L. 101-542, November 8, 1990, requires postsecondary institutions to collect and publish data on the completion or graduation rate of certificate or degree-seeking full-time students. Regulations implementing this act were still pending in 1994.

IMPLEMENTATION PROGRESS TO DATE

We selected these four states for this study because they were among the "early adopters" of measures and standards and had made considerable progress in implementing their statewide systems at the time of the study. The committees of practitioners in all four states had formally adopted the measures and standards; to the best of our knowledge, all four states were in compliance with the Perkins II regulations. On average, these states had adopted seven outcomes at the secondary level and nine outcomes at the postsecondary level. Table 3.1 shows the breadth of the measures adopted at the secondary level. All four states adopted measures of academic skills, specific occupational competencies, and labor-market outcomes. Most also had measures of general job or work skills and of program retention and/or completion. The pattern was similar at the postsecondary level. Table 3.2 shows that three of the four states adopted postsecondary measures in the majority of these major outcome areas.

Some of the states were phasing in one or more elements of their systems over a one- or two-year period, but the vast majority of the components were fully operational in the 1993-94 school year. Data were collected during that year for most measures, although analysis and reporting of these data will not occur until the 1994-95 school year in many cases.

In addition to the measures described in Tables 3.1 and 3.2, all four states either had or were actively developing the necessary information infrastructure to support standards-based accountability. Columbia and Erie have computerized student recordkeeping systems in place that could be used for analyzing and reporting progress toward state standards and functioning paper-based data systems for collecting most of the remainder of the statewide measures. Alcorn was expanding its existing centralized data-collection and analysis system to accommodate newly adopted measures, while Piedmont was devoting considerable resources to developing local capacity for data collection and analysis consistent with its decentralized approach to measures and standards.

29

Page 30: Four States. Berkeley, CA. Jan 95 88p.Mika la L. Rahn, Karen Levesque, E. Gareth Hoachiander, David Emanuel, and Steven G. Klein, MPR Associates, Inc. National Center for Research

Table 3.1
Secondary Measures in Four States

Alcorn
  Academic skills: Attainment on state test of academic skills
  Specific occupational competencies: Attainment on state test of technical skills and applied academics (b)
  General job or work skills: Attainment on state test of workplace skills (b)
  Retention or completion: Percentage of students completing program and graduating
  Labor market: Percentage of completers employed in related field or in further education or training; employment retention

Columbia
  Academic skills: Gains (a) on state test of literacy (b)
  Specific occupational competencies: Attainment and gains (a) on test of core occupational competencies
  General job or work skills: Attainment and gains (a) on test of work skills
  Retention or completion: N/A (c)
  Labor market: Percentage of completers employed in related fields or in further education or training
  Other: Comparison to county unemployment rate; career development planning

Erie
  Academic skills: Attainment on locally selected tests of foundation skills (b)
  Specific occupational competencies: Attainment on locally selected measures of occupational competencies (b)
  General job or work skills: Attainment of workplace competencies (b)
  Retention or completion: Increase in program completion and graduation
  Labor market: Increase in percentage of completers employed in related fields or in further education or training

Piedmont
  Academic skills: Gains (a) on locally selected tests of academic skills
  Specific occupational competencies: Attainment of locally selected course competencies
  General job or work skills: N/A (c)
  Retention or completion: School completion rate of enrolled students
  Labor market: Percentage of completers employed in related field or in further education or training

(a) Measured as the difference between posttest and pretest scores.
(b) Under development.
(c) N/A reflects an outcome not chosen by the state.


Table 3.2
Postsecondary Measures in Four States

Alcorn
  Academic skills: Completion of remedial courses
  Specific occupational competencies: Attainment on state test of technical skills and applied academics (a)
  General job or work skills: Attainment on state test of workplace skills (a)
  Retention or completion: Percentage of students completing program
  Labor market: Percentage of completers employed in related field or in further education or training; employment retention

Columbia
  Academic skills: Completion of general education, related, and remedial courses
  Specific occupational competencies: N/A (b)
  General job or work skills: N/A (b)
  Retention or completion: Percentage of required credit hours completed, course sequence completed, reenrollment rate
  Labor market: N/A (b)

Erie
  Academic skills: Attainment on locally selected tests of foundation skills (a)
  Specific occupational competencies: Attainment on locally selected measures of occupational competencies (a)
  General job or work skills: Attainment of workplace competencies (a)
  Retention or completion: Increase in program completion and persistence
  Labor market: Increase in percentage of completers employed in related field or in further education or training

Piedmont
  Academic skills: Completion of academic courses
  Specific occupational competencies: Completion of occupational programs
  General job or work skills: N/A (b)
  Retention or completion: N/A (b)
  Labor market: Percentage of completers employed in related field or in further education or training

(a) Under development.
(b) N/A reflects an outcome not chosen by the state.



All four states made efforts to coordinate the implementation of their systems of measures and standards with other state and federal occupational training and economic development programs, as required by the Perkins II guidelines. Unfortunately, the states found very little common ground between the measures and standards developed by vocational educators in response to Perkins II and the data systems in use by other agencies. Although all four states attempted to coordinate data elements among their diverse economic development efforts, none had great success in this effort. Many states were working toward using unemployment insurance wage records as placement data across agencies.

Finally, all four states had begun to prepare local educational agencies for their roles in the standards-based accountability system. A few of the local school or college staff people we interviewed were involved directly in the selection or adoption of measures or standards; they were familiar with the requirements from the outset. The remainder of the local staff had learned about the new requirements subsequently. Almost everyone we spoke to was aware of the measures and standards in general terms if not in specific detail. Three of the four states informed local programs about the data demands and the accountability standards in fairly traditional ways, assigning dissemination and training responsibility to existing state program or area staff, distributing written information, presenting regional workshops, etc. Efforts in the fourth state went well beyond those in the others, perhaps because local programs were responsible for selecting the measurement tools to be used to assess outcomes. In this state, the number of workshops and the amount of contact with local program staff were considerably greater than in the others.

VARIATIONS IN IMPLEMENTATION

One of the most striking findings from our interviews and observations was the large degree of variation in the way each of the four states implemented Perkins II requirements. States' approaches to Perkins II measures and standards differed in almost every respect, including who participated in the process, how the process was managed, what resources were devoted to it, how the regulations were interpreted, and how the system was integrated with existing state and local initiatives.

The inherent "flexible" nature of the directive in Perkins II to develop and implement a system of performance measures and standards encouraged states to customize their systems to meet local needs. Moreover, states interpreted the law in different ways, which led to further variation in implementation. For example, states had different interpretations of the requirement to measure learning and competency gains. Some felt that the measurement of "gain" requires pre- and posttest scores; others chose to use course completion as a proxy for improvement. In the absence of strong federal control over interpretation, states acted in the spirit of "flexibility," creating their own interpretations. In the next chapter, we attempt to describe some of the factors that affected states' interpretations and responses to Perkins II.
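The contrast between these two interpretations can be made concrete with a small illustrative sketch; the field names, grading scale, and passing-grade rule below are hypothetical examples rather than any state's actual definitions.

    # Illustrative sketch only: two readings of "learning gain" under Perkins II.
    # Field names and the passing-grade rule are hypothetical, not drawn from any state.

    def gain_from_scores(pretest, posttest):
        """Interpretation 1: gain is the posttest score minus the pretest score."""
        return posttest - pretest

    def gain_from_completion(final_grade, passing_grade="D"):
        """Interpretation 2: completing the course (a passing grade) is a proxy for gain."""
        grade_order = ["F", "D", "C", "B", "A"]
        return grade_order.index(final_grade) >= grade_order.index(passing_grade)

    # The same student can look very different under the two interpretations.
    print(gain_from_scores(pretest=42.0, posttest=55.0))   # 13.0 points gained
    print(gain_from_completion(final_grade="D"))           # True: counted as "gain"

Under the second interpretation, a barely passing student counts the same as a high achiever, which is one reason course completion drew the criticisms discussed in the next chapter.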

Implementation at the postsecondary level is moving more slowly and is more varied than at the secondary level. This difference in implementation is due largely to the historical autonomy granted to postsecondary institutions. The movement toward educational accountability has only recently come to the postsecondary level, and there it faces unique challenges. For example, except for the purposes of initial placement, postsecondary institutions do not use assessment systems to test the achievement of students. Lacking basic achievement measures, it is almost impossible to measure learning "gain." A second factor slowing the implementation of Perkins II at the postsecondary level is the diversity of student purposes that postsecondary institutions must serve. It is difficult to establish a system to track the progress of community college students with different purposes for attending school. An "entering" class will include eighteen-year-olds just out of high school, adult workers who need one course to upgrade their skills, and unemployed people who need retraining.

ROOM FOR IMPROVEMENT

Although these four states were among the leaders in implementing Perkins II measures and standards, their systems were only partially complete at the time of this study. No state had yet completed a full cycle of data collection, analysis, and reporting on all adopted measures, and no local program had yet examined its own performance in light of statewide standards as a guide to program improvement.

Additionally, many states still had proposed measurement tools under development. Columbia proposed to measure literacy in terms of pretest-to-posttest gains, but had not yet taken any steps to develop a literacy assessment instrument. Common unresolved testing or assessment problems included measuring academic and occupational skills and setting up systems to monitor the performance of special populations.

Most states' efforts were focused on creating statewide systems that would be in compliance with the federal regulations; much less thought had been given to the use of the data the system produced for program improvement. Moreover, both state and local agencies had done little to translate information on program status into actions for program improvement. We saw only one or two instances in which thought had been given to converting standards-based reports into action plans.

Finally, with the exception of one highly decentralized state, most state departments of education were only beginning to provide extensive information and training to local program staff. By the spring of 1994, states had disseminated to local districts the new rules and data requirements introduced by the statewide system. States were just beginning to confront the need for more local staff development, for greater contact with programs as they prepared new annual applications, and for linking accountability with program improvement.

The states in our study have made a good start in implementing their systems of measures and standards, as envisioned in Perkins II. However, conditions at all levels of government have resulted in some states making greater progress than others. In the next chapter, we attempt to explain why variations in progress have occurred.


Chapter Four

WHAT EXPLAINS PROGRESS TO DATE?

We identified a number of factors that contributed to the observed similarities and differences in states' implementation of Perkins II systems of measures and standards. Some of these explanatory factors are within the sphere of influence of federal policy; others are within the domains of states and local agencies. In Chapter Five, we will draw upon the federal factors we discuss in this chapter to make recommendations for changes in federal policy to improve the implementation of Perkins II.

FACTORS SUBJECT TO FEDERAL INFLUENCE

The first set of factors we identified includes elements of the 1990 Perkins legislation and other conditions that may be directly influenced by federal policy. These factors include the following:

Flexibility provided to the states in the law itself

Unclear integration and coordination of Perkins II provisions

Lack of models or incentives for state and local implementation

Limited expertise and resources at the state level

Mandated measurement of learning outcomes.

Modifying these features through legislative or administrative means may enhance future implementation of Perkins II measures and standards in the states. We discuss each of these factors in greater detail below.


Flexibility Provided to the States in the Law

The framers of the legislation sought to impose a common outcome-oriented program-improvement framework. They also wanted to enact policy that was sensitive to state differences and that permitted local adaptation, which research suggests should foster implementation (Berman and McLaughlin, 1978). As a result, Perkins II gave states considerable latitude in choosing measures, setting standards, deciding how much responsibility to delegate to local districts and institutions, and designing program-improvement interventions. This flexibility permitted states to create systems that were responsive to local conditions but also increased the influence of contextual factors, which had both positive and negative effects.

On the positive side, the flexible mandate engaged states actively in developing their systems of measures and standards, gave states the opportunity to develop a sense of ownership, and enabled states to adapt the accountability system to their existing program review and monitoring efforts, potentially reducing costly duplication of effort. Furthermore, Perkins II gave states complete control over the nature of state assistance to failing programs, so these efforts could be aligned with ongoing program review and improvement activities. On the negative side, flexibility has increased the influence of local context on the structure of the accountability system, reducing the comparability of state systems and program results. The act provided little guidance about the nature of the relationship that state agencies and local programs were to have in these new systems or about the state's role in providing technical assistance to local programs. Some states have done little or nothing to elaborate this relationship or build effective technical assistance procedures. In these cases, state discretion led to decisions that might not be considered to be in the spirit of the legislation or in the best interests of program improvement.

The overall result of the legislative flexibility is mixed. In 1994, we found little evidence that states had created a dynamic program-improvement culture based on outcome data, which many believe to be the intent of Perkins II. Furthermore, it appears to us that openness in the law, combined with strong state contextual factors, has lengthened substantially the timeframe for development and implementation. Even states that are moving in the right direction are moving more slowly than envisioned in the legislation.

Unclear Integration and Coordination of Perkins II Provisions

Although Perkins II appears to emphasize measures and standards, service to special populations, integrating academic and vocational curricula, and tech-prep education1 equally, state agencies treated these as separate requirements and assigned them different priorities. The four states we visited differed in the emphasis they placed on developing and implementing performance measures and standards relative to the other Perkins II initiatives. These differences in state priorities accounted for some of the differences we observed in progress toward systems of measures and standards.

Similarly, Perkins II is unclear about how performance measures and standards are to be used to evaluate its other new priorities: service to special populations, tech-prep education, and the integration of academic and vocational education. Perkins II offers little guidance about how any of these activities are to be coordinated. Furthermore, by giving states nearly two years to develop their performance measures and standards systems, Perkins II makes coordination more difficult.2 While state administrators were developing systems of measures and standards, state and local administrators were beginning to implement tech-prep education programs and integration strategies. The two-year development phase for performance measures and standards inhibited the use of these systems as evaluation tools for the other Perkins initiatives.

1 "The term 'tech-prep education program' means a combined secondary and postsecondary program which leads to an associate degree or 2-year certificate; provides technical preparation in at least 1 field of engineering technology, applied science, mechanical, industrial, or practical art or trade, or agriculture, health, or business; builds student competence in mathematics, science, and communications (including through applied academics) through a sequential course of study; and leads to placement in employment." (The Carl D. Perkins Vocational and Applied Technology Education Act Amendments of 1990, P.L. 101-392, Part E, Sec. 347.)

2 According to Rahn and Alt (1994), 45 states reported an increase in the amount of time spent developing performance measures and standards from 1990 to 1993 at the secondary education level. At the postsecondary level, 41 states reported an increase in time spent in this area.


Ideally, each state's system of performance measures and standards would be used to assess the effectiveness of vocational programs, including the performance of special populations, tech-prep education students, and students in programs with integrated academic and vocational curricula. In fact, Erie included specific language in its measures and standards to require their application to special populations. One can imagine an ideal accountability system that provides information on the performance of tech-prep education students, youth apprenticeship students, and each special subpopulation in a program. With such data, a local administrator would be able to compare the performance of subpopulations within a program, compare vocational programs within a school, and compare a particular program to overall school performance. Moreover, this system would facilitate coordination between academic and vocational teachers, by providing measures of relevant academic and vocational skills on each student. Most importantly, comparative program information would allow administrators and instructors to target program-improvement strategies. For example, if the tech-prep education program consistently exceeded all standards, an administrator might try to transform more vocational programs into tech-prep education programs.

Unfortunately, the Perkins II priorities (measures and standards, service to special populations, integrating academic and vocational curricula, and tech-prep education) are treated like disjointed programs, uncoordinated and at different stages of implementation. In most states, Perkins II priorities are not part of a coordinated system at either the state or the local level, and performance measures and standards are not being used comprehensively to evaluate the other Perkins II initiatives.

Lack of Models or Incentives for State and Local Implementation

There is a major gap between development of measures and standards (which usually occurred at the state level) and implementation of standards-based program-improvement systems (which must occur at the local level). Development dominated state agency efforts in the four states we visited. Certainly, it is necessary to select measures and standards and to create data systems to support them before these systems can be implemented. However, some state agencies had devoted so much attention to development that they had barely thought about subsequent stages of reform, such as how local administrators and instructors would use performance measures and standards to improve programs.

The situation in one state reminded a researcher of the void that occurred following the birth of his first child. All of his and his wife's attention had been focused on the delivery (for instance, on childbirth classes and exercises) with little or no thought to what would come after. Once the child was born, it came as a sudden shock to the parents that they now had to care for the child on a daily basis. It appeared that some states were in a similar position vis-a-vis Perkins II performance measures and standards. They focused extensively on the "birth" of measures and standards, but they devoted little time to thinking about implementing them or using them once they were created. States were unprepared for the next step, because the demands of the development stage overshadowed concerns about implementation.

This shortsightedness may have occurred because Perkins II modeled a two-stage reform but did not provide two stages of incentives. The Office of Vocational and Adult Education monitored the submission of state plans and the adoption of measures and standards, but there were no incentives for ensuring that performance measures and standards would be used at the local level to improve programs. Furthermore, while programs that did not make substantial progress toward the state standards for two years were required to develop a local improvement plan in conjunction with the state, there were no other explicit mechanisms in the law for monitoring local use of the measures and standards.

For example, Erie adopted nine measures and standards (six of which were based on data that were already being collected). Local agencies were required to select or develop measurement tools for the three new outcome areas. However, local administrators received no training in how to select appropriate instruments and have had very little contact from the state to see whether they were taking appropriate actions. Furthermore, the state has shown no interest in knowing whether they were using the measures and standards for program improvement. One postsecondary site responded by initiating a campuswide effort to develop new measures, but another did almost nothing. With very little guidance from the state, local implementation depended almost entirely on individual initiative at the local level.

Despite the lack of models or incentives provided in the legislation, one state agency succeeded in promoting the use of performance measures and standards at the local level, in part because the state administrator believed in the usefulness of performance data for local program improvement and seized the opportunity to implement such a system. The secondary vocational education agency in Piedmont adopted an approach in which local programs chose measures from among several outcome areas and selected their own specific measurement tools. Furthermore, the state agency provided training, developed guidebooks and report formats, and took other steps to make the local programs feel that their participation was important. Although the state agency did not offer rewards or levy penalties, it created an atmosphere that fostered local participation and, ultimately, implementation.

Limited Expertise and Resources at the State Level

The Perkins II performance measures and standards mandate created new responsibilities for state staff, requiring them to develop expertise in new and often unfamiliar areas. At the same time, the act also reduced the set-aside for state administration, to put more resources into direct services. This left some states in a bind: They lacked both the expertise to address new demands and the resources needed to develop it.

Vocational education agencies in these four states differed in the numbers of staff members; the sizes of the administrative budgets; the levels of expertise in the areas of psychometrics, statistics, and labor-market economics; and their capacities to implement local reforms. In Erie, a single individual was responsible for statewide implementation at the secondary and postsecondary levels, in addition to other duties. This severely limited the time and energy that could be devoted to performance measures and standards. In other cases, staff did not possess the expertise needed to develop a program-improvement system based on outcome data. For example, one state director commented, "My staff are in over their heads. There is not one trained research person on staff. We wanted initially to conduct a labor market analysis, but we just didn't have anyone who knew how to do it."

Some state agencies made performance measures and standards a high enough priority to allocate staff and funds to the effort; others devoted only limited resources to this area. Moreover, most states focused the bulk of their resources on developing their systems of measures and standards. Staff in some state education agencies complained about not having enough funds to provide the technical assistance necessary to implement performance measures and standards effectively. Our site visits revealed a continuing need for additional staff training in using outcome data for program improvement and a growing demand for technical assistance in this area.

Mandated Measurement of Learning Outcomes

Perkins II requires states to measure academic learning gain and suggests that states adopt measures of job or work skill and competency attainment as well. The scarcity of valid assessment tools in these areas led states to adopt alternatives that were less than optimal. For example, some states delayed or phased in their performance measures and standards, because these states lacked measurement tools in some areas. All four states we visited lacked one or more measurement tools needed to complete their systems. Some states assumed the responsibility for developing new instruments, while other states selected from among measurement tools that were available. All were deficient in some manner. Still other states chose from among available instruments but recognized their shortcomings and planned to revisit the choices in the future. In addition, the act does not include definitions for key terms, such as "basic" and "more advanced" academic skills, which has led to widely varying interpretations and approaches. As one state director put it, the performance measures and standards mandate is "putting the cart before the horse." He would prefer developing industry skill standards first, then appropriate assessments, and only last a system of measures and standards.

Alcorn and Columbia used standardized tests of academic achievement at the secondary level to measure basic academic skills. These tests have been criticized, because they do not measure the academic skills for which vocational programs should be held accountable and do not measure these skills in a manner suitable for vocational students. To address some of these concerns, one state was developing an academic assessment instrument specifically designed for vocational students.

In Erie and Piedmont, state agencies delegated to local agencies the responsibility for selecting and/or developing academic assessments. This decentralized approach defers difficult decisions to local programs, which are less likely to have the expertise required to tackle difficult measurement problems. This strategy raises concerns about the appropriateness of locally selected tests and the technical quality of locally developed measures. Decentralization of measurement choices also precludes the direct comparison of program performance.

None of the states in our study possessed statewide academic tests at the postsecondary level. The only postsecondary academic tests in these states were those used to place students into appropriate English and math courses at the time of enrollment. Piedmont and Erie delegated responsibility for selecting or developing measures of postsecondary academic skills to local institutions. The remaining two states used remedial course completion as a proxy for basic academic gain. One used general education and related academic course completion as a proxy for advanced academic gain. In both cases, it was assumed that students who received a passing grade in a course had gained the skills and knowledge covered by the course.3

Course completion has been criticized as a measure of academic gains, because it does not directly compare student achievement at two points in time. Furthermore, institutions usually define "successful course completion" as receiving a final grade of D or better. None of the instructors interviewed believed that a student who received a D could have mastered the body of knowledge contained in a course. Another criticism of measures of course completion relates to remedial courses. Some instructors felt it was inappropriate to hold vocational programs accountable for student gains in preprogram, remedial courses, while others noted that it was wrong to credit vocational programs for academic gains that occur in these courses prior to enrollment in a vocational program.

3 In a nationwide survey, more than one-third of all states at the postsecondary level reported using course completion as a proxy for academic gain (Rahn, Hoachlander, and Levesque, 1993).

FACTORS SUBJECT TO STATE AND LOCAL INFLUENCE

The second set of factors affecting successful implementation of the Perkins II system of measures and standards flows from the contexts of individual states and localities. As a result, these factors are primarily responsive to state or local action. Without state or local action, the following factors may continue to exercise a dampening effect on the results of federal policy:

Choice between central and local implementation strategies

Existence of centralized data systems

Availability of state-level assessment tools

Ongoing educational accountability and program review mechanisms

Historical relationships among agencies

Efforts to reduce burdens on local program staff

Influence of key administrators.

State responses to the Perkins II accountability initiatives have been influenced by local factors; some of these advanced the development and/or implementation of state performance measures and standards systems and others impeded it.

Choice Between Central and Local Implementation Strategy

A key factor that contributed to differences in state implementation is the division of responsibility between state and local agencies for developing and implementing measures and standards. Centralized approaches tended to be more efficient in the short term, but decentralized ones may have greater long-term potential for local program improvement.


In this study, state vocational education agencies with a history of centralized decisionmaking selected outcome areas, stipulated specific measurement tools, set standards, and identified criteria that would trigger local improvement plans. In contrast, states with a history of strong local control tended to confine their roles to selecting general measurement areas and defining broad standards. Local educational agencies and postsecondary institutions were given responsibility for selecting specific measurement instruments and for deciding whether their institution was making sufficient progress toward meeting state standards. In some cases, the approach to performance measures and standards differed at the secondary and postsecondary levels in the same state.

To illustrate, in Alcorn, the state staff (with input from the committee of practitioners) selected measurement areas, developed instruments, and established overall state standards and aggregate performance goals for the whole system. For example, academic achievement at the secondary level was to be measured using a standardized statewide test, and employment outcomes at both the secondary and postsecondary levels were to be measured using state unemployment insurance data during particular quarters following program completion. In a similar vein, Columbia required local recipients either to use state-developed measurement tools or to use state-defined methods of calculating performance on locally chosen measures.
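A simplified sketch can suggest what computing an employment outcome from unemployment insurance (UI) wage records involves; the record layout, student identifiers, and follow-up quarter below are illustrative assumptions, not Alcorn's actual specification.

    # Hypothetical sketch: matching program completers to state unemployment insurance
    # (UI) wage records to compute a placement rate. Record layout, identifiers, and the
    # follow-up quarter are illustrative assumptions, not any state's actual system.

    completers = [
        {"id": "S001", "program": "Welding", "completed": "1993Q2"},
        {"id": "S002", "program": "Welding", "completed": "1993Q2"},
        {"id": "S003", "program": "Nursing", "completed": "1993Q2"},
    ]

    # UI wage records report earnings for each person in each calendar quarter.
    ui_wage_records = {
        ("S001", "1993Q4"): 3200.00,
        ("S003", "1993Q4"): 4100.00,
    }

    def placement_rate(completers, wage_records, follow_up_quarter):
        """Share of completers with any reported UI earnings in the follow-up quarter."""
        employed = sum(1 for c in completers if (c["id"], follow_up_quarter) in wage_records)
        return employed / len(completers)

    print(placement_rate(completers, ui_wage_records, "1993Q4"))  # 2 of 3 completers, about 0.67

One attraction of this approach, echoed later in this chapter, is that it draws on data another agency already collects rather than adding new reporting work for local programs; on its own, however, a wage record shows only that a completer had earnings in the quarter, not what kind of job produced them.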

In contrast, Piedmont and Erie, with stronger traditions of local control, delegated responsibility for developing or selecting some or all measurement instruments to local recipients. (Standards were set centrally in these two states.) In Piedmont, local school districts were allowed to develop their own academic and occupational competencies and measurement instruments or to adopt state-developed ones, while districts were required to administer their own program follow-up surveys. In the four states we examined, where local agencies were responsible for development and/or implementation of some measures and standards, more delays, confusion, and technical problems occurred than in states where these decisions were made centrally.

Although systems with increased local responsibility faced early difficulties, one state may be poised to achieve greater success because of the investments the state made in local expertise. In contrast, the early successes of systems with more centralized control may diminish if steps are not taken to engage local administrators in the program improvement process. For example, the data generated by the centralized performance measures and standards system in Alcorn were rarely being used by instructors to improve their course offerings. Although some district and institution administrators were using the data to identify broad problem areas, instructors stated that the data were too voluminous, complicated, and general to provide them with the information they needed and desired to improve their programs. Rather than numerous pages of tabular data, instructors preferred a simplified presentation format, and rather than an overall employment rate for program completers, they desired information on the specific employers who had hired their completers, on the specific jobs those completers had taken, and on the skills they were using or needed in those jobs. Furthermore, several administrators had not yet passed the performance measures and standards data on to their instructors, believing the instructors would be overwhelmed, and some spoke candidly that they did not expect their instructors ever to use the data.

In contrast, at the secondary level in Piedmont, the highly decentralized performance measures and standards system led to widespread use by instructors of related academic assessments and the performance data generated by them. Because these assessments had been developed locally with support and encouragement from the state, instructors found them relevant for assessing student outcomes and improving programs. Furthermore, instructors and administrators found the performance report formats supplied by the state useful and understandable.

These examples illustrate strong differences in local use of performance measures and standards related to the degree of centralized or decentralized development, but the differences are not always so great. A centralized approach may be more efficient, may lead to more elaborate and well-organized data, and may provide greater interprogram comparability. A decentralized approach may help ensure that performance measures and standards are relevant locally and may lead to greater local participation in and commitment to the program review and improvement process.


Existence of Centralized Data Systems

Statewide student databases and data processing capabilities affected state responses to Perkins II performance measures and standards initiatives. Agencies with centralized data systems were more likely to assume a strong role in collecting and reporting performance measures and standards than those without such systems. Although this is not always true, states with more centralized recordkeeping experience and the computer capacity to maintain and analyze large-scale databases were more likely to construct centralized data systems in response to Perkins II.

For example, in Alcorn and Columbia, the state vocational education agencies obtained data for some performance measures from other educational departments or state agencies, compiled those data, and reported them to local school districts and community colleges. For other measurement areas, local administrators reported data to the state vocational education agency, which then summarized performance on the measures and standards and returned the data to their originators in a revised form. In both cases, the state vocational education agencies played an important role in either collecting or reporting the data. As a consequence, even though many of the data were locally generated, administrators and instructors in these states viewed performance measures and standards data as belonging to the state. One postsecondary dean said, "The data don't match ours, but this lets us know how the state views us."

In contrast, at the secondary level in Piedmont, the state vocational education agency played only a minimal role in data collection and reporting. Local school districts compiled their own data, tallied the appropriate statistics, and reported them to the state agency on standard report forms that had been supplied by the state. Local administrators and instructors we interviewed in this state were more knowledgeable about and interested in the performance measures and standards than local staff in other states.

A second consequence of a centralized data system is that it contributed to states focusing more on the data themselves than on how the data could be used. For example, Alcorn's data-computing capacity led state-level vocational education staff and the committee of practitioners to envision a sophisticated system of performance measures and standards that would provide local school districts and community colleges with a broad range of outcome data at many levels of aggregation, including district, school or institution, program, and special populations' status. However, while devoting substantial time to developing this data system, state staff spent less time thinking through how the data would be used at the local level to improve programs. As one state-level administrator stated, "We've provided them with all the data; now all they need to do is use it." Although the system succeeded in generating hundreds of pages of data per school or community college district, there was little evidence that the data were being used at the local level, particularly by instructors. At the end of the second year of implementation of performance measures and standards, state-level staff still placed a higher priority on improving data quality than on improving local use for program evaluation and improvement.
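A minimal sketch of what multi-level aggregation of a single outcome involves may help make the scale of such a system concrete; the records, field names, and completion outcome below are invented for illustration and do not reproduce Alcorn's actual data system.

    # Invented illustration: aggregating one outcome (program completion) at several
    # levels of aggregation, such as district, school, and program. Field names and
    # records are hypothetical, not any state's actual data system.
    from collections import defaultdict

    student_records = [
        {"district": "North", "school": "Central HS", "program": "Welding", "completed": True},
        {"district": "North", "school": "Central HS", "program": "Welding", "completed": False},
        {"district": "North", "school": "Central HS", "program": "Nursing", "completed": True},
        {"district": "South", "school": "Valley HS", "program": "Nursing", "completed": True},
    ]

    def completion_rates(records, group_by):
        """Completion rate for each group defined by the listed record fields."""
        totals, completed = defaultdict(int), defaultdict(int)
        for record in records:
            group = tuple(record[field] for field in group_by)
            totals[group] += 1
            completed[group] += record["completed"]
        return {group: completed[group] / totals[group] for group in totals}

    print(completion_rates(student_records, ["district"]))                      # district level
    print(completion_rates(student_records, ["district", "school", "program"])) # program level

Crossing many such outcomes with many grouping levels and subpopulation breakdowns is how a report can swell to the hundreds of pages described above, which helps explain why instructors found the output too voluminous to act on.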

In contrast, secondary-level state staff in Piedmont had minimal capacity to collect and manipulate data. They devoted their time to developing guidelines for local agencies to follow and to providing other technical assistance to support local development efforts.

Availability of State-Level Assessment Tools

As noted above, certain provisions of Perkins II create assessment problems for all states (for example, the requirement that state systems include measures of academic learning gains has been a universal stumbling block), and it is appropriate for the federal government to take steps to address these concerns. However, other aspects of assessment affect the implementation of measures and standards at the state and local levels in ways that may not be addressed best by federal action. In particular, the interplay between a state's choice of measures and standards and the state's existing testing system affects the implementation of the measures and standards. Furthermore, states that delegated decisions about vocational measures to local agencies added another layer of sensitivity to state and local assessment experience and practice.

States' actions in implementing measures and standards were affected by the prior existence of relevant state assessment tools. Where such measurement tools existed, states generally incorporated them into their performance measures and standards systems, although the use of existing measures had mixed results. For example, Columbia and Alcorn relied on existing statewide standardized tests of academic achievement to satisfy the requirement to measure "learning and competency gains, including student progress in the achievement of basic and more advanced academic skills" at the secondary level. These existing academic tests served as readily available measures, but they also had several limitations. For one, tests in both states assessed academic attainment in early high school; neither test had the capacity to measure the gains students achieved throughout their high school years. Both state agencies planned to phase in an additional test to measure gains, but neither had yet done so. Second, these standardized academic tests were not designed to measure academic skills specifically related to occupational competencies, which many vocational educators have argued should be the focus of Perkins II performance measures and standards. Third, instructors do not always find results relevant to their courses. Currently, neither state plans to develop tests of occupationally related academic skills.

In contrast, Piedmont's secondary-level state staff focused on helping instructors develop measures of the relevant academic skills and occupational competencies appropriate for specific local programs. The state also sponsored the development of statewide prototype tests that instructors could choose to implement in lieu of developing their own local tests. Although developing these tests took substantial time and staff resources and although questions of test validity and reliability remain to be addressed, Piedmont's vocational educators were able to produce tests that measured both student gains and related academic skills.

No statewide measure of academic performance existed in Erie, so state staff required local recipients at both the secondary and postsecondary levels to choose or develop appropriate measures of academic and occupational skill gains. Although state staff identified the skills that should be measured in general terms, they provided little assistance to local agencies in selecting appropriate assessments. Some staff questioned whether the instruments that had been selected by some local agencies would adequately assess the intended skills.


At the postsecondary level, none of the states in our study possessed a statewide measure of academic skills. Although Erie required local recipients to seek or develop an appropriate measure, the other three states measured academic gains through course completion. For example, states counted successful completion of developmental or remedial courses or of general education or related academic courses as academic gains. However, state-level staff in Alcorn were developing tests of basic reading and mathematics skills at the postsecondary level.

The lack of relevant instruments for measuring academic and occupational competency led states to adopt alternative strategies for measuring learning outcomes. These alternatives often were not the most appropriate. Furthermore, difficulties in measuring academic and occupational skills forced states to slow implementation of this portion of the Perkins II mandate. In a few cases, Perkins II requirements spurred states to develop their own instruments, which appeared to be suitable at first glance.

Ongoing Educational Accountability and Program Review Mechanisms

States' reactions to Perkins II were strongly influenced by the educational accountability context that existed within the states. Some states had strong accountability mechanisms that attached incentives or sanctions to educational performance, although weaker accountability systems were more common. All states had program review and accreditation procedures designed to monitor and/or certify educational quality. Perkins II measures and standards served a similar purpose, and some states tried to consolidate Perkins II accountability with existing efforts. However, this was not always the case, and it is interesting to explore the degree to which states tried to integrate these activities.

Erie had established an accountability system based on student outcomes for both secondary and postsecondary vocational education several years prior to passage of the act in 1990. In response to Perkins II performance measures and standards requirements, the state added three new measures to the six they had already been collecting. By folding performance measures and standards into an existing accountability system, the state was able to respond quickly to the Perkins II mandates. It is also possible that by attaching Perkins-specific measures to an established system, local vocational administrators and instructors would be more likely to incorporate them into their local accountability processes. However, there was little evidence that the existing accountability system was being transformed into the program improvement process envisioned by Perkins.4

Alcorn already had several components of an accountability system for elementary and secondary education in place prior to Perkins II. Specifically, the state had been issuing school report cards for a number of years and had recently initiated a review process whereby schools regularly evaluated their performance in state-specified areas based on locally selected measures. However, neither the report cards nor the review process incorporated outcome measures relevant to vocational education. While state vocational education staff members have begun a dialogue with general education staff to bring Perkins II efforts closer to other strictly academic accountability efforts, by 1994, performance measures and standards had been developed and implemented largely in isolation from the general accountability initiatives.

Previous experience with statewide school report cards had both positive and negative effects on the development of Perkins II measures and standards in Alcorn. On one hand, state and local administrators, instructors, and members of the committee of practitioners were already familiar with the idea of using outcome data to evaluate institutional performance, which may have facilitated development of a system of performance measures and standards. On the other hand, this same experience may have led state staff to ignore how the data would realistically be used locally to improve programs. As an accountability tool, the school report card was used largely by school administrators as "a public relations piece," in the words of one superintendent, primarily in discussions with school boards and parents. Instructors paid little attention to these data, because they were not provided in a form useful to them and because they were used to make comparisons between districts the instructors felt were unfair. As with the school report cards, state-level staff passed the performance measures and standards reports down through the secondary and postsecondary education hierarchies, expecting them to reach instructors who would then use the information to improve programs. However, by the end of the second year of implementation of performance measures and standards, few instructors had ever seen the data, and there was widespread skepticism among local administrators that instructors would ever use them.

4 Early evidence indicated that, without follow-through and assistance from state staff, local programs were largely ignoring the requirement for new measures. Because the state had been responsible for collecting the six extant measures and had done little to encourage use of the data, local administrators were paying scant attention to the measures as well.

In addition to these accountability initiatives, states also had other evaluation and program review systems in place prior to Perkins II. Most state vocational education agencies followed a multiyear cyclical program review process at both the secondary and postsecondary levels. State-level vocational education staff indicated that performance measures and standards data would likely be incorporated into the multiyear program review process.

Additionally, all postsecondary institutions were subject to periodic reviews by independent regional accreditation agencies. While the cyclical program reviews traditionally focused on process measures, such as vocational enrollments and sex-equity issues, the regional postsecondary accreditation agencies have recently begun moving toward outcome-based evaluations. Evidence in several states indicated that Perkins II performance measures and standards were taking a back seat to the accreditation process at the postsecondary level, particularly in those institutions that recently went through the new outcome-based accreditation process. In part because accreditation is crucial to the viability of an institution, accreditation requirements were receiving greater attention than Perkins II. Furthermore, the accreditation process is generally well regarded by postsecondary institutions, because it permits them to evaluate themselves in terms of their own goals. In contrast, some feel that Perkins II performance measures and standards are centrally dictated without appropriate institutional contribution. However, since accreditation reviews sometimes occur as infrequently as once every ten years, the annual Perkins requirements may gain in importance over time.


Other federal accountability requirements also influenced states' responses to Perkins II. For example, recent federal student right-to-know legislation requires that postsecondary institutions receiving federal financial aid track the educational experiences of a cohort of full-time students. At least one state in our sample tried to consolidate requirements by using this same cohort in its Perkins II performance measures and standards system. However, because first-time, full-time students are often a minority at community colleges, instructors generally found the performance data they received useless for program improvement. For example, one Piedmont instructor had only three students in the selected cohort (and one of these had been sent to prison). The instructor did not believe that the resulting 33 percent incarcerated outcome rate accurately reflected his program performance and did not find that it provided him with useful information for improving his program.

In summary, the relationship between existing accountability initiatives and development of Perkins II performance measures and standards was complex. Combining these various efforts may ultimately bolster them all. However, Perkins II emphasizes program improvement, while many of the other initiatives do not. This emphasis may be lost if Perkins II accountability is joined with systems that use performance data for other purposes. Furthermore, where federal regulatory demands are in conflict, practical constraints may force states to satisfy one requirement at the expense of another.

Historical Relationships Among Agencies

Existing governance structures and collaboration patterns among state agencies affected states' responses to Perkins II requirements. For example, unified secondary and postsecondary vocational education agencies or agencies that were housed in the same state office produced coordinated Perkins II efforts: Erie agencies adopted a single set of performance measures and standards for both levels of education, while Alcorn agencies adopted parallel sets of measures and standards and worked together to develop new outcome data. In contrast, Columbia secondary and postsecondary vocational education agencies historically acted separately, and they developed wholly different and separate performance measures and standards systems. In Piedmont, the secondary vocational education agency took the lead in developing its performance measures and standards system, after which the postsecondary agency scaled down the secondary system for its own implementation.

In addition to historical relationships between secondary and postsecondary vocational education agencies, responses to Perkins II requirements were also affected by existing relationships between these agencies and those responsible for general education, the Job Training Partnership Act (JTPA), and other workforce programs. For example, the Alcorn state vocational education agency operated largely separately from the elementary and secondary education agency and the community college board. They had to establish formal links to obtain data from these agencies, a process that was facilitated because of historically good relations among them. In some cases, the federal performance measures and standards initiative provoked new inter-agency collaboration. For example, the Perkins emphasis on placement outcomes spurred the Alcorn vocational education agency to collaborate with the state's JTPA agency, which had substantial experience using state unemployment insurance data for obtaining employment and earnings outcomes.

Efforts to Reduce Burden on Local Administrators

Reducing the data-collection and reporting burden on local administrators was foremost in the minds of many state vocational education agencies when they designed their systems of measures and standards. This approach appeared to expedite the selection of performance measures and standards and the collection and reporting of data and to increase initial acceptance of performance measures and standards by local administrators. However, efforts to reduce the burden of measures and standards may also reduce the effectiveness of the system in the long run by limiting local familiarity and "buy-in." Moreover, the focus on burden reflects a narrow perception of the system of measures and standards as a reporting exercise, not as an essential core activity in a meaningful program management and improvement system.

Three of the four states we visited attempted to design their measures and standards to minimize data-collection demands on local institutions. In Columbia, state staff at the postsecondary level "made every effort" to adopt measures for which data were already being collected by local institutions. In Erie, six of the nine measures adopted were already being collected. Similarly, Alcorn adopted measures for which local schools already collected data or measures that could be satisfied by data obtained through another state agency. All were proud of these actions and felt they expedited the process of developing their state systems of measures and standards.

On the other hand, reducing the burden of data collection and reporting also reduced local awareness of the reforms, particularly for instructors. Well into the second year of implementation of Perkins II measures and standards, few instructors in Columbia, Erie, and Alcorn had either heard of performance measures and standards or understood what their role was to be in using them. Furthermore, in at least one case in which performance measures and standards data had been fed back to administrators or instructors, few found the data relevant or felt committed to using them.

In sharp contrast, high school instructors in the sites we visited in Piedmont responded well to the "burden" that was placed on them to develop and implement local assessments of related academic skills and occupational competencies. While some local personnel initially resisted the new requirements, most became more positive with time and with continued encouragement and support from the state. In this case, local educators who were invited to participate in the process and now have greater ownership over the results responded well to the additional responsibilities placed on them by federal or state mandates. Furthermore, local educators who were involved in data definition, instrument development, and the like were in a better position to use the data when they were produced. Protecting local educators from the burden of developing a new system may ultimately lessen their capabilities and incentives to use it.

Influence of Key Administrators

The presence or absence of strong leadership affected the nature of states' responses to Perkins II and the progress states made in developing and implementing performance measures and standards. In most of the states, a single individual had primary responsibility for the implementation of performance measures and standards (at the secondary or postsecondary level, or for both). The commitment of and approach used by this key administrator largely determined the progress made by the state. At one extreme, some key state vocational education staff acted decisively to develop expanded systems. For example, a key administrator in Columbia took advantage of Perkins II mandates to promote reforms she had long wanted to implement. At the other extreme, some key administrators took only those few steps that were necessary for compliance with the law.

These four states revealed dramatic contrasts in the speed and effectiveness with which administrators implemented their statewide systems. Sometimes, secondary and postsecondary systems within the same state varied dramatically. For example, the primary state-level administrator for secondary vocational education in one state was an active and effective change agent who seized upon the Perkins II mandate as an opportunity to promote greater local accountability, a cause in which he believed strongly. He supplied a vision of how the reform would work, rallied enthusiasm for it, and provided support to those who were responsible for enacting the changes. In contrast, the administrator for postsecondary vocational education in the same state did very little to encourage local implementation, stating candidly that he expected the performance measures and standards requirements to go the way of many national initiatives and be forgotten within a few years.

In another state, the vocational coordinator took the performance measures and standards mandate seriously, but kept his role to a minimum. This key administrator communicated with local staff about Perkins II requirements primarily in writing, assuming that school-level staff would take his general guidelines and use their own initiative to select and implement several new performance measures. However, since the state devoted minimum attention to training and support, few schools took steps to implement the new measures. Some local administrators called colleagues to find out what they were doing in response to the state guidelines. When they learned that others were waiting for further direction before acting, they held up their own actions, effectively putting the entire initiative on hold.

Although leadership styles varied widely, most of the administrators we interviewed actively promoted the implementation of performance measures and standards. However, the power of individuals to influence the development and implementation of these systems in both positive and negative ways was demonstrated repeatedly.

FUTURE CONCERNS

Our ability to predict is limited, but some potential problems already seem likely. The following issues and events are likely to have a major impact on statewide systems of measures and standards in the near future. They deserve careful consideration in federal and state vocational education planning.

Skill Standards and Curriculum Standards

The Departments of Education and Labor are funding the development of industry skill standards in more than 20 occupational areas and curriculum standards in six educational fields. What will these efforts mean for statewide systems of measures and standards? Certainly, valid national standards deserve consideration by states. They might provide invaluable common criteria for occupational preparation. However, no one knows how best to accommodate such standards in systems of measures and standards, or how difficult (or expensive) such coordination might be. What supplemental work will be required to integrate industry standards and curriculum standards with state systems (e.g., connection between state occupational task lists and industry standards)? How will this work be supported?

Not only will the development of industry skill standards have an effect on statewide systems of measures and standards, but the organizations developing the industry standards need to be informed about the Perkins II measures and standards systems used in vocational education. The key question is, what effect should states' systems of measures and standards have on the structure of the industry skill standards? It is important that those developing the industry standards be aware of the accountability mechanisms states have been creating under Perkins II.


Data Quality

From 1991 to 1994, states devoted most of their energies to selecting and adopting measures and standards and developing data systems to support them. As yet, little attention has been paid to the quality of the measures and standards.5 As soon as statewide systems become operational and programs are required to judge their performance against state standards, questions of reliability and validity will become more important. In states that adopt high stakes for performance (as in the case of Columbia), potential data quality problems will be exacerbated, because high stakes are known to corrupt the interpretation of educational data. The need for evaluations of data quality will increase once systems are fully implemented. However, states are unlikely to have the resources or expertise necessary to conduct these investigations. If these issues are not resolved, the whole enterprise may suffer, so it is important for the federal government to develop mechanisms to facilitate assessment of the quality of measures and standards.

Integration of Academic and Vocational Education

Potentially the most far-reaching reform embodied in Perkins II is the integration of academic and vocational education. This movement raises several problems for defining, measuring, and attributing outcomes and therefore threatens the validity of existing systems of measures and standards. First, states are unlikely to have included integrated outcomes in their current systems, so major changes will be required. Second, the development of measurement tools for integrated outcomes will probably lag behind the development of curriculum. Third, the conception of a vocational program will have to be changed to include academic components. Fourth, it will be difficult to attribute outcomes to specific courses and to use outcome data for program improvement. Although fully integrated programs are many years away, many states already have begun implementing some integrated academic curricula, and these states are already facing problems defining and measuring outcomes. Early attention to the questions raised by integrated academic and vocational education is warranted.

5This is one of the questions being addressed by the Battelle study of Perkins II. However, that study relies on data collected in 1993, before most state systems were operating on a regular basis.

Consistency in Federal Vocational Education Policy

Some state vocational educators explained their less-than-enthusiastic response to Perkins II by saying, "It's going to go away." They believe new laws will supplant many of the initiatives contained in Perkins II. When implementers believe mandates are only temporary, they reduce their efforts and adopt a mind-set of minimum compliance rather than commitment. Unfortunately, the volatility of federal vocational education policy discourages rapid and effective response to federal initiatives. Long-term change is unlikely under these circumstances.

Perkins II measures and standards embody a radically different approach to accountability. It may take five years of implementation for a fair test of this initiative. However, there already is talk within the states that drastic changes in the regulations are imminent. As a result, some administrators are curbing their efforts; many are proceeding with caution. To promote effective implementation of measures and standards, the federal government must make it clear that it intends to stay the course. Further, it must give vocational educators ample time to develop, implement, and fine-tune the system before making judgments about its efficacy. A minimum of five years is required to see whether outcome-based local program improvement in the Perkins II model will work or not.


Chapter Five

RECOMMENDATIONS

This research suggests specific actions federal policymakers should take to improve the implementation of Perkins II measures and standards. Although our case studies were limited to four states, both our informal conversations with other vocational educators and previous research on the implementation of Perkins II measures and standards suggest that these concerns are widespread enough to warrant such actions. Potential responses include changes in regulations, investments in the solution of common problems, and new capacity-building initiatives at the state and local levels. These recommendations follow directly from the factors identified as being subject to federal influence in the previous chapter.

CLARIFY THE INTERRELATIONSHIPS AMONG PERKINS II MANDATES AND THE COORDINATION OF PERKINS II INITIATIVES

Perkins II requires that states emphasize systems of measures and standards, the integration of academic and vocational education, tech-prep education, and service to special populations, but it does not tell states how these activities should be interrelated, nor does it suggest how states should coordinate their efforts. As a result, states implement activities in isolation and assign them different priorities. For example, some put tech-prep education on hold while focusing on measures and standards; others assign their most talented staff to tech-prep education while delegating measures and standards to one individual.


Perkins II gives little direction to states about how to coordinate their efforts on measures and standards, tech-prep education, integration of academic and vocational education, and special populations. The omission of models for coordination decreases the potential effectiveness of the act, particularly the effectiveness of the systems of measures and standards. States need guidance regarding how to make their systems of measures and standards support their efforts to integrate academic and vocational education, create tech-prep education programs, and promote service to special populations. For example, performance measures and standards could be seen as a mechanism for evaluating all the Perkins II priorities. At present, states do not always see how the required elements can be interrelated, e.g., how needs assessment fits into the state plan, how the state plan fits with the local application process, and how the local application process fits into the annual review cycle. Neither do they see how these are all elements of an evaluation system that applies to all the Perkins II initiatives. Instead, there is wide variation in the relative priority assigned to these four critical initiatives and little coordination. Changes in the language of the legislation could clarify these relationships.

CREATE MODELS FOR OUTCOME-BASED PROGRAM IMPROVEMENT

To date, states have emphasized the "nuts and bolts" of creating systems of measures and standards. Over time, they need to build on this foundation and focus on using these systems for program improvement. At present, most state action is still driven by the mandate to develop a structure for accountability, i.e., a system of measures and standards. The law says little about how that structure is supposed to be used to make programs better. Sites must develop program improvement plans, but nothing suggests how one translates outcome deficiencies into action plans either at the local level or at the state level if the joint planning provision comes into effect. We talked to one state administrator who had begun to tackle this question by developing a set of specific questions linked to the state's measures and standards for use at a site to organize its search for improvement. However, such thought about the use of measures and standards was rare.


One way to encourage states to build toward program improvement is to require them to develop models for using outcome-based information for program reform. Certainly, it is time for states to think seriously about this phase of accountability. Yet, some states might find any additional requirements distasteful, and many would argue that additional paperwork is counterproductive. The alternative we prefer would be to commission another agency to collect examples of effective practices and disseminate them widely for states and local programs to use.

PROVIDE FOCUSED TECHNICAL ASSISTANCE REGARDING CHOICES AND RESOURCES

Perkins II gives states far greater programmatic discretion while limiting the resources they can devote to administration. This places greater demands on state administrators, but limits their capacity to act. Under these circumstances, federal actions that helped states respond to their choices and make better use of resources might significantly improve implementation of the act.

The "flexible mandates" contained in Perkins II placed demands on states and local agencies but offered no guidance in how to meet those demands. States were required to create systems of measures and standards with certain properties but were allowed considerable discretion in the design of these systems, including the choice of outcomes, the selection of measurement tools, and the setting of standards. Many lacked the expertise to make these decisions wisely. Furthermore, some states delegated many of these choices to local agencies.1 This cascade of delegated decisionmaking placed new and unfamiliar demands on state and local staff. Few had any previous experience implementing outcome-based accountability systems in vocational education.

As a result, states looked to the U.S. Department of Education for assistance in clarifying the meaning of the regulations and providing expertise in solving problems of measurement, assessment development, and standard setting. Unfortunately, little help was offered.

1This led to considerable variation of measures and standards between states as well as large within-state variation in selected states.


The federal agency focused its resources on compliance monitoring (for all the provisions of Perkins II) rather than on providing guidance about measurement problems or acting energetically to increase a state's capacity to respond through training. Other agencies stepped in to try to address the most pressing problems (e.g., NCRVE workshops on selecting measures and setting standards). This support was helpful, but it did not satisfy the demand. Furthermore, the demand for technical assistance still exists, as many states lag behind the four we visited in implementing their systems, and some of the more difficult measurement issues remain unresolved.

A further demand for technical assistance arose within those states that delegated the choice of measures to the local level. Those with ultimate responsibility for deciding on measurement tools and procedures lacked the expertise to make these decisions and looked to the state as well as to their colleagues in other districts for assistance. The states' record in responding is mixed. Piedmont provided adequate technical assistance by training district vocational education administrators, who then trained program instructors. Erie did not.

Furthermore, we predict that a new set of technical assistance needs is on the horizon in the area of program improvement. Statewide systems will be operational on a large scale in the 1994-95 school year, and local programs will begin to compare their performance to state standards. Few at the local level have any experience using outcome data for creative program planning, so programs that do not make the grade will look to their states for guidance with program improvement. States, in turn, will turn to the federal government for assistance, for they too lack expertise in using performance measures as program-improvement tools.

The federal government should act now to address these needs by creating a mechanism for providing relevant technical assistance to states. In the long run, the goal should be to build capacity at the state level for using the standards framework as a program improvement tool. One approach might be to press for greater integration of vocational education measures and standards with existing state program review procedures. Because many states have the same questions, we believe the most efficient way to provide these services is centralized, but we would not preclude other options. Whatever the form, resources should be devoted to solving measures-and-standards problems and increasing the capacity of states to make their systems work. One state coordinator described his experience with statewide measures and standards as "trying to build a 747 in flight." Trying to do this without appropriate technical assistance and support is a prescription for failure, and much more could be done to promote success.

Resource constraints exacerbate the problem. Perkins II places greater demands on state agencies than its predecessor, but restricts the use of funds for state-level services. This comes at a time when many state administrative budgets are also being reduced. State administrators believe they lack the resources necessary to respond to Perkins II mandates effectively. These limitations are particularly severe at the postsecondary level, where some states allocate very small proportions of their Perkins II funds. Some efficiencies are possible through better coordination of efforts on measures and standards, tech-prep, integration, and special populations, and federal efforts to solve difficult measurement problems will also lessen the demands on state agencies. Furthermore, states could be encouraged to increase local participation in the technical assistance process over time, substituting regional networks and collegial assistance for some of the centralized expertise required initially.

Other resource problems might be addressed by increasing administrative allocations for certain activities. We are not advocating a return to the previous model, which many felt diverted too many dollars from students to central bureaucracies. As we see it, the need for supplemental resources and for technical assistance is not constant and can be fulfilled in many ways. Therefore, the response can be flexible. State departments of education need additional resources during times of initial development. We suggest increasing the funds available for state administration during the start-up phase, so states can meet initial demands and develop some of the expertise they will need to operate reformed vocational education systems.

ADDRESS COMMON MEASUREMENT PROBLEMS

One reason the implementation of Perkins II has been so challenging to states is that the vision embodied in the guidelines exceeds the capacity of existing measurement tools. In particular, the technology to measure learning and occupational performance in reliable, valid, and efficient ways is not widely available. For example, although much work is being done to develop performance measures of occupational skills, this is an emerging assessment area with few operational systems. The situation is worse in some areas, e.g., measuring gains in advanced academic skills appropriate for the postsecondary level, where none of the four states found existing measurement tools or had the expertise to develop their own direct measures.

Most states are not equipped to develop tools for measuring learning and occupational outcomes, and it is unfair to drop this difficult task in the lap of state vocational educators. The federal government needs to take responsibility for addressing these problems, since they are largely the result of the provisions of Perkins II.

At the same time, the federal government is supporting a variety of new assessment initiatives whose relationships to Perkins II measures and standards are unclear. These range from the development of more than 20 sets of industry skill standards to the establishment of national goals and curriculum standards for academic education. There should be some connection between the assessment demands of Perkins II and the assessment development efforts being funded separately. At a minimum, states need guidance about how the skill standards should relate to their statewide systems. Other assessment development projects to meet the specific needs of Perkins II should also be considered.

CONCLUSIONS

The 1990 amendments to the Carl D. Perkins Vocational and Applied Technology Education Act entailed major changes in federal vocational education priorities and practices. Among the most visible innovations in Perkins II is the creation of statewide systems of measures and standards of performance. The purpose of these systems is to create an objective, outcome-based framework for program evaluation that can be used as the cornerstone of local program improvement efforts. We found that states had made much progress creating the structure of measures and standards but had devoted much less attention to creating a process for using standards as program-improvement tools. We also identified a number of factors that affected state actions on measures and standards, many of which reduced the likelihood that state systems would function as envisioned in the law.


Some of these factors are subject to federal intervention, and we identified steps the federal government could take to ameliorate the situation. Incorporating these changes into the reauthorization of federal vocational education will increase the efficacy of statewide systems of measures and standards. Legislators also should anticipate difficulties and conflicts that may be created by pending reforms of education, training, and workforce preparedness programs, and they should work to coordinate the accountability requirements of all these initiatives.


Appendix A

STATE INTERVIEW GUIDE


RAND/MPR INTERVIEW GUIDE

Secondary/Post-Secondary Differences

Clarify the appropriate "level(s)" for the interview. Is the respondent knowledgeable about secondary vocational education, post-secondary, or both? Is s/he willing to discuss secondary issues, post-secondary, or both? [Adjust questions accordingly.]

Nature of statewide measures and standards

I mailed MPR's summary description of [state's] standards and measures and asked you to update it to reflect any errors or changes. [Show document] Are any revisions necessary?

Are these measures or standards likely to change this year? If so, how?

Are there plans to review or revise the measures / the standards in the future? If so, what will be done?

States can use their measures and standards for many purposes, for example, evaluating instructor effectiveness; which purposes are most important in [state] at the secondary / post-secondary level?

(Other purposes: monitoring statewide performance, holding programs accountable for outcomes, allocating resources, providing individual student information)

Can you rank the purposes in order of importance? How do you know?

Importance of measures and standards vis-à-vis other vocational education initiatives

How important are measures and standards relative to other vocational education efforts?

relative to other Perkins requirements, such as Tech Prep, integration of academic and vocational education, special populations, etc.?


relative to other state-initiated vocational efforts?

important in terms of actions (e.g., staff activities, resource commitments)

What kinds of activities have occurred? Can you give examples?

important in terms of long-range impact (e.g., broader reform agenda)

Which changes will be the most lasting? Why do you think this is so?

Note: Clarify secondary / post-secondary differences

Clarify importance in terms of the interests of special populations

Would [state] have taken these actions if there had not been a federal requirement?

What evidence is there that [state] was moving in this direction on its own?

Are measures and standards supportive of or antagonistic to other vocational education reform efforts?

Which other vocational initiatives? How are they complementary or how do they conflict?

Integration of vocational education accountability with other educational reforms

Looking at educational reform more broadly, what are [state's] current reform efforts in general and academic education?

How do these initiatives relate to accountability in vocational education?

For example,

Is accountability an important issue in general and academic education? What is being done to promote accountability? How have vocational education measures and standards affected this process? Why or why not?


Is reform of assessment an important item in general and academic education? What assessment reforms are occurring? How do they relate to vocational education measures? Why or why not?

Is [state] trying to develop or improve its student data system for general and academic education? What efforts are under way? How do these relate to vocational education data collection? Why or why not? Can you identify special populations in your current data system?

Are there new programs to provide services to handicapped students, disadvantaged students or students with limited English proficiency? What is the nature of these programs? How do they relate to the requirements for service to special populations in vocational education? Why or why not?

(Possibilities: general and vocational reforms are contradictory, are not related at all, are using similar approaches but remain separate, are integrated into a single system)

Do you think vocational education accountability will play a greater or lesser role in general education reform in the future? Why?

Implementation of measures and standards

What steps have been taken to implement performance measures and standards in [state] from the time they were adopted until the present?

What have state agencies done in the areas of:

Dissemination of information about the measures and standards?

Staff Development and Technical Assistance?

Data Collection?

Reporting?

Monitoring state and local actions?

Which agencies/personnel have been involved?


What actions have they taken?

How do these activities differ from the activities they were doing in the past?

What level of resources is being devoted to the implementation of standards and measures? (e.g., How many staff are involved?)

What have local programs done in these same areas? How does this differ from what they did in the past?

What are the major problems that have been encountered in implementing measures and standards?

How have attitudes toward measures and standards changed as the system has been developed and implemented? Improved? Grown more hostile?

Technical issues regarding measures and standards

What is the state doing to assess the reliability and fairness of the measures?

Do you have any specific information about the quality of the measures?

How important is comparability of data between programs?

Will your system provide comparable data at the program level? How do you know?

What is the state doing to assess the validity of the standards?

How were standards set?

Do you plan to review or revise the standards in the future?

Do you permit local modifications to standards? Why or why not? If so, how well are they working?

What is being done to encourage service to special populations?

Are you using incentives or adjustments? Why or why not?


Use of measures and standards

How are measures and standards supposed to be used at the state level (e.g., for monitoring, accountability, funding)?

How will they be reported, and to whom?

What use will be made of the information?

How are measures and standards supposed to be used at the local level (e.g., for student counseling, program improvement)?

How will data be reported, and to whom?

What use will be made of the information?

What is the state doing to affect the local use of this information?

Are people concerned about the possible abuse of the measures and standards (e.g., creating incentives for programs to admit only the most qualified students)?

What problems do you foresee? Why or why not?

Assistance and Support for Measures and Standards

What kinds of technical support do state offices need to implement measures and standards (e.g., specialized training)?

Have you tried to obtain outside support? From whom?

What kinds of technical assistance are needed at the local level?

What is the state doing to provide it?

What support will be provided specifically to encourage the use of measures and standards for local program improvement?

What is the state's role in local program improvement?


Effects of measures and standards

What effects have the measures and standards had so far (e.g., shifting responsibility from the state to the local level)?

on state agencies?

on local programs?

on the relationship between state agencies and local programs?

What are the advantages and disadvantages of standards from the point of view of:

your agency?

other state agencies?

local programs?

business and industry?


Appendix B

LOCAL INTERVIEW GUIDE


RAND/MPR Interview Guide

Visits to Local Districts

Notes

Respondents have been classified into categories, and sections of the interview are assigned to specific types of respondents:

Superintendent: includes state, district and college chief administrators, e.g., chief state school officer, superintendent, chancellor, president.

Principal: includes local school chief administrators, e.g., principal, assistant principal.

Vocational Director: includes district or college vocational administrator, e.g., director of vocational education, vocational coordinator.

Teacher: includes college instructors and secondary teachers.

Other: includes administrators responsible for placement, special education, evaluation, data systems, etc.

The general term "institution" is used in the interview guide. Substitute "state," "district," "college" or "school" as appropriate to the particular interview.


RAND/MPR Interview Guide

Visits to Local Districts

PERFORMANCE MEASURES AND STANDARDS [SUPERINTENDENT, PRINCIPAL]

Adjust questions for standards and measures as implemented in each state, e.g., centralized vs. decentralized.

Are you familiar with your state's performance measures and standards? (Ask by measure category)

Basic and advanced academic skills
Occupational competency
Job or work skill attainment
Retention
Placement
Other

[decentralized] Which specific measures and standards has your institution adopted?

When was the first time you were introduced to these measures and standards?
Were you involved in their development?
Has your institution added measures and standards to the state's set?

How did you decide what to adopt?
Who participated in the process?


THE NATURE AND USE OF STATEWIDE MEASURES AND STANDARDS [SUPERINTENDENT, PRINCIPAL]

For what purposes will measures and standards be used in your institution (local level)? (e.g., for student counseling, program improvement)

How will data be reported, and to whom?
How will the information be used?
Will the information be used to compare programs?

How will the data you report be used at the state level? (e.g., for monitoring, accountability, funding)

How will the data be reported, and to whom?
How will the information be used?
Will the information be used to compare programs or institutions?
How has the state communicated its intentions to you?

Are people in this institution concerned about the possible abuse of measures and standards? (e.g., creating incentives for programs to admit only the most qualified students)

Who has expressed concern? Why?
What problems do you foresee? Why or why not?


IMPORTANCE OF MEASURES AND STANDARDS VIS-A-VIS OTHER VOCATIONAL EDUCATION INITIATIVES [VOCATIONAL DIRECTOR]

Which vocational education reforms or initiatives have received the most emphasis in your institution in the past two years?

How important are measures and standards in your institution relative to

other Perkins requirements, such as Tech Prep, integration of academic and vocational education, special populations, etc.?
other state-initiated vocational efforts?
district-initiated vocational efforts?
school site-initiated vocational efforts?

What activities have occurred that justify your judgment?
Can you give specific examples?

Which of these reforms will be more lasting?
Why do you think this is so?

How do measures and standards relate to the local program application process?
How do they fit into the annual program evaluation process?
How do the federal requirements differ from preexisting state policies or procedures?

What do you do differently now than you did prior to measures and standards?
How have reporting requirements changed under Perkins I and Perkins II?

Are measures and standards supportive of or antagonistic to other vocational education reform efforts?

Which other vocational reforms?
How are they complementary or how do they conflict?


INTEGRATION OF VOCATIONAL EDUCATION ACCOUNTABILITY WITH OTHER EDUCATIONAL REFORMS [SUPERINTENDENT, PRINCIPAL]

In education more broadly, which recent reform activities have received the most attention in your institution?

How do these initiatives relate to accountability in vocational education?

Is accountability an important aspect of general education reform?

What is being done to promote accountability?
How have vocational education measures and standards affected or been affected by this process?
Why or why not?

Would the institution have taken actions to create or improve an accountability system without the federal requirement?

What evidence is there that the institution was moving in this direction on its own?
What changes in assessment would have occurred?
What changes in data collection and use would have occurred?

Is reform of assessment an important item in general education? In vocational education?

What assessment reforms are occurring?
How do they relate to vocational education measures?
Why or why not?

Is the institution trying to develop or improve its student data system for general education?

What efforts are under way?
How do these efforts relate to vocational education data collection? Why or why not?
How are special populations identified?
How are tech prep students identified?
How do you define a "vocational student"?

Are there new programs to provide services to handicapped students, disadvantaged students or students with limited English proficiency?

What is the nature of these programs?
How do they relate to the requirements for service to special populations in vocational education?
Why or why not?


Do you think vocational education accountability will play a greater or lesser role in general education reform in the future?

Why?


IMPLEMENTATION OF MEASURES AND STANDARDS [SUPERINTENDENT, VOCATIONAL DIRECTOR, PRINCIPAL]

Do the measures and standards apply to individual programs or to vocational education as a whole?

Which local programs will be evaluated based on measures and standards?

Did you choose to concentrate Perkins funds on a limited number of sites or on a limited number of vocational programs?
Does the accountability system only apply to the projects, services, and activities assisted by Perkins funds? Or will measures and standards be applied more broadly to other programs?
Who is going to be required to report data?

What steps have been taken to implement performance measures and standards from the time they were adopted until the present?

Which agencies or personnel have been involved?
What actions were taken?
Has your district added measures and standards to those adopted by the state?

Who will be responsible for implementing the performance measures and standards?

What responsibilities does this include?
Who else will be involved in the process?
What level of resources is being devoted to implementation?

How much discretion do you have in how you implement the measures and standards?

What changes have you made, or will you make?
What supplemental requirements have you imposed?

What are the major problems that have been encountered in implementing measures and standards?

What local conditions have been a barrier to implementation?
What has changed in the area of assessment and data collection?
How reasonable is the timeline required by the state?


What is being done to encourage service to special populations?

Does the system include incentives or adjustments to encourage service?
How do your programs monitor and insure access, retention, and completion of special populations?

How are performance measures and standards linked with the annual program evaluation?

What is the connection between performance measures and standards and the annual evaluation?

How are performance measures and standards involved in the program application process?

What is the connection between performance measures and standards and the yearly application process for Perkins funds?
What other new Perkins data requirements have come from the state?
What new special population data requirements are there?

Do performance measures and standards play a role in other Perkins requirements?

What are you doing to identify barriers to access or success and to evaluate the progress of special populations?
What are you doing to monitor teaching "all aspects of the industry"?

Under what circumstances will you be required to write a local improvement plan?

What directions has the state given you for developing a local improvement plan?
Who is responsible for determining improvement?
What is meant in this state by "substantial progress" toward achieving standards?
Who decided on the definition of this term?


Effects of Measures and Standards [Superintendent, Principal, Vocational Director]

What changes have occurred in program evaluation procedures as a result of measures and standards?

in staff evaluation procedures?

in student evaluation procedures?

How are programs/staff/students evaluated?
How has this process changed in the past two years?
What caused these changes? (Perkins or other factors)
How will it change this year or next?

[collect examples of relevant documents]

What changes have occurred in data collection or data reporting?

What changes are anticipated in the future?

Which data do teachers and administrators use to judge the quality of programs?
Have there been changes in the available data during the past two years?
If so, which data are no longer used, which are newly available?
Will there be changes this year or next?
[collect examples of relevant reports]

What changes have occurred in staff responsibilities as a result of measures and standards?

What changes are anticipated in the future?

How many staff are involved with measures and standards?
What do they do?

What additional roles or responsibilities will evolve in the future?

Have measures and standards created any additional burdens for administrators or staff?
Any other problems?

What other demands do measures and standards make?
What other problems have occurred?

Do you have a Technical Advisory Group or something similar?
How have measures and standards affected the role of the Technical Advisory Group?

How often does it meet?
Who participates?
Have there been any changes in the nature or function of the group? Why?


How have attitudes toward measures and standards changed as the system has been implemented?

What were your first reactions? Other people's reactions?
How have those changed?
Can you give examples?

Have you had to write any program improvement plans?

What criteria did you use to identify sites requiring program improvement plans?
What actions did the plans include?
[collect examples of relevant documents]

What effects have the measures and standards had so far (e.g., shifting responsibility from the state to the local level) on:
state agencies?
local programs?
the relationship between state agencies and local programs?

What are the advantages and disadvantages of measures and standards from the point of view of:

district?
local programs?
business and industry?

What do you anticipate the effects will be on program planning and instruction?

What kinds of changes might you make based on the data?
What kind of training or support would you need to use the data effectively?
[collect examples of relevant documents]

What effects will performance measures and standards have on special populations?

What steps have been taken to monitor access, progress, or success of special populations in quality vocational programs?
Are you applying the same standards to all programs, or separate standards for special populations?
Have the special populations served under Perkins changed over the past two years?


TECHNICAL ASSISTANCE AND SUPPORT FOR MEASURES AND STANDARDS [VOCATIONAL DIRECTOR]

What technical assistance has been provided by the state? by the district?

What kinds of technical support do districts/schools/institutions need to implement measures and standards? (e.g., specialized training)

Have you tried to obtain outside support? From whom?
What is the state doing to provide it?
How is the state receiving the information on what technical assistance/support the local level needs?

What are the most effective vehicles for technical assistance at the local level? (e.g., guidebooks, workshops, conferences, electronic mail, etc.)

Has the state held workshops? the district?
Have you attended AVA or other national conferences?
Do you receive information from NCRVE?

What support will be provided specifically to encourage the use of measures and standards for local program improvement?

What is the state's role in local program improvement? the district's role? the school's role? the teacher's role?


"GRASS-ROOTS" ISSUES [TEACHER]

How do you evaluate your program?

What kinds of data are collected?
How are the data used to judge the quality of programs and instructional strategies?
Is this a burden on you or your program?

Do you have a Technical Advisory Group or something similar?

What are you doing this year that is different than two years ago?
Does it play a role in program evaluation?
If so, how often does it meet and who participates?
How long has the group been around?
Have there been any changes in the nature or function of the group? Why?

Are you familiar with your state's performance measures and standards? (Ask by measure category)

Basic and advanced academic skills
Occupational competency
Job or work skill attainment
Retention
Placement
Other

When was the first time you were introduced to these measures and standards?
Were you involved in their development?

How have performance measures and standards affected you or your program?
What do you anticipate the effects will be on program planning and instruction in the future?

Have they changed the way you evaluate the program? If so, how?
Have they changed the data you collect, report or use? If so, how?
Have they changed staff responsibilities? If so, in what ways?
Have they created additional burdens or other problems? If so, please describe.


How useful will performance measures and standards be for measuring the effectiveness of your program?

Will measures and standards help you improve your program? In what way?
Will measures and standards be used to compare your program to other programs?
If yes, how do you feel about that?

What assistance do you think you need in order to make the data useful?

What assistance have you received? From whom?
Has the state held workshops? the district?
What kind of technical assistance is most helpful to you? (e.g., guidebooks, workshops, conferences, electronic mail, etc.)
Have you attended AVA or other national conferences?
Do you receive information from NCRVE?

What does your program do to encourage service to special populations?

How do you monitor and ensure access, retention, and completion of special populations?
Have the types of special populations you serve changed over the past two years?


REFERENCES

Berman, P., and M. W. McLaughlin, Federal Programs Supporting Educational Change: Vol. VIII, Implementing and Sustaining Innovations, Santa Monica, Calif.: RAND, R-1589/9-HEW, 1978.

Hoachlander, E. G., K. Levesque, and M. L. Rahn, Accountability for Vocational Education: A Practitioner's Guide, Berkeley, Calif.: National Center for Research in Vocational Education, MDS-407, 1992.

Rahn, M. L., and M. Alt, "Performance Standards and Measures," Chapter 2 in National Assessment of Vocational Education Final Report to Congress, Vol. III, Program Improvement: Education Reform, Washington, D.C.: U.S. Department of Education, Office of Educational Research and Improvement, Office of Research, July 1994.

Rahn, M. L., E. G. Hoachlander, and K. Levesque, State Systems for Accountability in Vocational Education, Berkeley, Calif.: National Center for Research in Vocational Education, MDS-491, 1992.

Stecher, B. M., and L. M. Hanser, Beyond Vocational Education Standards and Measures: Strengthening Local Accountability Systems for Program Improvement, Berkeley, Calif.: National Center for Research in Vocational Education, MDS-292, 1993. (Also available from RAND as R-4282-NCRVE/UCB, 1993.)

U.S. Congress, Carl D. Perkins Vocational and Applied Technology Education Act Amendment of 1990, P.L. 101-392, September 25, 1990.


U.S. Congress, Student Right-to-Know and Campus Security Act, P.L. 101-542, November 8, 1990.
