Donald P. Moynihan
THE GOOD, THE BAD AND THE UGLY: LESSONS FROM THE US PERFORMANCE
MANAGEMENT SYSTEM
OECD PARIS, NOVEMBER 24, 2014
I will examine how performance reforms have changed the use of performance data for management
Mark will focus on the budgeting side and on aspects of the US system I will not cover, using a center-of-government perspective
OVERVIEW
BASIC PROBLEM
We define performance budgeting by our aspirations for it
OECD definition: Performance budgeting is the use of performance information to link funding with results with the purpose of increasing efficiency, effectiveness, transparency and accountability
A realistic definition of performance systems: a set of formal rules that seek to disrupt strongly embedded social routines
Identify the empirical effects of these rules: how is data used? Who are the most likely users?
Acknowledge and guard against perverse use of data
Draw lessons to make the use of performance data more likely
BEING REALISTIC
NATIONAL GOVERNMENT-WIDE CHANGES
Government Performance and Results Act (GPRA) (1993-2010)
Program Assessment Rating Tool (PART) (2002-2008)
GPRA Modernization Act (2010-)
State level variations on these models
Policy-specific reporting requirements, e.g. No Child Left Behind Act
Well-regarded former general
Lauded for successful efforts to reduce homelessness among veterans
ERIC SHINSEKI, FORMER SECRETARY OF VETERANS AFFAIRS
VHA has a goal of admitting patients within 14 days of their preferred date, tied to performance evaluations and pay
Schedulers pressured not to exceed the goal:
Recorded the preferred date as the first date a doctor was available, not the patient's actual preference
Cancelled appointments, then rescheduled them for the same time close to the appointment date
Kept people off the official wait list
In Phoenix:
Official: 24-day average wait, with 43% of patients seen within the 14-day window
Reality: 115-day average wait, with 84% of patients not seen within that window
An additional 1,700 veterans were not even on the official waiting list.
PROBLEMS IN VETERANS HEALTH ADMINISTRATION
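The scheduling trick above can be sketched in a few lines. This is an illustrative reconstruction, not VA data: the dates are invented (only the 115-day gap echoes the slide), and it shows how recording the first available slot as the "preferred date" drives the measured wait to zero while the real wait is unchanged:

```python
from datetime import date

# Hypothetical case (invented dates): a veteran requests an appointment
# and the first available slot is 115 days later.
request_date = date(2014, 1, 2)      # patient's actual preferred date
appointment = date(2014, 4, 27)      # first slot a doctor is available

# Honest metric: wait measured from the patient's actual preference.
honest_wait = (appointment - request_date).days

# Gamed metric: the scheduler records the first available slot as the
# "preferred date", so the measured wait collapses to zero.
recorded_preferred = appointment
gamed_wait = (appointment - recorded_preferred).days

print(honest_wait)  # 115
print(gamed_wait)   # 0
```

The 14-day target is never violated on paper, because the metric's reference point, not the care, has been changed.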
No claim that he knew about or encouraged the problems
First high-profile casualty of performance metrics at the federal level
SHINSEKI RESIGNS
WHY DOES PERFORMANCE PERVERSITY OCCUR?
Mixture of rational perversity and rationalized perversity
Rational response to rewards: intrinsic motivations have been "crowded out" by extrinsic incentives
Becomes a cultural norm: "Two to three times a month, you would hear something about it (zeroing out)… It wasn't a secret at all."
Street-level bureaucrats may rationalize perversity:
It harms no one: "It didn't affect the veteran's care."
The system is unjust, justifying cheating
They do not have the power to resist: cheating signals lack of power, not agency
Can no longer make the case that these are “unanticipated” consequences – empirical regularity
Campbell's law: “The more any quantitative social
indicator (or even some qualitative indicator) is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor”
SOME LESSONS
Stories of cheating draw more media attention than stories of success, or even of failure, for the agency or for the performance system more generally
Very difficult for researchers to identify perverse use of data
Needed: detailed qualitative investigations, multiple measures, statistical tools, and a sense of skepticism
SOME LESSONS
Did PART and GPRA change how employees used performance data?
EFFECT OF GOVERNMENTWIDE PERFORMANCE REFORMS
Analytical approach: reforms create new routines – exposure to these routines is a proxy for effect of reforms
Assumptions: Routines structure organizational life Behavior is shaped by routines you engage in Performance information use is a social process
About half of employees involved in GPRA, one-third in PART
REFORMS AS ROUTINES
Using federal surveys from 2000 and 2007: same basic patterns for PART and GPRA
Controlling for other factors, involvement in PART/GPRA (e.g. setting goals and measures) is:
Associated with passive use of performance data (using data to modify the strategic goals and measures that are required)
But not with purposeful use: using data to manage employees, or to identify and solve problems
RESULTS
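The pattern on this slide can be illustrated with a small synthetic simulation. All numbers here are invented assumptions chosen only to mimic the stated finding (roughly half involved in GPRA; involvement shifts passive use but leaves purposeful use flat) — this is not the survey data or the actual analysis:

```python
import random

random.seed(0)

def simulate(n=1000):
    """Synthetic respondents: (involved, passive_use, purposeful_use)."""
    rows = []
    for _ in range(n):
        involved = random.random() < 0.5          # ~half involved (assumed)
        # Assumed effect: involvement raises passive use (0.4 -> 0.6)...
        passive = random.random() < (0.6 if involved else 0.4)
        # ...but purposeful use is unrelated to involvement.
        purposeful = random.random() < 0.3
        rows.append((involved, passive, purposeful))
    return rows

def rate(rows, col, involved):
    """Share of respondents with the given involvement flag reporting use."""
    sub = [r for r in rows if r[0] == involved]
    return sum(r[col] for r in sub) / len(sub)

rows = simulate()
passive_gap = rate(rows, 1, True) - rate(rows, 1, False)
purposeful_gap = rate(rows, 2, True) - rate(rows, 2, False)
print(round(passive_gap, 2), round(purposeful_gap, 2))
```

In this toy setup the involved/not-involved gap shows up only for passive use, which is the shape of the association the slide describes (the real analysis controls for other factors; a simple difference in rates does not).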
LESSON: MOVE FROM COMPLIANCE TO USE
Governments built performance systems on routines of measuring and disseminating data
The simple supply of performance data does not guide people in how to use it, or create demand to use it
Goal prioritization – selecting or being subject to high-priority goals
Goal coordination – selecting or being subject to cross-agency goals
Data-driven reviews – participation in quarterly reviews
NEW ROUTINES IN THE MODERNIZATION ACT
Does being exposed to new routines predict use of data?
Does the quality of quarterly reviews predict use of data?
QUESTIONS
Different types of performance information use:
Performance measurement (passive)
Problem solving
PI use for program management
Employee management
RESULTS
In contrast to PART/GPRA, the routines created by the Modernization Act are associated with purposeful use of data
Having data-driven reviews matters, but even among those involved in reviews, the perceived quality of the reviews matters
Dominant approach to performance management: rely on extrinsic incentives and standards, and assume these will change behavior
Reforms as routines: foster performance information use by changing organizational routines
Assume norms will become gradually embedded via the experience of routines
LESSON
Welcome your feedback and questions
Google: Performance Information Project
@donmoyn
CONCLUSION
Involvement in cross-agency goals (after a listing of existing cross-agency goals): To what extent, if at all, do you agree with the following statements as they relate to one or more of the cross-agency priority goals listed above?
I have been involved in creating the cross-agency goals; The program(s)/operation(s)/project(s) I have been involved in contribute to the achievement of one or more cross-agency priority goals; I have collaborated outside of my program(s)/operation(s)/project(s) to help achieve the cross-agency priority goals.
56% involved
KEY MEASURES
(After listing of agency priority goals) To what extent, if at all, do you agree with the following statements as they relate to [agency name] priority goals?
I have been involved in creating my agency's priority goals; The program(s)/operation(s)/ project(s) I am involved with contribute to the achievement of one or more of my agency's priority goals; I have collaborated outside of my program(s)/operation(s)/project(s) to help achieve one or more of my agency's priority goals.
72% involved
INVOLVEMENT IN HIGH-PRIORITY GOALS
To what extent, if at all, do you agree with the following statements as they relate to [agency name] quarterly performance reviews? Overall, the program(s)/ operation(s)/project(s) that I am involved with has been the subject of these reviews.
24% involved
INVOLVEMENT IN DATA DRIVEN REVIEWS
To what extent, if at all, do you agree with the following statements as they relate to [agency name] quarterly performance reviews?
These reviews are held on a regular, routine basis; These reviews focus on goals and objectives that are aligned with my agency's strategic and performance plans; Agency leadership actively participates in these reviews; These reviews include staff with relevant knowledge needed to facilitate problem solving and identify improvement opportunities; My agency has the performance information needed for these reviews; My agency has the capacity to analyze the information needed for these reviews; Performance information for these reviews is communicated in an easy-to-understand, useful format; My agency has a process in place for following up on problems or opportunities identified through these reviews; Program managers/supervisors at my level are recognized for meeting performance goals discussed at these reviews; Discussion at these reviews provides a forum for honest, constructive feedback; The reviews have led to similar meetings at lower levels
QUALITY OF DATA DRIVEN REVIEWS
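One plausible way to turn the eleven review-quality items above into a single "quality of reviews" scale, as survey analyses commonly do, is to average Likert scores per respondent. The 1-5 coding below is an assumption for illustration, not the survey's actual scoring:

```python
# Assumed Likert coding (hypothetical, not the survey's documented scale).
LIKERT = {"strongly disagree": 1, "disagree": 2, "neither": 3,
          "agree": 4, "strongly agree": 5}

def quality_score(responses):
    """Mean of the item scores for one respondent (1 = lowest, 5 = highest)."""
    return sum(LIKERT[r] for r in responses) / len(responses)

# One hypothetical respondent's answers to the eleven items, in order.
respondent = ["agree", "agree", "neither", "agree", "strongly agree",
              "agree", "disagree", "agree", "agree", "neither", "agree"]
print(round(quality_score(respondent), 2))  # 3.73
```

A scale like this is what lets the analysis distinguish merely having reviews from having high-quality reviews: respondents can then be compared, or split into bands, by their score rather than by a single yes/no item.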