
Donald P. Moynihan

THE GOOD, THE BAD AND THE UGLY: LESSONS FROM THE US PERFORMANCE MANAGEMENT SYSTEM

OECD PARIS, NOVEMBER 24, 2014

PART I: OVERVIEW

OVERVIEW

- I will examine how performance reforms have changed the use of performance data for management
- Mark will focus on the budgeting side, and on aspects of the US system I will not cover, using a center-of-government perspective

WHAT DO WE MEAN BY PERFORMANCE BUDGETING?

BASIC PROBLEM

- We define performance budgeting by our aspirations for it
- OECD definition: Performance budgeting is the use of performance information to link funding with results, with the purpose of increasing efficiency, effectiveness, transparency and accountability

Presenter notes: The gap between our aspirations and the observed effects of performance systems is usually large, resulting in disappointment.

BEING REALISTIC

A realistic definition of performance systems:
- A set of formal rules that seek to disrupt strongly embedded social routines

Identify:
- The empirical effects of these rules: how is data used? Who are the most likely users?
- Acknowledge and guard against perverse use of data
- Lessons to make the use of performance data more likely

Presenter notes: Not much evidence that the public or the legislature uses data. Focus on managers of public organizations – they have incentives to improve performance, and can shape the environment of use.

PERFORMANCE MANAGEMENT POLICIES AT FEDERAL LEVEL

NATIONAL GOVERNMENT-WIDE CHANGES
- Government Performance and Results Act (GPRA) (1993-2010)
- Program Assessment Rating Tool (PART) (2002-2008)
- GPRA Modernization Act (2010-)

- State-level variations on these models
- Policy-specific reporting requirements, e.g. No Child Left Behind Act

THE UGLY: PERVERSE USE OF PERFORMANCE DATA

ERIC SHINSEKI, FORMER SECRETARY OF VETERANS AFFAIRS

- Well-regarded former general
- Lauded for successful efforts to reduce homelessness among veterans

PROBLEMS IN VETERANS HEALTH ADMINISTRATION

- VHA has a goal of admitting patients within 14 days of their preferred date
- Tied to performance evaluations and pay
- Schedulers pressured not to exceed the goal:
  - Indicated the preferred date was the first one a doctor was available, not the patient's actual preference
  - Cancelled appointments, then rescheduled appointments for the same time close to the appointment
  - Did not put people on the official wait list
- In Phoenix:
  - Official: average wait time is 24 days, and 43% of patients seen within the 14-day window
  - Reality: 115-day average wait, with 84% of patients not seen within that window
  - An additional 1,700 veterans not even on the official waiting list

SHINSEKI RESIGNS

- No claim he knew about or encouraged the problems
- First high-profile casualty of performance metrics at the federal level

WHY DOES PERFORMANCE PERVERSITY OCCUR?

Mixture of rational perversity and rationalized perversity:
- Rational response to rewards
- Intrinsic motivations have been "crowded out" by extrinsic incentives
- Becomes a cultural norm: "Two to three times a month, you would hear something about it (zeroing out)… It wasn't a secret at all."
- Street-level bureaucrats may rationalize perversity:
  - Harms no one: "It didn't affect the veteran's care."
  - System is unjust, justifying cheating
  - Do not have power to resist – cheating signals lack of power, not agency

Presenter notes:

Rational response: According to a 2012 VHA guideline on how to evaluate performance, being "results driven" constituted half of the evaluation for VHA network directors. The only easily measurable factor listed under the "results driven" category was that patients not wait more than 14 days from their desired date for an appointment.

Cultural norm: At the VA, schedulers' reality was that supervisors were telling them how to do this. A few weeks later, he said, a supervisor came by to instruct him how to cook the books. "The first time I heard it was actually at my desk. They said, 'You gotta zero out the date. The wait time has to be zeroed out,'" Turner recalled in a phone interview. He said "zeroing out" was a trick to fool the VA's own accountability system, which the bosses up in Washington used to monitor how long patients waited to see the doctor. Cheating was made easier by the VA's ancient computer systems, designed decades ago. (The "zeroing out" calculation is sketched below.) Normally, for cultural variables, we look at the role of leadership. But Shinseki "tried hard to show he was open to bad news. Three times a year, in fact, Shinseki spent a solid week meeting with regional VA medical directors. That was 63 separate four-hour interviews, every year." For many clerks, the choice between the bureaucrats they knew and the secretary they didn't was obvious. "They would say, 'Change the "desired date" to the date of the appointment,'" said one employee knowledgeable about scheduling practices at a VA medical center. The employee, who spoke on the condition of anonymity for fear of retaliation, decided to go along with those requests.

Rationalization: Fighting the order to lie wasn't worth it. "You know, in the end, the veteran got the appointment that was available anyway," the employee said. "It didn't affect the veteran's care." Of course this is not true, since it prevents performance data from fruitfully informing policy decisions. Way back in 2005, federal auditors found evidence that clerks were not entering the numbers correctly. By 2010, the problem seemed to be widespread; the VA health service sent out a memo listing 17 different "work-arounds," including the one that Turner was taught in Texas. Stop it, the VA said. They didn't. By 2012, in fact, one VA official told Congress he wasn't sure how to force people to send in the real numbers. "Because of the fact that the gaming is so prevalent, as soon as something is put out, it is torn apart to look to see what the work-around is," said William Schoenhard, who was then the deputy undersecretary for health for operations and management, an upper mid-level official that VA employees call the "Dushom." "There's no feedback loop."

Altruists feel they have little choice but to engage in perverse behavior; they rationalize behavior they know is wrong. It is tempting to think of cheating as a reflection of agency – a willing conspirator cheating us – but it may be felt, at the front lines, quite differently: as a reflection of how little autonomy the individual has. Lewis: "I couldn't believe what we had been reduced to." Principal Waller: "We're helping them. They'll catch up." Sense that cheating was widespread at the elementary level and other high schools.
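As a schematic illustration only (not the VA's actual scheduling software, and with hypothetical dates chosen to mirror the 115-day Phoenix figure above), the sketch below shows why overwriting the "desired date" with the appointment date makes a long real wait register as zero in the metric:

```python
# Schematic illustration only: not the VA's actual scheduling software.
# Dates are hypothetical, chosen to mirror the 115-day Phoenix figure above.
from datetime import date

def recorded_wait_days(desired_date: date, appointment_date: date) -> int:
    """Wait time as the monitoring system computes it."""
    return (appointment_date - desired_date).days

true_desired = date(2014, 3, 1)    # when the veteran actually wanted to be seen
appointment  = date(2014, 6, 24)   # first slot actually available

# Honest entry: the metric shows the real 115-day wait.
print(recorded_wait_days(true_desired, appointment))   # 115

# "Zeroing out": the scheduler overwrites the desired date with the
# appointment date, so the same real wait is reported as 0 days.
print(recorded_wait_days(appointment, appointment))     # 0
```

The veteran's real wait is unchanged either way; only the number sent up the chain improves, which is why the gaming could persist even after auditors flagged it.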

SOME LESSONS

- Can no longer make the case that these are "unanticipated" consequences – an empirical regularity
- Campbell's law: "The more any quantitative social indicator (or even some qualitative indicator) is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor"

SOME LESSONS

- Stories of cheating draw more media attention than stories of success, or even failure, for the agency or the performance system more generally
- Very difficult for researchers to identify perverse use of data
- Needed: detailed qualitative investigations, multiple measures, statistical tools, a sense of skepticism

THE BAD: ESTIMATING THE EFFECTS OF GPRA AND PART

EFFECT OF GOVERNMENT-WIDE PERFORMANCE REFORMS

- Did PART and GPRA change how employees used performance data?

REFORMS AS ROUTINES

- Analytical approach: reforms create new routines – exposure to these routines is a proxy for the effect of the reforms
- Assumptions:
  - Routines structure organizational life
  - Behavior is shaped by the routines you engage in
  - Performance information use is a social process
- About half of employees were involved in GPRA, one third in PART

RESULTS

- Using federal surveys from 2000 and 2007: same basic patterns with PART and GPRA
- Controlling for other factors, being involved in PART/GPRA (e.g. setting goals and measures) is:
  - Associated with passive use of performance data (using data to modify the strategic goals and measures that are required)
  - But not purposeful use: using data to manage employees, identify and solve problems (see the sketch below)
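As a minimal sketch only – synthetic data and hypothetical variable names, not the federal survey data or the actual model specification – the code below illustrates the exposure-as-proxy approach: regress an index of purposeful performance-information use on involvement in reform routines, controlling for other factors.

```python
# Illustrative sketch of the analytic approach described above: regress
# purposeful performance-information use on involvement in reform routines,
# controlling for other factors. Data and variable names are synthetic and
# hypothetical, not the federal survey data used in the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000

df = pd.DataFrame({
    "gpra_involved": rng.binomial(1, 0.50, n),   # ~half involved in GPRA routines
    "part_involved": rng.binomial(1, 0.33, n),   # ~one third involved in PART
    "supervisor":    rng.binomial(1, 0.40, n),   # example control variables
    "tenure_years":  rng.integers(1, 31, n),
})

# Outcome: a survey index of purposeful use (e.g., using data to solve problems),
# simulated here with only a weak link to reform involvement.
df["purposeful_use"] = (
    0.05 * df["gpra_involved"] + 0.03 * df["part_involved"]
    + 0.20 * df["supervisor"] + rng.normal(0, 1, n)
)

model = smf.ols(
    "purposeful_use ~ gpra_involved + part_involved + supervisor + tenure_years",
    data=df,
).fit()
print(model.summary())
```

The same exposure-as-proxy setup carries over to the Modernization Act routines discussed later (priority goals, cross-agency goals, data-driven reviews), where exposure to the routines is again the key predictor.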

LESSON: MOVE FROM COMPLIANCE TO USE

- Governments built performance systems on routines of measuring and disseminating data
- The simple supply of performance data does not tell people how to use it, or create demand to use it

Presenter notes: Transaction costs: cannot observe, therefore cannot enforce.

THE GOOD(?): BETTER NEWS FROM THE MODERNIZATION ACT

NEW ROUTINES IN THE MODERNIZATION ACT

- Goal prioritization – selecting or being subject to high-priority goals
- Goal coordination – selecting or being subject to cross-agency goals
- Data-driven reviews – participation in quarterly reviews

QUESTIONS

- Does being exposed to the new routines predict use of data?
- Does the quality of quarterly reviews predict use of data?

RESULTS

Different types of performance information use:
- Performance measurement (passive)
- Problem solving
- PI use for program management
- Employee management

- In contrast to PART/GPRA, the routines created by the Modernization Act are associated with purposeful use of data
- Having data-driven reviews matters, but even among those involved in reviews, the perceived quality of the reviews matters

LESSON

- Dominant approach to performance management: rely on extrinsic incentives and standards, and assume that will change behavior
- Reforms as routines: foster performance information use by changing organizational routines
- Assume norms will become gradually embedded via experience of the routines

Presenter notes: Analects of Confucius: "If you use laws to direct the people, and punishments to control them, they will merely try to evade the punishments, and will have no sense of shame. But if by virtue you guide them, and by the rites you control them, there will be a sense of shame and right."

CONCLUSION

- Welcome your feedback and questions
- Google: Performance Information Project
- [email protected]
- @donmoyn

KEY MEASURES

Involvement in cross-agency goals: (After listing of existing cross-agency goals) To what extent, if at all, do you agree with the following statements as they relate to one or more of the cross-agency priority goals listed above?
- I have been involved in creating the cross-agency goals
- The program(s)/operation(s)/project(s) I have been involved in contribute to the achievement of one or more cross-agency priority goals
- I have collaborated outside of my program(s)/operation(s)/project(s) to help achieve the cross-agency priority goals

56% involved

INVOLVEMENT IN HIGH-PRIORITY GOALS

(After listing of agency priority goals) To what extent, if at all, do you agree with the following statements as they relate to [agency name] priority goals?
- I have been involved in creating my agency's priority goals
- The program(s)/operation(s)/project(s) I am involved with contribute to the achievement of one or more of my agency's priority goals
- I have collaborated outside of my program(s)/operation(s)/project(s) to help achieve one or more of my agency's priority goals

72% involved

INVOLVEMENT IN DATA-DRIVEN REVIEWS

To what extent, if at all, do you agree with the following statements as they relate to [agency name] quarterly performance reviews?
- Overall, the program(s)/operation(s)/project(s) that I am involved with has been the subject of these reviews

24% involved

QUALITY OF DATA-DRIVEN REVIEWS

To what extent, if at all, do you agree with the following statements as they relate to [agency name] quarterly performance reviews?
- These reviews are held on a regular, routine basis
- These reviews focus on goals and objectives that are aligned with my agency's strategic and performance plans
- Agency leadership actively participates in these reviews
- These reviews include staff with relevant knowledge needed to facilitate problem solving and identify improvement opportunities
- My agency has the performance information needed for these reviews
- My agency has the capacity to analyze the information needed for these reviews
- Performance information for these reviews is communicated in an easy-to-understand, useful format
- My agency has a process in place for following up on problems or opportunities identified through these reviews
- Program managers/supervisors at my level are recognized for meeting performance goals discussed at these reviews
- Discussion at these reviews provides a forum for honest, constructive feedback
- The reviews have led to similar meetings at lower levels