
CREATING ALIGNMENT BETWEEN THE CENTER AND AGENCIES

Mark Bussow, Office of Management and Budget, Executive Office of the President

• The Center: White House offices (POTUS/VP, OMB, Policy Councils), General Services Administration, Office of Personnel Management

• Agencies: Focus primarily on 24 major Federal agencies, which are diverse on multiple levels:
  • Mission
  • Size (budget, personnel, geographic reach, management capacity)
  • Organizational complexity (e.g., mission diversity, subcomponent autonomy, distinctive cultural norms)
  • Types of activities (direct services, research, regulatory, grants, contracts, etc.)
  • Legal autonomy
  • Political context
  • Management process maturity


Context: the Federal Performance Framework

• Quarterly Performance Reporting on Central Site (Performance.gov)

Goals, timing, performance reviews, and current number:
1. Federal Cross-Agency Priority (CAP) Goals: set every 4 years (next in Feb 2018); quarterly reviews by OMB Director/PIC; currently 16
2. Agency Priority Goals (APGs): set every 2 years (next in Feb 2018); quarterly reviews by agency COO/PIO; currently 99
3. Strategic Goals and Objectives: set every 4 years (next in Feb 2018); annual strategic reviews/risk management by agencies and OMB; currently 380


Context: Overview of OMB and Agency Reviews

Comprehensive assessment of mission and management performance, and key risks, aligned with budget formulation:

• POTUS engagement: WHCOS-led high-level review of priorities and major risks
• President’s Budget and agency strategic reviews, informed by risk profiles
• OMB and agency data-driven reviews: FedStat, HR Stat, TechStat, AcqStat, PortfolioStat, benchmarking
• Priorities we are tracking: CAP Goals, Agency Priority Goals, GAO High Risk List, High Impact IT List, OMB Work Plan
• Individual performance: SES performance management and awards, GS performance management and awards, Alternative Pay Plan performance management/awards, Employee Engagement Reviews

Some factors to consider

Centralizing factors:
• Leadership / Administration priority
• Need for consistency
• Cross-cutting issues
• Confidence in theory of change/design
• Need for accountability

Decentralizing factors:
• Need for agency ownership
• Accommodating diversity
• Promoting innovation
• Expertise needed
• Political risk
• Need for capacity

When deciding:
• Each policy and process is different
• Getting the right balance is key (sometimes it is zero-sum)
• Your choices will impact relationships


Determining your policy structure

A 2x2 matrix, with central ownership increasing from left to right and agency ownership increasing from bottom to top:
• Support (higher agency ownership, lower central ownership). Example: Strategic Reviews/ERM
• Negotiation (higher agency ownership, higher central ownership). Example: APGs
• Legal Compliance (lower agency ownership, lower central ownership). Example: Annual Performance Report
• Command and Control (lower agency ownership, higher central ownership). Example: CAP Goals


Determining your governance structure

Central organizational structures:
• OMB: authority, policy, and process
• Performance Improvement Council (PIC): operations and best practices

Coordination mechanisms:
• Bi-weekly policy check-ins
• PIC working groups
• OMB policy rollout meetings
• Summits and workshops
• PIO Principals meetings
• Community meetings

Agencies:
• Chief Operating Officers (COOs)
• Performance Improvement Officers (PIOs)
• Goal Leaders

APPENDIX

1. Why Agency Priority Goals?

Senior agency leaders are typically focused on policy, communications, budget, and legislation, not on delivery of results. To achieve the Administration’s goals, we need leaders to focus on driving results through improved implementation.

Agency Priority Goal Overview

Process: Engage Agency Heads → Identify Goal Leaders → Action Plans → Quarterly Updates → Data-Driven Performance Reviews → Public Updates on Website

• Engage Agency Heads: 3-8 goals set by agency heads; ambitious, meaningful, and measurable; within current budget/legislation
• Identify Goal Leaders: Senior Goal Leader and Goal Lieutenant
• Action Plans: identify problems, strategy, measures, milestones, contributing programs, and management review processes
• Quarterly Updates: quarterly targets and quarterly milestones
• Data-Driven Performance Reviews: agency reviews; OMB reviews based on quarterly data, OMB surveys, and Goal Leader surveys on likelihood of success
• Public Updates: progress on Priority Goals reported on the website

Some results

2. Why Cross-Agency Priority Goals (CAP Goals)?


Ingredients:
• Right Goal
• Senior Level Attention
• Engaged Goal Team
• Clear Focus
• Defined Delivery Chain

Outcomes:
• Faster Decisions
• Better Decisions
• Barriers Removed
• Increased Coordination
• Increased Accountability
• More Visibility

• Tool to address the longstanding challenge of tackling horizontal problems across vertical organizational silos.

• We have existing inter-agency processes for budget, legislation, and policy decisions, but lack mechanisms for coordinating implementation activities across multiple agencies

CAP Goal Overview

What are CAP Goals?
• Presidential priorities which are long-term in nature. Set one year after a new term begins. We are currently in our second round of CAP Goals, with a 4-year time horizon.

How do we implement and hold people accountable?
• Establish Co-Goal Leaders and Deputy Goal Leaders from both EOP and agencies
• Develop action plans with metrics, quarterly milestones, governance structures, and contributing programs
• Goal Leaders are responsible for reviewing progress regularly with contributing programs. OMB reviews progress quarterly, with monthly deep dives; public updates are reported each quarter on Performance.gov to increase accountability

The FY16 President’s Budget proposed additional capacity for cross-agency work.
• Realized that lack of capacity was a major risk to long-term success.
• Two proposals released in February: 1) a new $15 million fund to support cross-agency activities for CAP Goals; 2) a new White House Leadership Development Program

15 CAP Goals: Deep Dives, Dashboards, 16 WH Leadership Development Fellows, and $15M CAP Goal Fund

3. Why Strategic Reviews?

• Too often, there is not sufficient questioning of the results/impact of Federal programs.
• To ensure agencies conduct a comprehensive assessment of all outcomes, OMB worked with agencies to develop an annual “strategic review” using strategic objectives as the unit of analysis.

The Strategic Reviews will:
• Identify opportunities for budget decision making, reform proposals, executive actions, communications opportunities, etc.
• Synthesize a broad evidence and information base (indicators, evaluations, risk management, partner contributions, external factors, research, etc.) and prioritize findings for decision-making
• Make meaningful distinctions in performance, such as identifying the 10-20% of areas of noteworthy progress and the focus areas for improvement

3. Strategic Review Process & Timeline

The “strategic review” proceeds through five stages:
• Agency Methodology Developed (Winter): agencies develop a method to assess progress; OMB reviews the method
• Agency Conducts Review (Spring): agencies assess each objective; agency leaders determine proposed changes to operations or to budget and legislative proposals
• OMB Engagement (May): agencies provide OMB a “summary of findings” from their review for deliberation; OMB provides feedback and priorities for policy and budget development
• Agency Submission (Sept.): agency budget and performance submissions incorporate findings and OMB feedback
• Publication (Feb.): the Annual Performance Report includes findings and the Performance Plan proposes improvement actions; the President’s Budget reflects key proposals

3. Why Strategic Reviews? Conceptual Model

The reporting, review, and evaluation cycle connects a retrospective view and a prospective view around a decision point:
• Retrospective (backward looking): the Performance Plan (What did we want to happen?), the Performance Report (What actually happened?), and Evaluation (What would have happened without us, and why did things occur like they did?), comparing targets with actuals and accounting for data lag and policy lag
• Learning: what happened and why, lessons learned, research and improved understanding, exploration and innovation
• Improvement actions: changes to strategy and tactics, operational improvements, budget and legislative proposals
• Prospective (forward looking): planning and foresight using a baseline, projection, and target

Evidence, Evaluation, and Measurement: key questions

Backward looking (Impact and Implementation):
• Are strategies having the intended impact?
• Were outcome targets met?
• How big an impact did the program have compared to what would have happened otherwise?
• Were there unintended outcomes as a result of the strategies employed?
• Were output targets met?
• Was the program cost effective?
• Were there unanticipated challenges in program design, delivery, or implementation?
• What organizational, process, and technical factors presented challenges?
• Did we have adequate mission support?

Forward looking (Risks and Opportunities):
• Have there been any significant innovations from our partners or peers we can replicate?
• Are there any new technologies becoming available?
• Do we anticipate any changes in our level of support from key partners?
• Are there any upcoming changes to human capital or resource levels?
• Are there changes in anticipated need?
• Are there any external factors which could disrupt progress?
• Are there any external factors which contribute to progress?

Summary of Strategic Review Findings

Example: each goal is rated as Noteworthy Progress, Satisfactory Progress, or a Focus Area for Improvement (individual ratings shown by color in the original chart).

Mission 1: Prevent Terrorism and Enhance Security
• Goal 1.1: Prevent Terrorist Attacks
• Goal 1.2: Prevent and Protect Against the Unauthorized Acquisition or Use of CBRN Materials and Capabilities
• Goal 1.3: Reduce Risk to the Nation’s Critical Infrastructure, Key Leadership, and Events

Mission 2: Secure and Manage Our Borders
• Goal 2.1: Secure U.S. Air, Land, and Sea Borders and Approaches
• Goal 2.2: Safeguard and Expedite Lawful Trade and Travel
• Goal 2.3: Disrupt and Dismantle Transnational Criminal Organizations and Other Illicit Actors

Mission 3: Enforce and Administer Our Immigration Laws
• Goal 3.1: Strengthen and Effectively Administer the Immigration System
• Goal 3.2: Prevent Unlawful Immigration

Mission 4: Safeguard and Secure Cyberspace
• Goal 4.1: Strengthen the Security and Resilience of Critical Infrastructure against Cyber Attacks and Other Hazards
• Goal 4.2: Secure the Federal Civilian Government Information Technology Enterprise
• Goal 4.3: Advance Cyber Law Enforcement, Incident Response, and Reporting Capabilities
• Goal 4.4: Strengthen the Cyber Ecosystem

Mission 5: Strengthen National Preparedness and Resilience
• Goal 5.1: Enhance National Preparedness
• Goal 5.2: Mitigate Hazards and Vulnerabilities
• Goal 5.3: Ensure Effective Emergency Response
• Goal 5.4: Enable Rapid Recovery

Example summary of findings

Summary of Findings: Safety. Despite the lowest level of transportation fatalities in the past two decades, DOT is tracking some troubling trends:

• Nearly 5,000 non-vehicle occupants were killed in 2014.
  – Pedestrian and bicyclist fatalities have been on a steady increase since 2009.
  – Non-occupant fatalities have increased by 2% since 2009.
• Public transit fatalities have increased from 2009 to 2014.
  – Transit ridership has grown by 25%, but fatalities have only increased by 0.2%.
  – Transit riders are 40 times LESS likely to be involved in a fatal accident than car and truck passengers.
• DOT sets a high bar for safety: Toward Zero Deaths.
  – DOT continues to study human factors that contribute to unsafe behavior, including distraction, impairment, and system design.
  – DOT continues to promote technology and innovation to improve safety.
