
CIO Metric Design Diagnostic


Scorecard Design Diagnostic

Guidance: Consult this diagnostic when selecting individual metrics and designing your measurement scorecards. This packet includes:

1.) Metric Selection Decision Rules: An interactive decision tree and checklist that aid in the selection of individual metrics. The decision tree pressure-tests each metric for relevance within a performance scorecard.

2.) Scorecard Diagnostic: A checklist that assesses the current effectiveness of your performance measurement scorecards.

3.) The Performance Measurement Lifecycle: A description of the key steps in infrastructure scorecard development and the stakeholders involved in each step.


1. Metric Selection Decision Rules

Has the audience for the scorecard been identified?

- No: Begin scorecard design from the audience perspective.
- Yes: Continue to the next question.

Is this a problem of relevance to the intended audience?

- Yes: Include the metric.
- No: Determine if the metric will provide supporting data for another metric or historical data for future analysis. If yes, measure it, but don't include it in the scorecard. If not, determine if the problem is relevant to another audience; if yes, include the metric in a different scorecard. Otherwise, discard the metric.
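The decision rules above can be sketched as a small function. This is an illustrative sketch only; the function name and the disposition labels it returns are assumptions, not part of the diagnostic.

```python
def metric_disposition(audience_identified: bool,
                       relevant_to_audience: bool,
                       supports_other_metric: bool = False,
                       relevant_to_other_audience: bool = False) -> str:
    """Apply the metric selection decision rules to one candidate metric."""
    if not audience_identified:
        # The scorecard's audience must be known before any metric can be judged.
        return "identify audience first"
    if relevant_to_audience:
        return "include in scorecard"
    if supports_other_metric:
        # Supporting or historical data: measure it, but keep it off the scorecard.
        return "measure but do not report"
    if relevant_to_other_audience:
        return "include in another scorecard"
    return "discard"
```

For example, a metric that only feeds another metric's calculation would come back as "measure but do not report".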


Metric Selection and Attributes

For each metric selected using the decision rules above, determine the following attributes (examples in parentheses):

- Unit of Measure ($, %, score, number of people)
- Quantitative Target (increase 20%, decrease headcount 10%)
- Thresholds: when should it be considered a significant problem? (≤5% green, 6-7% yellow, >7% red)
- Frequency of Measurement (monthly, quarterly, semi-annually)
- Frequency of Reporting (monthly, quarterly, semi-annually)
- Expected "Life" of Metric: when will this metric be reviewed for relevance/target adjustments? (one year, six months, permanent)
- Formula for Calculating Metric, including critical assumptions, sub-metrics, etc. ((value of 1 day) x (days of inventory removed) x 15%)
- Owner and Provider: responsible for acting on the metric and for collecting its data, respectively (Josh Levy, David Bernstein, Ira Abramson)
- Data Sources (Finance Department, CRM system)
- Estimated Cost of Measurement, in personnel hours per year, system cost per year, etc. (40 hours per year)
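One way to capture these attributes is as a small record type, with the green/yellow/red thresholds evaluated by a helper. This is a minimal sketch assuming a "lower is better" metric; the class, its field names, and the threshold logic are illustrative, not prescribed by the diagnostic.

```python
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    name: str
    unit: str                 # $, %, score, number of people
    target: str               # e.g., "decrease 10%"
    green_max: float          # at or below this value: green
    red_min: float            # above this value: red; in between: yellow
    measure_frequency: str    # monthly, quarterly, semi-annually
    report_frequency: str
    review_after: str         # expected "life" of the metric
    owner: str                # responsible for acting on the metric
    provider: str             # responsible for collecting the data
    data_source: str
    annual_cost_hours: float  # estimated cost of measurement

    def status(self, value: float) -> str:
        """Map a measured value onto the metric's performance thresholds."""
        if value <= self.green_max:
            return "green"
        if value > self.red_min:
            return "red"
        return "yellow"
```

With thresholds of 5 and 7, a hypothetical defect-rate metric would report 4% as green, 6.5% as yellow, and 8% as red.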

Use this checklist to decide the final outcome of each individual metric:

- Selected for the given scorecard
- Potential inclusion in another scorecard
- Will be measured but not reported on (i.e., data source for another metric)
- Discarded


2. Scorecard Design Diagnostic

Tailored to Audience (Meets Requirements: Yes/No)

a. Do you have customized reports for key audiences (e.g., business, IT)?
b. Are metrics included in the report screened for relevance to the intended audience?
c. Are reporting frequencies decided based on the audience's decision-making needs?
d. Does the level of detail provided enable actionability for the audience's context and priorities?
e. Is the reporting language adapted to the audience for ease of consumption?

Clear Definitions of Metrics (Meets Requirements: Yes/No)

a. Is there a standard, enterprise-wide definition for every metric?
b. Was audience feedback sought while defining the metrics to verify relevance to their decisions?
c. Does each metric have a target?
d. Are performance thresholds defined for each metric?
e. Does each metric have an "owner" responsible for taking action when the metric falls below target?
f. Do you publish a taxonomy outlining each metric's definition, intent, and rationale?

Simplicity of Presentation (Meets Requirements: Yes/No)

a. Do you have a "one-page" scorecard with the high-level metrics?
b. Is the scorecard formatted to immediately draw attention to exceptions and key issues?
c. Do you provide easily accessible drill-downs for metrics included in the high-level scorecard?
d. Is each metric that is below target accompanied by a concise mention of action taken/status?

Assessment: Total "Yes" Answers (out of 15)

≤5: Potential redesign required
6-10: Some revision required
≥11: Effective scorecard design
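The assessment bands follow directly from the count of "Yes" answers across the fifteen questions. A minimal sketch (function name is illustrative):

```python
def scorecard_assessment(yes_count: int) -> str:
    """Map the number of 'Yes' answers (out of 15) to the diagnostic's verdict."""
    if not 0 <= yes_count <= 15:
        raise ValueError("yes_count must be between 0 and 15")
    if yes_count <= 5:
        return "Potential redesign required"
    if yes_count <= 10:
        return "Some revision required"
    return "Effective scorecard design"
```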


3. The Performance Measurement Lifecycle

Key Process Steps Involved in Creating and Adopting an Infrastructure Performance Report:

1. Strategy Mapping: The annual infrastructure strategy is derived from the IT strategy, and the scorecard audience is identified.

2. Metric Selection and Dashboard Definition: The performance measurement team creates an initial list of metrics, then refines the list by analyzing each potential metric's strengths and weaknesses.

3. Assigning Metric Ownership: Owners are assigned to each metric. Each owner is responsible for defining the metric, setting action limits, and establishing data sources.

4. Metric Definition: The performance measurement team creates standard definitions for all metrics, defines measurements and data collection processes, and outlines initiatives that must be completed to allow tracking of metrics.

5. Data Collection and Quality Assurance: The metric's provider collects individual measures. Data collection frequency varies by metric based on need, cost of collection, and level of automation.

6. Data Analysis: The performance measurement team compiles the monthly scorecard, noting trends and observations.

7. Scorecard Review: The scorecard is reviewed monthly by IT leadership and IT managers to ensure corrective actions are in place. Managers review appropriate metrics with their respective staff.
