Metrics & Dashboards Survey Results
With help from Marty Klubeck at Notre Dame and Brenda Osuna at USC


Page 1

Metrics & Dashboards

Survey Results
With help from Marty Klubeck at Notre Dame and Brenda Osuna at USC

Page 2


Who are we?

Brown

Carnegie Mellon

Columbia

Cornell

CU-Boulder

Duke University

Georgetown University

Harvard*

Michigan State University

New York University

Penn State

Princeton

Stanford University

UC San Diego

UCSF

University of Chicago*

University of Iowa

University of Michigan

University of Minnesota

University of Notre Dame

University of Washington

University of Wisconsin

Virginia Tech

Page 3


How often do we collect the following types of metrics around service health (effectiveness)?

Weekly/Daily/Continuously: Demographics 45%, Usage/demand 48%, Performance 59%, Customer Satisfaction 26%
Monthly or by Semester: Demographics 34%, Usage/demand 31%, Performance 25%, Customer Satisfaction 11%
Annually: Demographics 10%, Usage/demand 7%, Performance 6%, Customer Satisfaction 30%
Less frequently than annual or not at all: Demographics 10%, Usage/demand 14%, Performance 9%, Customer Satisfaction 33%

Page 4

For what services do we collect metrics?

Good news is that no one said zero!

N=16


Most of our services: 33%
A few services: 29%
Only key services: 29%
All of our services: 10%
No metrics collected: 0%

Page 5

And, our metrics to measure business efficiency and delivery of goals?

OTHER:
1) It widely varies depending on the service
2) We do not collect any business efficiency metrics
3) Project delivery; # of calls abandoned; # change requests; # e-mails; # abandoned calls; resolution time; cycle time; abandonment, etc.; capacity; mean time to repair

[Bar chart: Time (speed of delivery), Cost, Quality (defect/error rates), Other]

Page 6

Our use of targets

Expectations based on our Service-Level Agreements: 68%
Targets based on historical/trend data: 63%
Expectations based on our customers’ requests/needs: 53%
Targets based on our peers’ performance: 32%
Other (please specify): 21%
We don’t use any demarcation of “health”: 16%

OTHER:
1) Working on the use of ITIL (Information Technology Infrastructure Library)
2) Note: we don't do this consistently, though
3) We do not use any service target range metrics
4) Industry Practices / Standards

Page 7

Metrics are collected and analyzed primarily as…


OTHER: System performance metrics in transition to organizational effort.

Grass roots effort: 67%
Organizational/departmental effort: 43%
Institutional effort: 10%
Other (please specify): 5%

Page 8

Who is the audience for our metrics?

OTHER: Post publicly

Internal IT staff: 95%
IT management: 95%
University executive leadership: 67%
Your user community: 43%
Peers in other institutions (for benchmarking): 38%
Other leadership: 29%

Page 9

How do we share them?


Published for the organization (Intranet): 67%
Other (please specify): 53%
Published publicly (web with open access): 40%
Directly to customers (electronic or hardcopy reports): 33%
Published for current and potential customers (web with controlled access): 27%

Page 10

Benefits so far


OTHER: 1) Right-sizing the organization; metrics enable us to tune documentation and training and better prepare support providers

Made process/project ...: 81%
Communicate better wi...: 71%
Communicate better wi...: 71%
Improvements: 71%
Insights to the causes ...: 62%
Early-warning-system, ...: 43%
Other: 10%

Page 11


How do we rate the maturity of our organization’s use of metrics?

Fully Mature: 0%
Maturing: 0%
Managed: 10%
Developing: 52%
Novice: 33%
Totally novice: 5%

Page 12


Our use of external data sources

OTHER:
1) Gartner for Benchmarking
2) Used to participate in the campus computing survey
3) Gartner

Educause Core Data: provide data to 84%, compare data to 68%, use for defining metrics 16%, don't use 11%
IPEDS: provide data to 72%, compare data to 56%, use for defining metrics 28%, don't use 17%
COFHE: provide data to 23%, compare data to 8%, use for defining metrics 8%, don't use 77%

Page 13

Any BI action?

OTHER: Currently considering an environment, platform selection pending


Have considered: 45%
Have not considered: 30%
In the process of developing: 15%
Have a functioning BI environment: 10%
Other: 5%

Page 14

Our biggest challenges

OTHER:
1) Continuing engagement from mid-level leadership to respond to metrics findings
2) Organizations' ability to identify specific KPIs to measure specific objectives
3) Changing leadership/definition of what is necessary and relevant; metrics must mean something to be used effectively; lack of a plan; staff resent

Lack of dedicated ...: 81%
Lack of automated ...: 67%
Lack of expertise ...: 43%
Lack of expertise i...: 33%
Lack of consisten...: 33%
Other (please spec...: 14%
Lack of support fr...: 5%

Page 15

What would we find useful?

OTHER:
1) None of the above
2) Unified approach to metrics from an organizational perspective; lack of a plan; dedicated resources would be better. No one is going to use another template, and different services would be measured by different metrics unless the metrics were provided at a very, very, very high level


Standard definitions: 90%
Guidelines on developing/reporting: 68%
A template for basic metrics: 63%
Review by peers of possible tools: 58%
On-call expertise: 37%
Other (please specify): 11%

Page 16


Tools – what have we used, what do we think?

Used: Excel 95%, Cognos 44%, SPSS 41%, SAS 38%, Other 18%, BSC 13%, Power Pivot 13%, SigmaXL 7%, Tableau 7%, iDashboard 0%, Vision 0%
Rating scale: Inadequate / Fair / Good / Excellent / Outstanding

“Believe that the process and commitment to consistent data collection is far more important than the tool”

Page 17


Lessons learned

• Metrics have helped to highlight areas of significant service difficulties (e.g., with BlackBerry services) and to note some low-level problem points (e.g., around some of our network measures). At the same time, our current metrics processes are highly manual and require significant time investment to collect and report. We have seen challenges in getting service management engaged on the data writ large, which can lead to problems when errors due to service changes are missed, impacting trending. Goals for the coming year include focusing on trend analysis/reporting through executive summarization (done), gaining more mileage out of system-generated metrics on availability and low-level alarms, improving automated collection of non-availability data, and aggregating human-generated, automated, and other data into a dashboard to reduce the effort required to visualize service data.

• Benchmarking is very challenging because of the variance in environments at each institution: cost components may be different, service features and SLAs may not match, accounting practices can be problematic, labor is tracked differently, etc.

Page 18


Lessons learned

• We had a nascent metrics program under development with dedicated resources, focusing on helping service managers develop metrics with their local data. With the departure of that resource in October, we are choosing to re-prioritize the work away from dedicated attention to metrics at this time. Instead, we watch with great interest the aggressive agenda that EDUCAUSE has developed with the reinvigoration of ECAR under Susan Grajek. We will continue to monitor the progress of the various EDUCAUSE initiatives around research, data, and analytics and pursue collaboration opportunities based on our own priorities and resources.

• We made quite a push to get a metrics dashboard going a couple of years ago, which was quite successful. However, the backend work of building a metrics data repository was never completed. This has kept us from answering deeper analytics questions and still often requires manual queries. On the other hand, when we recently needed to pull together a metrics dashboard for a large client (a hospital), we were able to reuse much of the work we had done previously.

Page 19


Lessons learned

• We collect a lot of operational performance data using traditional tools (Cricket, Nagios, home-grown scripts) but don't have a reasonable dashboard or approach to making the data useful. We have recently started measuring the performance of our service desk and the groups behind it to track delivery against SLAs in our service catalog. We've started a Service-Now implementation and expect to use metrics delivered by that tool.

• Challenge getting consistent operational definitions, both for internal use and for benchmarking. Data collection is still a time-consuming, manual process that we are working to automate by collecting metrics from disparate systems into a BI environment. We are exploring the use of Microsoft BI tools (e.g., PowerPivot, SQL Server 2012, PowerView).
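As a rough illustration of the consolidation several respondents describe (pulling metrics from disparate systems into one repository that a dashboard or BI tool can query), here is a minimal Python sketch. The source names, sample values, and SQLite layout are hypothetical stand-ins, not any institution's actual implementation.

# Minimal sketch: consolidate metrics from disparate systems into one
# repository so a dashboard or BI tool can query a single place instead of
# relying on manual pulls. All names and sample values are hypothetical.
import sqlite3

# Stand-ins for data exported from a monitoring tool (availability) and a
# ticket system (service desk metrics); in practice these would come from
# each system's export or API.
availability_rows = [
    ("2012-04", "email", 99.95),
    ("2012-04", "network", 99.80),
]
ticket_rows = [
    ("2012-04", "service_desk", "avg_resolution_hours", 6.4),
    ("2012-04", "service_desk", "tickets_closed", 1250),
]

conn = sqlite3.connect(":memory:")  # in-memory stand-in for a metrics repository
conn.execute("""CREATE TABLE IF NOT EXISTS metrics (
    period TEXT, service TEXT, name TEXT, value REAL)""")

# Normalize both sources into one (period, service, metric name, value) shape.
conn.executemany(
    "INSERT INTO metrics VALUES (?, ?, 'availability_pct', ?)", availability_rows)
conn.executemany("INSERT INTO metrics VALUES (?, ?, ?, ?)", ticket_rows)
conn.commit()

# A dashboard or BI front end would issue queries like this against the store.
for row in conn.execute(
        "SELECT period, service, name, value FROM metrics ORDER BY service, name"):
    print(row)
conn.close()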

Page 20


THANK YOU