Session 206: Performance Dashboards: The University of Miami Approach



DESCRIPTION

Session 206: Performance Dashboards: The University of Miami Approach Wednesday, April 17 at 11:30 AM Measurement is important because it puts vague concepts in context; it is simply not enough to say you want to deliver quality service—you must define it before you can know if you’re succeeding. In this session, Eddie Vidal will introduce the tools and templates you’ll need to grade and rank your service desk analysts in eight different categories (based on the same program he implemented at the University of Miami). Since all of these numbers and measurements will get you nowhere without the buy-in and contribution of the analysts you are measuring, Eddie will walk attendees through the steps he followed to gain that buy-in, and provide you with a starting point for implementing analyst dashboards in your organization and improving the quality of service you provide to your customers.


Page 1: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach

Session 206: Performance Dashboards: The University of Miami Approach

Page 2: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Eddie Vidal
Manager, Medical IT Service Desk

• HDI & Fusion Track Chair
• HDI & Fusion Conference Speaker
• HDI Strategic & Member Advisory Board
• HDI Southeast Regional Director
• President of South Florida HDI Local Chapter
• Published in SupportWorld Magazine
• HDI Support Center Manager Certified
• ITIL V3 Foundation & OSA Certified

[email protected]
[email protected]
305-439-9240
@eddievidal
http://www.linkedin.com/in/eddievidal

Page 3: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Objectives

• UM Approach – Keeping it simple
• Useful information – Why it's important to use metrics
• Obtain buy-in
• Create a Professional Development Program

Page 4: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Setting Expectations

• Do we know what is expected of us?
• If you knew, would you do your job better?
• If you knew the results of your work?
  – Know your strengths
  – Work on weaknesses
• Praise, praise, praise

Page 5: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


UM Approach – Why?

• Recognize top performers
• Demonstrate that management cares
• Achieve better morale, fair treatment of each team member, and consistent day-to-day performance

Page 6: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


UM Approach – Why?

• Specify required performance levels
• Track individual and team performance
• Plan for head count
• Allocate resources
• Justify promotions and salary increases

Page 7: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


UM Approach

• Researched best practices, contacted ITSM peers, and used HDI Focus Books and the HDI Research Corner
• Went through several revisions
• Involved the team and gained its acceptance
• Obtained buy-in from management

Page 8: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Acceptance

As a team member of the IT Support Center, I have participated, provided feedback, and helped develop the measurements used for our annual review and recognition plan. I hereby acknowledge that I have read and understand the IT Support Center measurement procedures. By signing, I acknowledge and agree to the criteria by which I will be measured and understand what is expected of me.

_________________   ____________________
Employee Signature   Print Name

_________________   ____________________
Authorized Signature Print Name

_________________   ____________________
Date                 Date

Page 9: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Service Desk Analyst Employee of the Month – Spotlight on Success

• Reward
• Must reach a score of 90% or higher
• One employee eligible per calendar month*

*If there is a tie, the employee who entered the most Service Requests and Incidents wins.

Page 10: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


UM Tools

• Service Desk ACD – Nortel/Symposium
• Incident Management System – Compco by MySoft
• Reports – Crystal Reports and Excel
• Database Administrator

Page 11: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach

Year   Phone Calls   Incidents   Service Requests
1      26,344        9,830       5,044
2      35,922        15,008      5,339
3      40,719        25,447      6,076
4      40,270        27,791      5,832

Page 12: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


What is Measured?

1. Call Monitoring – 15%
2. Incident Tracking – 15%
3. Average Talk Time – 10%
4. Percent Available/Logged-in Time – 10%
5. First Call Resolution – 10%
6. Percent of Service Requests Entered – 15%
7. Percent of Team Calls Answered – 10%
8. Service Request/Incident Tracking Accuracy – 15%
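As a rough sketch of how eight weighted categories can roll up into one analyst score (our own illustration of the idea, not UM's actual spreadsheet; the category keys and the 0–100 sample inputs are assumed):

```python
# Minimal sketch: roll eight category scores (each 0-100) into one weighted
# overall score. The weights mirror the list above; the keys and sample
# inputs are assumptions for illustration only.
WEIGHTS = {
    "call_monitoring": 0.15,
    "incident_tracking": 0.15,
    "average_talk_time": 0.10,
    "percent_available": 0.10,
    "first_call_resolution": 0.10,
    "service_requests_entered": 0.15,
    "team_calls_answered": 0.10,
    "tracking_accuracy": 0.15,
}

def overall_score(category_scores):
    """Weighted average of the eight category scores (each 0-100)."""
    return sum(WEIGHTS[name] * category_scores[name] for name in WEIGHTS)

# An analyst scoring 90 in every category scores 90 overall -- the
# Employee of the Month threshold mentioned earlier.
print(round(overall_score({name: 90 for name in WEIGHTS}), 1))  # 90.0
```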

Page 13: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Metrics – Call Tracking

• Percent of incidents entered based on total calls answered
  – Example: 75 incidents entered / 100 calls received = 75%
• Weight: 15%
• Goal: 70%
  – 70% or higher: 15 points
  – 50% to 69%: 12 points
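A minimal sketch of that calculation. The two point bands above are the only ones shown on the slide, so anything below 50% is assumed to earn 0 points here:

```python
def call_tracking_points(incidents_entered, calls_answered):
    """Points for the Call Tracking metric (weight 15%)."""
    if calls_answered == 0:
        return 0
    pct = 100 * incidents_entered / calls_answered
    if pct >= 70:   # goal: 70% or higher
        return 15
    if pct >= 50:   # 50% to 69%
        return 12
    return 0        # assumption: bands below 50% are not shown on the slide

print(call_tracking_points(75, 100))  # the slide's example: 75% -> 15 points
```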

Page 14: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Why Do We Track Incidents?

• To build a repository that identifies customer training and education needs
• To build self-help solutions that let customers resolve many issues with less impact on the support staff – Level 0 support
• Leads to Problem, Change, Knowledge, and Release Management

Page 15: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Service Requests/Incident Accuracy

• Weight: 15%
• Goal: 95% accuracy
• Criteria used for grading:
  – Location
  – Location
  – Location

Page 16: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Service Request/Incident Tracking Accuracy

• Has the customer been contacted within 24 hours?
• Are diary entries user-friendly?
  – Would the customer understand them?
• Was the customer kept in the loop?
• Was customer sign-off obtained?

Page 17: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Ticket Evaluation Template


Page 18: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


University of Miami Approach – Scoring

• Subjective – Maybe, Not sure, Hmm, I think so
• Objective – Yes, No

Page 19: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Percent of Calls Answered

• Are users calling the published number?
• Do you have one analyst answering most of the calls?

Percent of Service Requests Entered

Page 20: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Percent of Workload

Total Calls Answered and Trouble Tickets Entered

                      Agent 1   Agent 2   Agent 3   Agent 4   Agent 5
Total Answered        7,703     6,919     7,347     5,439     8,808
Trouble Tix Entered   5,120     4,682     4,514     5,176     6,268
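One way to read the chart data as a "percent of workload" figure is tickets entered per call answered; a minimal sketch using the numbers above (this metric definition is our interpretation, not necessarily UM's):

```python
# Per-agent ticket-entry rate from the chart data above.
answered = {"Agent 1": 7703, "Agent 2": 6919, "Agent 3": 7347,
            "Agent 4": 5439, "Agent 5": 8808}
entered = {"Agent 1": 5120, "Agent 2": 4682, "Agent 3": 4514,
           "Agent 4": 5176, "Agent 5": 6268}

for agent, calls in answered.items():
    rate = 100 * entered[agent] / calls
    print(f"{agent}: {rate:.0f}% of answered calls produced a ticket")

# Agent 4 stands out at roughly 95%, versus roughly 61-71% for the others --
# the kind of imbalance a workload chart is meant to surface.
```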

Page 21: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


First Call Resolution (FCR)

• Percentage of incidents resolved on the initial contact with the customer
• Used to measure the knowledge and skill level of the analyst
• Weight: 10%
• Telecom goal: 60%

Page 22: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Percent Available Time

• Percentage of total time the analyst has been available to take incoming or make outgoing calls
• Talk time (ACD + DN) + Waiting time – Not Ready time = % Available
• Weight: 10%
• Goal: 6 hours 30 minutes logged in to the ACD
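The formula on the slide mixes durations and a percentage, so here is one possible reading, sketched under our own assumptions (available time = ACD talk + DN talk + waiting − not-ready, expressed as a share of logged-in time; the 6h30m goal equals 390 minutes):

```python
def percent_available(acd_talk, dn_talk, waiting, not_ready, logged_in):
    """One reading of the slide's formula (an assumption, not UM's exact math).
    All arguments are minutes; returns the percentage of logged-in time
    the analyst was available for calls."""
    available = acd_talk + dn_talk + waiting - not_ready
    return 100 * available / logged_in

# Example: 3h ACD talk + 30m DN talk + 3h30m waiting - 30m not ready = 390 min
# available out of a 450-minute logged-in day, i.e. ~87%; the 390 minutes also
# meets the 6h30m goal.
print(round(percent_available(180, 30, 210, 30, 450)))  # 87
```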

Page 23: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Average Talk Time

• Average talk time per analyst
• The average time an analyst spends talking to a customer on each call
• Used to determine staffing and training needs
• Weight: 10%
• Goal: 5 minutes
  – 5 minutes or less: 10 points
  – Over 5 minutes: 0 points

Page 24: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Call Monitoring

To improve the customer experience, calls are reviewed and graded.

Page 25: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Four Part Scoring

• Greeting the customer
• Key points during the call
• Ending the call
• Behavioral questions

Page 26: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Page 27: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Bonus Points

• Knowledge database document contributions
• Training (ULearn)
• Seminars attended
  – Must return and present to the team what you learned and how it can be applied to the job or team
• Presentations to the team (SME)
• Unsolicited customer commendations

Page 28: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Additional Performance Appraisal Requirements

• Professional Development – 20 hours of class time per calendar year:
  – PDTO CBL
  – Conflict Resolution in Everyday Life, Customer Service for the Professional, Setting Personal Goals
• Certification once per year:
  – Microsoft Certified Desktop Support Technician (MCDST)
  – Microsoft Certifications for IT Professionals
  – A+, Network+, Security+, ITIL, VoIP

Page 29: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Page 30: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Customer Surveys

I am satisfied with…
1. The courtesy of the support representative
2. The technical skills/knowledge of the support representative
3. The timeliness of the service provided
4. The quality of the service provided
5. The overall service experience

Page 31: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Customer Surveys

Overall survey results – start date 6/4/12; scores after 238 surveys, as of 1/18/13

Customer satisfaction with…                                     1 (Very Dissatisfied)   2 (Dissatisfied)   3 (Neutral)   4 (Satisfied)   5 (Very Satisfied)   Total Satisfaction (4+5)
The courtesy of the representative                              1%                      0%                 3%            4%              91%                  96%
The technical skills/knowledge of the support representative    1%                      0%                 3%            6%              89%                  95%
The timeliness of the service provided                          1%                      3%                 3%            5%              87%                  92%
The quality of the service provided                             1%                      2%                 3%            4%              90%                  95%
The overall service experience                                  2%                      1%                 2%            8%              88%                  95%
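The right-hand column is a top-two-box figure (4s and 5s combined). A minimal sketch of that calculation from raw response counts; the counts below are invented for illustration, only the 238-survey total comes from the slide:

```python
def top_two_box(counts):
    """Combined percentage of 4 (Satisfied) and 5 (Very Satisfied) responses.
    `counts` maps each 1-5 rating to the number of respondents who chose it."""
    total = sum(counts.values())
    return 100 * (counts.get(4, 0) + counts.get(5, 0)) / total

# Invented counts over 238 responses (the slide publishes only percentages).
courtesy = {1: 2, 2: 0, 3: 7, 4: 10, 5: 219}
print(f"{top_two_box(courtesy):.0f}%")  # 96%
```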

Page 32: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Page 33: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Page 34: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Page 35: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Performance Dashboards

• Who is your customer/audience?
• What information is requested?
• How often do they want it?
• In what format?
• How do they want to receive the information?

5 Reasons Nobody Reads Your Reports - Plexent http://www.plexent.com/5-reasons-nobody-reads-IT-reports/

Page 36: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Dashboards – My Audience


Page 37: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


USS Metrics Dashboard – Information Requested (ITES)

Month         Emails Received   Emails Sent   Calls Offered   Calls Answered   Calls Abandoned   Abandon Rate   Tickets Opened   Tickets Closed   Requests Opened   Requests Closed
Jan           365               0             775             564              145               19%            893              894              0                 0
Feb           58                0             581             411              133               23%            162              155              0                 0
Mar           134               0             536             447              73                14%            282              296              0                 0
Apr           503               0             657             467              154               23%            1061             1129             0                 0
May           805               0             569             465              80                14%            1357             1396             3                 2
Jun           525               0             659             524              103               16%            1318             1269             2                 1
Jul           617               0             545             463              65                12%            1257             1250             8                 11
Aug           588               0             594             500              83                14%            1055             995              318               365
Sep           423               0             396             328              61                15%            275              272              575               611
Oct           447               0             365             258              87                24%            287              285              762               717
Nov           360               0             332             216              90                27%            214              213              613               595
Dec           552               0             233             194              29                12%            851              849              426               426
Grand Total   5377              0             6242            4837             1103              18%            9012             9003             2707              2728
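The Abandon Rate column is simply abandoned calls divided by calls offered; a quick sketch using the grand-total figures from the table above:

```python
def abandon_rate(abandoned, offered):
    """Abandoned calls as a percentage of calls offered."""
    return 100 * abandoned / offered

# Grand totals from the dashboard above: 1,103 abandoned of 6,242 offered.
print(f"{abandon_rate(1103, 6242):.0f}%")  # ~18%, matching the table
```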

Page 38: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Channel Tracking

                  Jan   Feb   Mar   Apr   May   Jun   Jul   Aug   Sep   Oct   Nov   Dec
Calls Offered     695   550   517   644   544   632   532   561   373   345   311   226
Calls Answered    506   392   431   459   445   505   452   472   307   246   207   188
Email             304   50    120   313   525   289   408   329   286   244   225   352
Work Orders       0     0     0     0     2     0     3     240   501   436   377   295
Trouble Tickets   809   138   278   799   943   803   850   630   125   143   96    647

Page 39: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Trending Year-to-Year

Total Calls for Year 1, 2 & 3 (number of calls)

          Jan    Feb    Mar    Apr    May    Jun    Jul    Aug    Sep    Oct    Nov    Dec
Year 1    3745   3484   3365   4254   2522   4344   2955   2263
Year 2    3536   3222   3208   3464   2967   3347   3575   4128   3908   3692   2640   2583
Year 3    2970   2476   2619   2637   2714   3332   3707   4595   4504   4406   3141   2848

Page 40: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Page 41: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach


Three Takeaways

• UM Approach – Keeping it simple
• Useful information – Why it's important to use metrics
• What we have gained – can you apply it to your job?

Page 42: 2013 HDI Session 206: Performance Dashboards The University of Miami Approach

Thank You for Attending

Contact Information
Eddie Vidal
305-439-9240
[email protected]
[email protected]

@eddievidal

http://www.linkedin.com/in/eddievidal

Please Complete the Session Evaluation