Stanford / MIT Benchmarking
IT Help Desk
Final Presentation
November 13, 2002
1/5/05
Agenda
 Project Goals
 Help Desk Benchmarking
 – Goals & benchmark metrics
 – Initial data comparisons
 – Findings & hypotheses
 – Quick wins and lessons
 – Tools and timelines to get there
 Benchmarking as a Methodology
Benchmarking Project Goals
 Help Desk Specific
 – Enable comparisons between institutions
 – Develop ongoing management tool
   » Determine metrics
   » Develop initial comparisons & findings
   » Identify tools needed, future plans
 Benchmarking in Higher Education
 – Enable comparisons among schools and industry
 – Develop methodology
 – Provide a test case
 – Develop strategy to expand
See additional context in Appendix 1, "Project History & Goals"
Benchmarks must tie to management goals
1. Support client needs with quality service
2. Be responsive
3. Be cost effective
4. Provide appropriate level of investment
5. Develop effective, mature processes
6. Maintain high-performing, competent team
7. Support rollout of new systems
Goals must tie to specific metrics

Invest Appropriately
• % of budget
• Clients served / FTE

Be Cost Effective
• Cost per case by topic
• Total costs by topic
• Cases by media, including self-help

Be Responsive
• Elapsed time per case
• Call abandonment
• Hold time
• Time to answer

Support Customer Needs with High Quality Service
• Annual customer survey
• Spot surveys on selected transactions

Develop Effective, Mature Processes
• # of contacts vs. # of days to resolve
• Origin of Help Desk cases

Develop High-Performing, Competent Teams
• Employee satisfaction survey
• Individual performance metrics
• Team performance metrics
• Training $ per FTE
• % Help Desk certification
• Case volume compared to staff skills mix

Support Rollout of New Systems
• Case volume by topic 3 months before and after launch
• Minutes per case

See Appendix 6 for discussion of the specific data elements used to calculate benchmarks
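Several of these metrics are simple aggregations over a ticket log. As an illustrative sketch only: the field names (topic, media) and the FTE figure below are invented, not the actual Remedy or Casetracker schema.

```python
from collections import Counter

# Invented ticket records; real schemas differ by ticketing system.
tickets = [
    {"topic": "Email", "media": "web"},
    {"topic": "Email", "media": "phone"},
    {"topic": "Printing", "media": "email"},
]
ftes = 10.0  # illustrative staffing figure

# "Invest appropriately": clients/cases served per FTE.
cases_per_fte = len(tickets) / ftes

# "Be cost effective": cases by media, including self-help channels.
cases_by_media = Counter(t["media"] for t in tickets)

# Case volume by topic, used by several of the goals above.
cases_by_topic = Counter(t["topic"] for t in tickets)
```

Each metric is then trended month over month, which is what the dashboard slides at the end of the deck do.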
Caveat: the best benchmarking is yet to come
 Problems with historical data
 – Assumptions
 – Extrapolations
 – Simply missing some data
 Similar but very different operations
 Common metrics going forward
 Don't focus too heavily on current data; look to the future!
Context for Comparison: Help Desk Operations

              Stanford                           MIT
Location      Multiple, distributed              Single, separate location
Organization  Distributed support model          Consolidated, plus unquantified
              across ITSS                        support in academic depts.
Media         Heavy Web form                     Heavy email (50%+)
Tool          Customized Remedy                  Home-grown Java
ACD           1 published #, then phone tree     4 phone numbers
Staffing      Full-time assignments              2–4 hour blocks; many students
Offices       Individual offices                 Cubes + "call center"
Structure     Tier 1 & 2                         Single unit; junior/senior mixed;
              (10 min limit at Tier 1)           informal limit of 15 min

See more details in Appendix 3, "How Each Help Desk Works"
Context for Comparison: FY02 Sizing

Demographics                      Stanford       MIT           Variance
Students                          14,173         10,204        39%
Faculty & Staff                   10,792         9,230         17%
Total Population                  24,965         19,434        28%
University Consolidated Budget    $1,937,900     $1,535,949    26%

IT Department Information
Annual Base Budget                $x ¹           $x            83%
Full Time Staff (FTE)             430 ¹          270           59%

Help Desk Information             (No Students)  (With Students)
Annual Base Budget                $x ²           $x            5%
Full Time Staff (FTE)             18.6 ²         27.3          -32%
Tickets Processed                 56,125 ²       43,553        29%

¹ Includes providing telecommunications for Stanford's hospital.
² Does not include the Student Help Desk, due to no tracking/ticketing system. The approximate increase with students would be +$275K and +5 or 6 FTEs.
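The Variance column reads as the Stanford figure relative to MIT. A quick arithmetic check against the rows above:

```python
# Verify that Variance = (Stanford - MIT) / MIT, rounded to a whole percent.
def variance_pct(stanford, mit):
    return round(100 * (stanford - mit) / mit)

checks = {
    "Students":         variance_pct(14_173, 10_204),  # 39
    "Faculty & Staff":  variance_pct(10_792, 9_230),   # 17
    "Total Population": variance_pct(24_965, 19_434),  # 28
    "Tickets":          variance_pct(56_125, 43_553),  # 29
    "Help Desk FTEs":   variance_pct(18.6, 27.3),      # -32
}
```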
Gauging investment and effectiveness

                                     Stanford    MIT        Variance
IT Dept Budget / University Budget   4.2%        2.9%
Help Desk Budget / IT Budget         2.3%        4.2%
Tickets / School Population          2.25 ¹      2.24       0%
Population per HD Employee           1,342 ¹     712        89%
Tickets / Help Desk FTE              3,017       1,595      89%
Help Desk Budget / Ticket            $33.92      $41.83     -19%

¹ This ratio's meaningfulness is affected because it does not include Student Help Desk numbers, due to no tracking/ticketing system.
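These ratios follow directly from the FY02 sizing figures. For the Stanford column (Help Desk numbers exclude the Student Help Desk):

```python
# Stanford figures from the FY02 sizing slide (Student Help Desk excluded).
tickets, hd_ftes, population = 56_125, 18.6, 24_965

tickets_per_fte = tickets / hd_ftes                # ~3,017
population_per_hd_employee = population / hd_ftes  # ~1,342
tickets_per_capita = tickets / population          # ~2.25
```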
Cost per Ticket

[Chart: cost per ticket by topic, MIT vs. Stanford ($0–$200). Topics: Accounts, Backup, Business Apps, Business Functions, Cluster, Connectivity, Courseware, Email, Hardware, OS Software, Other, Printing, Desktop Software, Security/Virus, Web.]

Goal: Be cost effective
See supporting data in Appendix 9
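The slide's actual cost model is documented in Appendix 9. As a hedged sketch of one common approach, the code below assumes the budget is allocated to topics in proportion to handling minutes; the log entries and budget figure are invented.

```python
from collections import defaultdict

annual_budget = 1_000_000.0  # illustrative Help Desk budget
# Invented (topic, handling-minutes) pairs from a hypothetical ticket log.
log = [("Email", 20), ("Email", 40), ("Printing", 10), ("Security/Virus", 90)]

minutes = defaultdict(float)
counts = defaultdict(int)
for topic, mins in log:
    minutes[topic] += mins
    counts[topic] += 1

total_minutes = sum(minutes.values())
cost_per_ticket = {
    t: annual_budget * minutes[t] / total_minutes / counts[t] for t in minutes
}
# Topics with long handling times per case (the invented "Security/Virus"
# entry above) come out most expensive per ticket, matching the slide's
# "time is money" observation.
```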
Total Annual Cost by Help Desk Topic

[Chart: total annual cost by topic, MIT vs. Stanford ($0–$480,000), across the same 15 topic categories (Accounts, Backup, Business Apps, Business Functions, Cluster, Connectivity, Courseware, Email, Hardware, OS Software, Other, Printing, Desktop Software, Security/Virus, Web).]

Goal: Be cost effective
See supporting data in Appendix 9
MIT First Contact Helpdesks

[Chart: cases by topic (0–12,000), stacked by estimated complexity of case (0–1, 1–4, 4–10). Topics: Accounts/IDs/Authority, Backup, Business App Support, Business Functions, Cluster, Connectivity, Courseware, Email, Hardware, OS Software, Other, Printing, Productivity SW, Security/Virus, Web.]

Goal: Be cost effective
See supporting data in Appendix 9
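Per the dashboard notes later in this deck, complexity is currently estimated from time-to-resolve. A sketch of that bucketing follows; the 0–1 / 1–4 / 4–10 edges come from the chart legend, while the time unit (days here) and sample values are assumptions.

```python
from collections import Counter

# Bucket edges from the chart legend; the unit (days) is an assumption.
def complexity_bucket(time_to_resolve):
    if time_to_resolve <= 1:
        return "0-1"
    if time_to_resolve <= 4:
        return "1-4"
    return "4-10"

sample_times = [0.5, 2, 3, 6, 0.2, 9]  # invented resolve times
by_complexity = Counter(complexity_bucket(t) for t in sample_times)
```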
Stanford Helpdesks: Tier Where Resolved

[Chart: cases by topic (0–20,000), stacked by tier where resolved (Level 1, Level 2, Other), across the same topic categories as the MIT chart.]

Goal: Be cost effective
See supporting data in Appendix 9
Impact: New Business Applications

[Chart: number of trouble tickets per month over the 12 months of FY02 (Aug–Jul), 0–7,000, for ITSS Help Desk Level 1, ITSS Help Desk Level 2, Legacy Applications, Rollout New Applications, and Total Help Tickets. Some rollout-application tickets are included in HD Levels 1 & 2. Annotated events: Kronos / HR PeopleSoft (Jan.), Axess problems (March), HR salary setting (May).]

Goal: Support rollout of new systems
See supporting data in Appendix 9 (Stanford data only).
Preliminary data offers initial observations

 Implementation choices affect Help Desk costs
 – MIT
   » Email
   » Connectivity to desktop
   » Security, Kerberos
 – Stanford
   » Accounts, authentications
   » Business apps

 "Time is money." Topics are expensive when they are complex, must escalate, or relate to unique applications
 – Specialists are required more frequently for unique, proprietary issues

 System rollouts create overall spikes, and some dips in specific areas
Initial observations

 Student employees
 » MIT Help Desk employs more students at a lower overall budget
 » More FTEs, but it is difficult now to gauge the overall effectiveness of using students

 Structured tiers
 » Using structured tiers may support a greater number of cases
 » Resolving at Tier 1 significantly reduces costs
 » You can either "tier the work" through process choices, or "tier the staff" to handle only certain types of work

 "Media" of case submission may affect costs
 » Web submission may support a greater number of cases per FTE
Responsiveness to Phone Calls

                          Avg # of   Time between    Avg Call        Time before     Abandon   Avg Speed
                          Calls      Calls           Length          Caller          Rate      to Answer
                          Monthly    (for staffer)   (seconds)       Abandons (s)    (%)       (seconds)
Stanford (Tier 1 only,
Jan–Aug 02)               4,032      4.9 min         252 (4.2 min)   68              17%       51
MIT (all desks,
Jul 01–Jun 02)            2,039      n/a             415 (6.9 min)   99              13%       45
HDI Industry Comparisons  n/a        n/a             5 min           n/a             4%        59.5

Definitions:
Speed to Answer = time a call waited before being answered by a person.
Abandon Rate = % of calls where the customer hung up; includes callers who hung up after reaching voice mail.
Call Length = time from when a call is answered to when it is released.
Time between Calls = time between calls that an ACD agent handles; the agent closes out the ticket before getting back in the queue to receive calls.

Goal: Be responsive
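The definitions above can be sketched as a computation over an ACD call log. The record layout below is invented; real ACD exports vary by vendor.

```python
# Each invented record: (answered, seconds_waited, seconds_talking).
calls = [
    (True, 40, 300),
    (True, 60, 200),
    (False, 70, 0),   # caller hung up after waiting 70 s
    (True, 50, 250),
]

answered = [c for c in calls if c[0]]
abandoned = [c for c in calls if not c[0]]

abandon_rate = len(abandoned) / len(calls)                           # 0.25
avg_speed_to_answer = sum(c[1] for c in answered) / len(answered)    # 50 s
avg_call_length = sum(c[2] for c in answered) / len(answered)        # 250 s
time_before_abandon = sum(c[1] for c in abandoned) / len(abandoned)  # 70 s
```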
Customer satisfaction appears comparable across institutions.

Improvement efforts do bear fruit, as shown in MIT's two annual survey results.

[Chart: MIT Computing Help Desk (All Levels) and Stanford Help Desk (Level 1), "snapshot" surveys after recently closed cases, annualized average on a 5-point Likert scale (MIT 2001 vs. SU 2002): Timeliness of Response, Quality of Resolution, Courtesy and Professionalism, Technical Competence, Overall Satisfaction.]

[Chart: MIT Computing Help Desk (All Levels), from IS Annual Customer Satisfaction Surveys (2000 vs. 2002), Likert scale: Timeliness of Response, Quality of Resolution, Courtesy and Professionalism, Technical Competence, Overall Satisfaction, Ability to Get Through, Turnaround Time.]

Goal: Support customers with high quality service
See supporting data in Appendix 5
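The "annualized average" per survey dimension is a plain mean on the 5-point Likert scale. A sketch with invented responses:

```python
# Invented spot-survey responses on a 1-5 Likert scale.
responses = [
    {"Timeliness": 4, "Quality": 5, "Courtesy": 5, "Competence": 4, "Overall": 4},
    {"Timeliness": 3, "Quality": 4, "Courtesy": 5, "Competence": 4, "Overall": 4},
]

# Average each dimension across all responses received in the period.
annualized = {
    dim: sum(r[dim] for r in responses) / len(responses)
    for dim in responses[0]
}
```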
The vast majority of cases are resolved quickly.

[Chart: number of interactions (0–40) vs. days to close (0–50), Help Desk overall. Few interactions with a quick close is labeled "Desirable"; the long tail indicates a case neglected or an unresponsive customer.]

[Chart: Time to Close Cases, Help Desk. Cases closed within 1 / 3 / 7 / 14 / 21 / 31 / more days: 61.53%, 8.46%, 9.23%, 10.95%, 4.09%, 3.40%, 2.35%; case counts (0–5,000) with cumulative percentage (0–100%).]

Goal: Develop effective, mature processes
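The second chart's bucketing can be sketched directly. The 1/3/7/14/21/31/More edges come from the chart; the sample durations are invented.

```python
edges = [1, 3, 7, 14, 21, 31]
labels = ["1", "3", "7", "14", "21", "31", "More"]

def bucket(days):
    # Return the first bucket whose upper edge covers this duration.
    for edge, label in zip(edges, labels):
        if days <= edge:
            return label
    return "More"

durations = [0.5, 1, 2, 5, 10, 40, 0.2, 1, 25]  # invented days-to-close

counts = {lbl: 0 for lbl in labels}
for d in durations:
    counts[bucket(d)] += 1

# Cumulative percentage, as plotted on the chart's secondary axis.
cumulative, running = {}, 0
for lbl in labels:
    running += counts[lbl]
    cumulative[lbl] = 100 * running / len(durations)
```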
Individual performance can vary greatly; must consider the Hawthorne effect

[Chart: Stanford IT Help Desk (Level 2), hours logged in FY02 (0–1,200): Staff Member F, 1,086; Staff Member G, 676; Staff Member H (80%), 630; Staff Member B, 623; Staff Member E (55%), 532; Staff Member I, 461; Staff Member A, 414; Staff Member C (75%), 294; Manager D, 130.]

Goal: Maintain high-performing, competent team
More team- or employee-related metrics are desirable, but need more high-level discussion

 Employee satisfaction survey
 Customer satisfaction tied to individual service providers
 Individual performance metrics
 – MIT already tracks by team
 – Stanford tracks by individual
 Help Desk certifications
 – Behavioral and/or technical competencies

Goal: Maintain high-performing, competent team
Initial performance data also yields some observations

 Customer satisfaction
 » Appears comparable across both institutions
 » Improvement efforts did increase satisfaction over 2 years (MIT)

 Process effectiveness
 » Better categorization and identification of the work may help with faster escalation; get to the "right place" faster
 » The vast majority of cases are resolved quickly

 Employee performance
 » Can vary significantly among individuals
 » Metrics do affect behavior (Stanford)
The data also raise good questions

 Processes
 – Which cases should be escalated more quickly?
 – Should you tier the "work" or the "organization"?
 – How does web submission affect cost?

 Staffing
 – Is student employment effective?
 – What additional training should "Tier 1" receive?
 – How should each institution use employee performance data?

 Support-intensive systems
 – Should support-intensive systems be replaced?
 – How can we help with new system design to minimize Help Desk requirements?

 Investments to improve efficiencies
 – Which tools should we acquire to improve performance?
Quick Wins for Implementation
(Items were rated "Quick Win," "In Place," or "Not quick" for each institution.)

 Reconfigure space
 – Reconfigure space and move staff to allow for more efficiencies and collaboration

 Proactively use data
 – Generate & review weekly metric reports
 – Generate weekly "standards" for tickets processed or time spent, and use them as part of individual performance management

 Customer feedback
 – Initiate or increase transaction-based "spot" surveys

 Tracking
 – Track tickets at the student-staffed Unix desk
 – Track internal hand-offs or "tiers/escalations" explicitly
 – Standardize work-reporting categories
 – Track the type of media for each case
 – Consolidate reporting functions into one ticket system
 – Examine excess ticket counts in specific categories
Next Phase of Help Desk Benchmarking
(Items were rated "Next Phase," "In Place," or "Long Term" for each institution.)

 High Performing Team
 – Solicit employee feedback for process and job improvement
 – Track % of HD certifications and training $ per employee

 Management
 – Create cross-functional ITSS team for Delphi rollout
 – Institute regular review of metrics with finance
 – Create Help Desk "Standard Operating Procedures" & Handbook

 Customer feedback
 – Collaborate on annual customer survey
 – Define process for using customer survey responses

 Software or Hardware Investments
 – Scope Remedy changes needed for benchmarking; engage consultant
 – Casetracker: allow a consultant to track "touch minutes" per case and escalations (tiers), both within and outside the HD
 – Knowledge Management system
   » Pilot use to help HD staff retrieve standard answers
   » Evaluate usefulness of client use (self-help)
 – ACD "call board" to display calls waiting in queue
 – Create "dashboard" reports and a process for regular reporting
 – Self-serve password reset tools
Cost to Implement Metrics (next 6 months)

Software or Hardware Investments                    Cost
Remedy consultant to program changes (Stanford)     $60K
Casetracker consultant (MIT)                        $75K
Knowledge Management system                         $300K each
 – Pilot use to help HD staff
 – Evaluate use of client self-serve
ACD "call board" to display queued calls            $15K
Self-serve password reset tools                     $60K
Joint customer satisfaction survey                  $10K each
Creation of Standard Operating Procedures           $32K
Self-creation of dashboard                          $15K
Stanford: Help Desk Dashboard – MONTHLY

[Sample dashboard panels:
 – Cases by Media, monthly (Jan–Dec): Email, Calls, Walk-In, Self-Help.
 – Customer Satisfaction, spot ticket surveys (TO BE DEVELOPED): Courtesy, Tech Knowledge, Overall Satisfaction on a 0–5 scale, for the IT Help Desk, Tech Support, and Student desks.
 – Phone-In Statistics, actual vs. goal: Abandon, Time to Answer, Hold Time, Call Length (Tier 2), another metric.
 – Problems by Cause, Tier 1, for the month, current vs. previous: Account ID/Authority, Bus App, Cluster, Security/Virus, Connectivity, Data Backup, Email, Other, HW, Print, SW (OS), SW (Personal), Web, Telecom.
 – % Problem Resolution by tier (Tier 1 / Tier 2 / Tier 3), 0–100%.
 – Level 1 Help Desk: # of tickets created by employee, 2002 (Feb–Sep and average), roughly 200–3,400 tickets per employee.]
MIT: Help Desk Dashboard – MONTHLY

[Sample dashboard panels:
 – Customer Satisfaction, spot ticket surveys (TO BE DEVELOPED): Courtesy, Tech Knowledge, Overall Satisfaction on a 0–5 scale, for the IT Help Desk, Tech Support, and Student desks.
 – Time to Close Cases: cases closed within 1 / 3 / 7 / 14 / 21 / 31 / more days (61.53%, 8.46%, 9.23%, 10.95%, 4.09%, 3.40%, 2.35%), with cumulative percentage.
 – MIT First Contact Help Desks, Cases by Method (Jan 02 – Sep 02, 0–60%): Messages (Email, Web), Internal (Transfers, Referrals), Interactive (Voice, Chat), In-Person (Walk In, Site Visit).
 – MIT First Contact Helpdesks, cases/complexity by topic (0–12,000): Accounts/IDs/Authority, Backup, Business Application Support, Business Functions, Cluster, Connectivity, Courseware, Email, Hardware, OS Software, Other, Printing, Productivity Software, Security/Virus, Web; stacked as Simple (Tier 1), Specialist (Tier 2), Referral (Tier 3).
 – Interactive Communications (Phone, Chat), actual vs. goal: Answered %, Phone Time to Answer (0–60), Hold Time (0–60), Call Length for simple calls (0–600) and specialist calls (0–600).
 – Message Communications (Email, Web Requests), actual vs. goal: Answered %, Time to first response (0–600), number of exchanges (simple, 0–6; specialist, 0–6), time between exchanges (simple, 0–600; specialist, 0–6000).]

Notes:
1) The MIT ACD currently does not collect hold time.
2) Times are in seconds.
3) MIT currently cannot distinguish specialist from simple calls; the actual is an average.

Notes: This is a sample dashboard showing metrics to be delivered on a monthly basis, pending some tool improvements.
1. The Cases by Complexity graph shows real ticket counts and categorizations. Complexity is currently estimated based on time to resolve.
2. We do not currently have complete Helpdesk Interactive Communications data, such as time to first response. This graph is a placeholder.
3. Cases by Method shows additional data. The graph breaks out email and web requests. Ultimately we want to consolidate this data into the categories shown in the graph's legend.
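The consolidation described in note 3 is a mapping from raw submission channels to the four method categories in the chart legend. A sketch, with the raw channel names and sample cases assumed (the real ticket-system categories may differ):

```python
from collections import Counter

# Assumed raw channel names -> legend categories; illustrative only.
METHOD = {
    "email": "Messages", "web": "Messages",
    "transfer": "Internal", "referral": "Internal",
    "voice": "Interactive", "chat": "Interactive",
    "walk-in": "In-Person", "site-visit": "In-Person",
}

raw_cases = ["email", "web", "voice", "walk-in", "email", "referral"]  # invented
by_method = Counter(METHOD[c] for c in raw_cases)
total = sum(by_method.values())
share_pct = {m: round(100 * n / total) for m, n in by_method.items()}
```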
Next Steps for MIT/Stanford Help Desk Benchmarking

Months: D J F M A M J J A S (timeline includes on-site visits)

 Implement selected metrics (dashboard) and quick wins
 Implement management and operations changes
 Begin software modifications
 Implement Knowledge Management system and ACD call board
 Implement customer spot surveys; then annual customer survey (?) (ongoing; annual?)
 Track common metrics; re-evaluate (ongoing)
 Consider inviting others to benchmark