Slide 1
Driving Service Quality and Efficiency in Customer
Contact Centres
Andy Cranshaw
COPC Asia Pacific Inc
Slide 2
Agenda
– The New Reality
– Definitions of Service, Quality and Efficiency
– Some Contact Centre Myths
– Service level: How to set your target
– Quality: Being right or being nice?
– Efficiency: Driving down AHT the easy way
– The COPC-2000® Standard
Slide 5
Service….
• Is the speed with which we do things
• For inbound phone it’s our service level
• For inbound e-mail or other non-phone activities it’s our cycle time or turnaround time
Slide 6
Quality….
• Quality is the accuracy or defect rate of our transactions
• It’s consistency
• It’s our ability to resolve issues first time every time
Slide 7
Efficiency….
• Is the amount of output we get for our input
• It’s AHT – average handle time, which determines how many calls we can handle in a given time period
• It’s utilisation – the percentage of time in a day that our staff spend doing productive work
Slide 8
Some Contact Centre Myths
• Having a faster service level makes our customers happier
• Driving down AHT will have a negative impact on our quality and Customer Satisfaction
• It’s the quality of the interaction between our staff and customers that drives satisfaction
• You can’t measure defect rate in a service environment
Slide 9
© 2003 Customer Operations Performance Center Inc
Service And Customer Satisfaction
[Figure: customer satisfaction (dissatisfaction to delight) plotted against how fully a need is fulfilled (absent to fulfilled), with three curves: Must Be, More Is Better, and Delighters.]
The Kano Model
Slide 10
Customer Satisfaction vs Service Level
[Chart: customer satisfaction (70-80 scale) plotted against service level (20-110); the points show no upward trend.]
Service Level Doesn’t Drive Customer Satisfaction
Malaysia, Telco
Slide 11
Service Level Benchmarks
• COPC finds that an approach adopted by High Performance Centres is to determine the slowest speed of answer (not the fastest) that can be achieved without adversely affecting customer satisfaction.
– This can only be done with frequent (e.g., monthly) customer satisfaction data that can be correlated with actual service levels.
• COPC has worked with several companies that have done definitive research on this:
– All slowed their speed of answer target because they found that speed of answer was not as critical to customer satisfaction as they had originally thought.
– In one case, they changed their Service Level target from 80/20 to 80/40 for customer service in the U.S. and Europe.
Speed of Answer Benchmarks
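A service level target like 80/20 simply means 80% of calls answered within 20 seconds. As a minimal sketch (the answer times below are hypothetical), the same set of calls scores very differently under a 20-second and a 40-second threshold:

```python
def service_level(answer_times_secs, threshold_secs):
    """Percent of answered calls picked up within the threshold."""
    within = sum(1 for t in answer_times_secs if t <= threshold_secs)
    return 100.0 * within / len(answer_times_secs)

# Hypothetical answer times (seconds) for ten calls.
answer_times = [5, 12, 18, 25, 33, 41, 8, 19, 22, 37]

print(service_level(answer_times, 20))  # against an 80/20 target -> 50.0
print(service_level(answer_times, 40))  # against an 80/40 target -> 90.0
```

The same centre "fails" 80/20 and comfortably "passes" 80/40, which is why the target itself deserves the scrutiny described above.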
Slide 12
Service Level Benchmarks
• High performance centers manage consistency and calculate Service Level based on the percentage of (prime) intervals in which the Service Level is achieved for the time period, instead of the overall Service Level.
– “Prime Intervals”: periods where 2/3 to 3/4 of daily volume arrives
– “Targeted Band”: acceptable range for Service Level (e.g., 83% to 88% where the overall goal is 85%)
– Best observed target: Service Level must be within the Targeted Band for at least 75% of Prime Intervals each day
Speed of Answer Benchmarks
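The interval-based consistency measure above can be sketched as follows; the half-hourly numbers are hypothetical, while the 83%-88% band and the 75% attainment figure come from the slide:

```python
def band_attainment(interval_sls, low, high):
    """Fraction of intervals whose service level falls inside the band."""
    in_band = sum(1 for sl in interval_sls if low <= sl <= high)
    return in_band / len(interval_sls)

# Half-hourly service levels (%) for one day's prime intervals (hypothetical).
prime_sls = [84, 86, 83, 90, 85, 87, 88, 82, 85, 86, 84, 87]

attainment = band_attainment(prime_sls, 83, 88)
print(attainment >= 0.75)  # meets the best observed target -> True
```

Note that a day can hit the overall 85% goal on average while still failing this test, which is exactly the inconsistency the measure is designed to expose.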
Slide 13
Quality Monitoring Doesn’t Work
Call Quality vs. Customer Satisfaction
[Chart: quality score plotted against customer satisfaction, both 0%-100%; the points are scattered with no clear relationship.]
India, Third party CSP
Slide 14
Call Quality vs. Accuracy/Fatal Errors
[Chart: accuracy score (0%-100%) plotted against quality score (50-100); high quality scores do not guarantee high accuracy.]
Being Nice Isn’t The Same As Being Right!
India, Third party CSP
Slide 15
Being Nice Doesn’t Drive Customer Satisfaction
% Sat/Vsat by month and % impact on overall satisfaction:

Attribute                     Oct  Nov  Dec  Impact on Overall CS
Authority to Handle Request    81   80   81                     2
Timely Resolution              77   74   80                     1
Showed Genuine Concern         87   84   88                    18
Knowledgeable                  67   65   67                    21
Courteousness                  71   69   71                    29
Understood Request             56   51   53                    28
Answered Call Promptly         58   55   56                     2

US Customer Service
Slide 16
Being Right Does Drive Customer Satisfaction
Relationship Between Accuracy and End User Satisfaction
[Chart: dual-axis line chart, Jan through Dec; customer satisfaction (% Top Two Box, 68-84) plotted alongside accuracy (75%-100%); the two series track each other.]
US Tech Support
Slide 17
Accuracy Benchmarks
• High performance centres define and measure fatal and non-fatal errors separately.
– Fatal errors are those which, by their nature, cause a transaction to be defective.
– Non-fatal errors are those which may irritate the customer slightly but will not cause a breakdown in your relationship.
Benchmarks: Transaction Monitoring
Slide 18
Fatal Errors: A Different Calculation
Telephone Fatal Error Rate
[Chart: monthly fatal error accuracy, Jun-02 through Aug-03, on an 80%-100% scale, comparing COPC "correct" scoring with the client's own scoring; one series is measured by opportunity and the other by unit, and they differ by several points in most months.]
US, Health Insurance
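The gap between the two scoring methods comes down to what the denominator counts. A minimal sketch with hypothetical monitoring data (the calls and pass/fail results are illustrative, not COPC data):

```python
# Each monitored call is a list of pass/fail results, one per checked
# fatal-error attribute (hypothetical data).
calls = [
    [True, True, True, True],    # clean call
    [True, False, True, True],   # one fatal error
    [True, True, False, False],  # two fatal errors
    [True, True, True, True],    # clean call
]

# Measured by opportunity: share of individual attribute checks passed.
checks = [ok for call in calls for ok in call]
by_opportunity = 100.0 * sum(checks) / len(checks)

# Measured by unit: share of calls containing no fatal error at all.
by_unit = 100.0 * sum(all(call) for call in calls) / len(calls)

print(by_opportunity)  # 81.25 -- looks better on paper
print(by_unit)         # 50.0  -- the stricter per-transaction view
```

A single fatal error makes the whole transaction defective for the customer, which is why the per-unit figure is the more honest one even though it always reads lower.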
Slide 19
Accuracy Benchmarks
• High performance centres carry out quantitative calibration among monitors.
– Training calibration is where monitors listen to a call, score it, discuss the call, and agree on a final score.
– Quantitative calibration goes one step further: it involves quantifying the repeatability, reproducibility, and accuracy of the monitors.
Benchmarks: Transaction Monitoring
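One way to quantify calibration is to score simple agreement rates. This sketch assumes a pass/fail fatal-error verdict per call and a master scorer as the reference; the data and the measures shown are illustrative, not COPC's formal method:

```python
def agreement(a, b):
    """Percent of calls on which two sets of verdicts agree."""
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

# Reference verdicts from a master scorer (hypothetical).
master = ["pass", "fail", "pass", "pass", "fail"]

# The same monitor scoring the same five calls on two occasions.
monitor_round1 = ["pass", "fail", "pass", "fail", "fail"]
monitor_round2 = ["pass", "fail", "pass", "pass", "fail"]

# Repeatability: does the monitor agree with their own earlier scores?
print(agreement(monitor_round1, monitor_round2))  # 80.0

# Accuracy: does the monitor agree with the master scorer?
print(agreement(monitor_round2, master))  # 100.0

# Reproducibility would compare verdicts across different monitors
# scoring the same calls, computed the same way.
```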
Slide 20
Accuracy Benchmarks
• Phone Fatal Error Accuracy
– Averaged 95%
– High performance centers achieved 98%-99%
• Phone Non-Fatal Error Accuracy
– Averaged 91% for all centers with this data
– High performance centers achieved greater than 95% accuracy
• Non-Phone: the best COPC has seen
– For high volume mail processing (i.e., two million pieces per month) a CSP is averaging 99.2% accuracy (for fatal and non-fatal) with a range of 98%-99.9% (measured by opportunity).
• Non-phone: High Performance Centers
– 98%-99% fatal error accuracy
– 95%-98% non-fatal error accuracy
Accuracy Benchmarks: COPC 2003 Data
Slide 21
Managing Efficiency
[Diagram: "How Efficient Are We?" breaks into Labor Efficiency, Support Staff Efficiency, and Asset/Technology Efficiency. Labor Efficiency is driven by Wages & Benefits, Time in Productive State, and How Productive is Productive Time; the latter two are what supervisors and managers can control.]
Slide 22
Managing Efficiency
[Diagram: Labor Efficiency. Time in Productive State corresponds to Work Hours as a Percent of Paid Hours; How Productive is Productive Time corresponds to Handle Time as a Percent of Work Hours and AHT Reduction; together they determine Handle Time as a Percent of Paid Time.]
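The decomposition above multiplies out: handle time as a percent of paid time is the product of the two controllable factors. A sketch with hypothetical figures for one CSR-day:

```python
# Hypothetical figures for one CSR for one day.
paid_hours = 8.0
work_hours = 6.8     # paid time minus breaks, training, meetings
handle_hours = 5.2   # time actually handling calls (ACD + hold + ACW)

work_pct_of_paid = work_hours / paid_hours       # time in productive state
handle_pct_of_work = handle_hours / work_hours   # how productive that time is

# The two factors multiply back to handle time as a percent of paid time.
handle_pct_of_paid = work_pct_of_paid * handle_pct_of_work
print(round(100 * handle_pct_of_paid, 1))  # approximately 65.0
```

Splitting the metric this way shows a supervisor whether to chase schedule adherence (the first factor) or handle-time productivity (the second).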
Slide 23
Managing AHT Outliers
[Chart: per-CSR stacked bars of average ACD, Hold, and ACW time in minutes (0:00-12:00) for 23 CSRs, exposing handle-time outliers.]
Slide 24
Managing AHT Outliers
AHT (secs): Unsecured Lending
[Chart: average handle time in seconds (0-1000) for 83 agents against a target of 240 secs; a tail of agents sits far above the target.]
S.E Asia, Bank
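A chart like this is usually backed by a simple outlier report: rank agents by AHT against the target and coach the tail first, since pulling extreme agents toward target moves overall AHT far more than squeezing agents who are already close. A sketch with hypothetical agent data:

```python
# Hypothetical per-agent average handle times in seconds.
aht_by_agent = {
    "A01": 210, "A02": 238, "A03": 480, "A04": 225,
    "A05": 905, "A06": 250, "A07": 232, "A08": 310,
}
TARGET_SECS = 240

# Flag agents over target, worst first: the coaching priority list.
outliers = {a: t for a, t in aht_by_agent.items() if t > TARGET_SECS}
for agent, aht in sorted(outliers.items(), key=lambda kv: -kv[1]):
    print(agent, aht)
```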
Slide 25
No Relationship Between AHT and Quality
Quality/Speed Matrix
[Chart: average quality score (0-90) plotted against average call length (0-8 minutes); the points show no relationship.]
US, Tech Support
Slide 26
No Relationship Between AHT and Customer Satisfaction
AHT and Customer Satisfaction
[Chart: customer satisfaction plotted against AHT (0-30 minutes); the points show no relationship.]
India, Tech Support
Slide 27
No Relationship Between Talk Time and Quality
ATT and QMs
[Chart: per-CSR average talk time in minutes alongside quality monitoring (QM) scores for 20 CSRs; the two series are unrelated.]
US Customer Service
Slide 28
The COPC-2000® Standard
The COPC-2000® Standard is a globally recognised performance management methodology that drives results:
– Improves financial performance
• Lower Costs
• Enhance Revenues
– Increases operational performance
• Shorten Cycle Times: how long it takes
• Improve On-Time: achieving what you promise
• Increase Efficiency: reduce costs
– Improves Satisfaction levels
• Staff
• Customers & Clients
Slide 29
The COPC-2000® Standard
The Standard is administered by the COPC Standards Committee, an international group of senior level Contact Centre practitioners committed to raising the standards of contact centre performance.
The current Standards Committee has representatives from:
– Bell Canada
– Blue Cross Blue Shield
– Centrelink (Australia)
– ClientLogic
– Convergys
– COPC
– General Motors
– Japanese Users Group
– L.L. Bean
– DHL (Singapore)
– Microsoft
– Motorola
– Sykes B.V. (Netherlands)
– TransWorks (India)