How do you know if the user experience your team is slaving over is succeeding or failing? How can your team use data to make better decisions? Quantitative measures can help answer these questions and can complement more traditional qualitative research methods like usability testing.
Measures of Success: How to Quantitatively Measure Your User Experience
Richard Dalton, @mauvyrusset
http://www.flickr.com/photos/torontorob/4044565681/
#1 Evaluate your experience against something you care about – is it meeting its objectives?
@mauvyrusset #WVpdx
#2 The objectives of your experience will likely differ from those of the person sitting next to you
@mauvyrusset #WVpdx
Tasks + Emotions
Users do things to try to meet their goals
… and the business wants users to do things so it can meet its goals
Goals
Projects
Are realized through
Are enabled & encouraged by
Are created & changed by
Capabilities
Users have goals
… and the business has goals
Projects create new capabilities
Projects change existing capabilities
Capabilities help users to do the tasks they want to do
… and encourage users to do the tasks the business wants them to do
#3 Measure how well tasks are satisfied by capabilities, not projects – otherwise you have no baseline
@mauvyrusset #WVpdx
User-driven tasks
Business-driven tasks
Capabilities
Follow Vanguard’s investing principles
Learn why Vanguard is great
Bring assets to Vanguard
Use Vanguard's products & services
Self-provision on the web
Spread the word about Vanguard
Trust Vanguard
Find an investment company
Act on my investments
Help my heirs be successful at Vanguard
Make good investment decisions
Monitor my investments
Stay current on news and commentary
Deal with taxes
Help other people be successful investors
Web
Phone
Paper
Mobile devices
Radio/TV
90 tasks grouped into 8 categories
45 tasks grouped into 7 categories
635 capabilities and counting …
Find an investment company
http://filmfanatic.org/reviews/wp-content/uploads/2008/01/anfscd-parrot.png
Item profile (web)
Ready to buy
Tell others
Research an item
Trust us
Close the sale
Cross sell
Spread the word
User-driven tasks
Business-driven tasks
Capabilities
Compare the item to others like it
Get information on the item
Find out how much the item costs
Buy the item
Find out shipping costs and times
Find out how to pay
Tell a friend about the item
See if other people like the item
Buy the item
See related items
Believe that the site is safe and secure
Print details about the item
Save the item to look at later
See related items
Tell other people about the item
Emotional considerations:
Shoppers are commonly fearful or unsure that they have chosen the best product for their needs and want to “comparison shop” the price and/or features of several products.

High-level design & content approach:
Show recently viewed items and provide access to a “compare to similar items” tool.

Success criteria & measures – “what is good?”:
The ratio of users looking at recently viewed items via the “compare to similar items” tool vs. pogo-sticking back to the gallery page should be 20:1 or better.
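A criterion like the 20:1 ratio above can be checked mechanically against analytics data. A minimal sketch, assuming a hypothetical flat list of event names – `compare_similar_items` and `back_to_gallery` are invented tags, not real instrumentation from the talk:

```python
# Checking the 20:1 "compare vs. pogo-stick" criterion against raw
# analytics events. The event names below are invented for illustration;
# real tag names depend on your analytics tooling.

def meets_compare_ratio(events, threshold=20.0):
    """True if uses of the compare tool outnumber pogo-sticks back to
    the gallery page by at least `threshold` to 1."""
    compares = sum(1 for e in events if e == "compare_similar_items")
    pogos = sum(1 for e in events if e == "back_to_gallery")
    if pogos == 0:                      # no pogo-sticking observed
        return compares > 0
    return compares / pogos >= threshold

sample = ["compare_similar_items"] * 42 + ["back_to_gallery"] * 2
print(meets_compare_ratio(sample))      # 42:2 = 21:1 -> True
```

The point is less the arithmetic than that the criterion is written down before launch, so "good" is not decided after seeing the data.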
Measures
Measures vs. Success Criteria
Measure → Success Criteria
http://3.bp.blogspot.com/_iUXX3rF1Axo/TC0fGhmlV0I/AAAAAAAAAMw/atKJuTja0AM/s1600/butterfly+growth+chart2.jpg
http://www.clarklings.com/uploaded_images/IMG_1100-732092.JPG
S1.01 Get information on the item
S2.01 Buy the item
P1.01 Buy the item
S1.03 Compare the item to others like it
S1.02 Find out how much the item costs
S1.04 See if other people like the item
S2.03 Find out how to pay
S2.02 Find out shipping costs and times
S3.01 Tell a friend about the item
Research an item
Ready to buy
Tell others
P2.01 See related items
P3.01 Believe that the site is safe and secure
P4.02 Tell other people about the item
Outcome / Drivers
Cross sell, Trust us, Spread the word
Ishikawa (fishbone) diagram
http://en.wikipedia.org/wiki/Ishikawa_diagram
#4 Measuring outcomes can tell you if a capability is failing. Measuring drivers can tell you why
@mauvyrusset #WVpdx
What’s being evaluated? → Measure

A. General effectiveness in satisfying the task
“I’ve done it” or “Now I understand”
1. Completion or conversion rates
2. Did future user behavior change as a result of this interaction?

B. Findability of a known item
“I know where I’m going, don’t get in my way”
Speed to find the first task

C. Were the user’s expectations met?
“Oh, that wasn’t what I wanted, let me go back …”
Link bounce rate

D. Client satisfaction
“Huh, that information was useless”
1. “Was this useful/helpful” surveys
2. “Loss of sale” surveys

E. Was enough information provided?
“Don’t leave me wanting more info”
Usage of a second source of information

F. Are users travelling the paths we expected them to?
“I can’t find the link, I’ll go back to the homepage first”
Relative usage of one path vs. another to an identical destination
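A measure like link bounce rate can be computed directly from page-view logs. A minimal sketch under an assumed definition – a “bounce” here means the very next page view returns to the page the user came from; the log format and page names are invented for illustration:

```python
# Estimating link bounce rate: the share of arrivals at a destination
# page whose very next page view goes straight back to the page the
# user came from. Log format and page names are invented.

def link_bounce_rate(page_views, destination="item_detail"):
    """page_views: ordered page names for one session."""
    arrivals = bounces = 0
    for i in range(1, len(page_views)):
        if page_views[i] == destination:
            arrivals += 1
            went_back = (i + 1 < len(page_views)
                         and page_views[i + 1] == page_views[i - 1])
            if went_back:
                bounces += 1
    return bounces / arrivals if arrivals else 0.0

session = ["gallery", "item_detail", "gallery", "item_detail", "checkout"]
print(link_bounce_rate(session))        # 1 bounce in 2 arrivals -> 0.5
```

Real analytics tools define bounces differently (e.g. single-page sessions), so the exact definition should be agreed on before the number is reported.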
Success Criteria

Enduring
Measure a single solution against predefined criteria.
Use for on-going monitoring.
Criteria set by past user behavior or future expectations.
Example – Task: Get information on the item. Loss-of-sale surveys should show that problems with item information account for less than 5% of lost sales.

Temporary
Compare two or more solutions against one another using A/B testing.
Use for point-in-time improvement.
Point-in-time “winners” identified by test results.
Example – Task: Print details about the item. A/B test two versions of the printer-friendly entry point (link vs. button).
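For the temporary, A/B-testing style of criterion, one common way to decide whether the difference between the two versions is real or noise is a two-proportion z-test. A hedged sketch with made-up click counts – the talk does not prescribe this particular test:

```python
# Two-proportion z-test for an A/B test of two entry points
# (link vs. button). Counts are invented for illustration.
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Return (z statistic, two-sided p-value) for the difference
    between two click-through proportions."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 1,000 sessions each: link clicked 120 times, button 160 times
z, p = two_proportion_z(120, 1000, 160, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the button genuinely outperforms the link rather than winning by chance, at which point it becomes the point-in-time “winner”.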
#5 Ask; how would the user behave if we nailed the design? How would they behave if we screwed it up?
@mauvyrusset #WVpdx
Cultural Challenges
Sometimes people aren’t ready to listen
#6 Be open about the uses & limitations of data & involve people early – it helps gain buy-in
@mauvyrusset #WVpdx
Beware good intentions
#7 Avoid misleading measures – the temptation to use the data is too strong. Ask “what if the result is X?”
@mauvyrusset #WVpdx
Ostrich
http://fc05.deviantart.net/fs36/f/2008/285/4/f/You_make_kitty_scared_by_GreenLabRat.jpg
The only thing we have to fear is fear itself
#8 Be unbiased. Don’t be afraid to measure things that might contradict your own opinion
@mauvyrusset #WVpdx
http://img395.imageshack.us/img395/3899/hal90001600en6.jpg
I’m sorry Dave. I’m afraid I can’t do that
#9 Don’t lose your perspective about how data fits into your decision-making process
@mauvyrusset #WVpdx
http://www.flickr.com/photos/tammra/279392432/
But we have over 635 capabilities!
#10 Start small. Pick a capability, identify objectives, define measures and watch what happens
@mauvyrusset #WVpdx