craig-sullivan
I'm presenting at eMetrics here in San Francisco, but I also attended a lot of sessions myself. I listened to the questions and included the most common CRO niggles and issues as part of my deck. It also includes analytics tips (particularly Google Analytics, but applicable elsewhere) and details of a good CRO toolkit to have in your pocket. Lastly, I've included a BONUS DECK here. Yay. You'll find some details on the methodologies I support and some of the results that happened. I love lean techniques blended with rapid analytics support, split testing, rapid UX techniques and many other toolsets.
1
Confessions of a Split Tester
@OptimiseOrDie
2
"If you go to the men's washrooms at Schiphol airport in Amsterdam, you may notice there's a fly in the urinals. It's screen printed.
So what do you think most men do? That's right, they aim at the fly when they urinate.
They don't even think about it, and they don't need to read a user's manual; it's just an instinctive reaction that means 85% less spillage!
The interesting feature of these urinals is that they're deliberately designed to take advantage of this inherent human male tendency."
This is my job.
3
Director of Optimization, RUSH Hair
Building a team, conversion rate or methodology?
Shameless Promotion Slide
• 35M visitor split tests and counting
• Over $400M increases in revenue for clients, within 5 yrs
• Lifts from 12% to 200+% in site-wide conversion rates
• 21 years of my life slowly being sucked away in boring meetings with time wasting morons.
• UX and Analytics (1999)
• User Centred Design (2001)
• Startups and advisory (2003)
• Funnel optimisation (2004)
• Multivariate & A/B (2005)
• Lean UX (2008)
• Holistic Optimisation (2009)
• I love optimising underperforming stuff: websites, teams, businesses and multi-country optimisation programs!
@OptimiseOrDie
4
Is there a way to fix this then?
5
Agenda
#1 The Optimiser's Toolkit – the best tools recommended by CRO practitioners
#2 Analytics Genius Tips – top tips from 2013, from around the world
#3 Top CRO Questions – some answers from my Top 30 CRO questions
6
The Optimiser's Toolkit
#1 Session Replay
#2 Browser & Email testing
#3 VOC, Survey & Feedback tools
#4 Guerrilla Usability
#5 Productivity tools
#6 Split testing
#7 Performance
#8 Crowdsourcing
#9 Analytics Love
7
#1 : Session Replay
• Three kinds of tool:

Client side
• Normally JavaScript based
• Pros: rich mouse and click data, errors, forms analytics, UI interactions
• Cons: dynamic content issues, performance hit

Server side
• Black box: proxy, sniffer or port-copying device
• Pros: captures all dynamic content, fast, legally tight
• Cons: no client-side interactions (Ajax, HTML5 etc.)

Hybrid
• Client side and sniffing with a central data store
8
#1 : Session Replay
• Vital for optimisers – fills in a 'missing link' for insight
• Rich source of data on visitor experiences
• Segment by browser, visitor type, behaviour, errors
• Forms analytics (when instrumented) are awesome
• Can be used to optimise in real time!

Session replay tools
• Clicktale (Client) www.clicktale.com
• SessionCam (Client) www.sessioncam.com
• Mouseflow (Client) www.mouseflow.com
• Ghostrec (Client) www.ghostrec.com
• Usabilla (Client) www.usabilla.com
• Tealeaf (Hybrid) www.tealeaf.com
• UserReplay (Server) www.userreplay.com
9
10
11
12
#2 : Feedback / VOC tools
• Anything that allows immediate, realtime on-page feedback
• Comments on elements, pages and the overall site & service
• Can be used for behaviourally triggered feedback
• Tip! Take the call centre for beers

• Kampyle www.kampyle.com
• Qualaroo www.qualaroo.com
• 4Q 4q.iperceptions.com
• Usabilla www.usabilla.com
13
#2a : Survey Tools
• Surveymonkey www.surveymonkey.com (1/5)
• Zoomerang www.zoomerang.com (3/5)
• SurveyGizmo www.surveygizmo.com (5/5)

• For surveys, web forms, checkouts, lead gen – anything with form filling – you have to read these two: Caroline Jarrett (@cjforms) and Luke Wroblewski (@lukew)
• With their work and copywriting from @stickycontent, I managed to get a survey with a 35% clickthrough from email and a whopping 94% form completion rate.
• Their awesome insights are the killer app I have when optimising forms and funnel processes for clients.
14
#3 – Testing tools

Email testing
www.litmus.com
www.returnpath.com
www.lyris.com

Browser testing
www.browsercam.com (BOING!)
www.crossbrowsertesting.com
www.cloudtesting.com
www.multibrowserviewer.com
www.saucelabs.com

Mobile devices
www.perfectomobile.com
www.deviceanywhere.com
www.mobilexweb.com/emulators
www.opendevicelab.com
15
#4 : Guerrilla Usability Testing
• All you need is a device, time and people!
• Use one of these tools for session recording:

CamStudio (free) www.camstudio.org
Mediacam AV (cheap) www.netu2.co.uk
Silverback (Mac) www.silverbackapp.com
Screenflow (Mac) www.telestream.net
UX Recorder (iOS), Reflection, Webcam www.uxrecorder.com & bit.ly/tesTfm & bit.ly/GZMgxR
16
#5 : Productivity tools
Oh sh*t
17
#5 Join.me
18
#5 Pivotal Tracker
19
#5 Basecamp
20
#5 Google Docs
• Seriously wasting your time doing manual Excel?
• Fed up doing stuff that takes hours?
• Use the Google API to roll your own reports straight into Big G
• Lots of good articles, but ask for advice from: @danbarker, @timlb, #measurecamp
• Google Analytics + API + Google Docs integration = A BETTER LIFE!
• Hack your way to having more productive weeks
• Learn how to do this and have fun with GA custom reports
• Ask me about the importance of training
21
#5 Cloud Collaboration – LucidChart
22
#5 Cloud Collaboration – Webnotes
23
#5 Cloud Collaboration – Protonotes
24
#5 Cloud Collaboration – Conceptshare
25
#6 – Split testing tools – cheap!
• Google Content Experiments bit.ly/Ljg7Ds
• Multi-Armed Bandit explanation bit.ly/Xa80O8
• Optimizely www.optimizely.com
• Visual Website Optimizer www.visualwebsiteoptimizer.com
26
#7 Performance
• Google Site Speed• Webpagetest.org• Mobitest.akamai.org
Site                 Size    Requests
www.coop.se          1200k   63
m.ikea.com/se/sv/    684k    14
High St Retailer     307k    43
Department Store     100k    18
Newspaper            195k    35
Supermarket          125k    14
Auto Sales           151k    47
Autoglass            25k     10

Remote tests: iPhone
Slides: slidesha.re/PDpTPD
Show your boss this split test:
Show the e-com director this one:
30
#7 – UX and Crowd tools

Remote UX tools (P=Panel, S=Site recruited, B=Both)
Usertesting (B) www.usertesting.com
Userlytics (B) www.userlytics.com
Userzoom (S) www.userzoom.com
Intuition HQ (S) www.intuitionhq.com
Mechanical Turk (S) www.mechanicalturk.com
Loop11 (S) www.loop11.com
Open Hallway (S) www.openhallway.com
What Users Do (P) www.whatusersdo.com
Feedback Army (P) www.feedbackarmy.com
User Feel (P) www.userfeel.com
Ethnio (for recruiting) www.ethnio.com

Feedback on prototypes / mockups
Pidoco www.pidoco.com
Verify from Zurb www.verifyapp.com
Five Second Test www.fivesecondtest.com
Conceptshare www.conceptshare.com
Usabilla www.usabilla.com
31
#8 : Web Analytics Love
• Properly instrumented analytics
• Investment of 5-10% of developer time
• Add more than you need
• Events insights
• Segmentation
• Call tracking love!
32
#8 : Tap 2 Call tracking
Step 1 : Add a unique phone number on ALL channels (or insert your own dynamic number)
Step 2 : For phones, add "Tap to Call" or "Click to Call"

• Add an analytics event or tag for phone calls!
• Very reliable data, easy & cheap to do
• What did they do before calling?
• Which page did they call you from?
• What PPC or SEO keyword did they use?
• This keyword-level call data is incredibly useful
• What are you over- or under-bidding for?
• Will help you shave 10-20%+ off PPC
• Which online marketing really sucks?
33
[Chart: Phone to Booking Ratio (0–25) by search keyword – auto glass terms such as "safelite", "safelight repair", "windshield replacement", "auto glass repair", "cheap windshield replacement", "car glass repairs"]
34
What about desktop?Step 1 : Add ‘Click to reveal’• Can be a link, button or a collapsed section• Add to your analytics software• This is a great budget option!
Step 2 : Invest in call analytics• Unique visitor tracking for desktop • Gives you that detailed marketing data• Easy to implement• Integrates with your web analytics• Let me explain…
35
So what does phone tracking get you?
• You can do it for free on your online channels
• If you've got any phone sales or contact operation, this will change the game for you
• For the first time, phone analytics that the web channel can claim
• Optimise your PPC spend
• Track and test stuff on phones, using web technology
• The two best phone A/B tests? You'll laugh!
36
Who?
Company                Website                      Coverage
Mongoose Metrics*      www.mongoosemetrics.com      UK, USA, Canada
Ifbyphone*             www.ifbyphone.com            USA
TheCallR*              www.thecallr.com             USA, Canada, UK, IT, FR, BE, ES, NL
Call Tracking Metrics  www.calltrackingmetrics.com  USA
Hosted Numbers         www.hostednumbers.com        USA
Callcap                www.callcap.com              USA
Freespee*              www.freespee.com             UK, SE, FI, NO, DK, LT, PL, IE, CZ, SI, AT, NL, DE
Adinsight*             www.adinsight.co.uk          UK
Infinity Tracking*     www.infinity-tracking.com    UK
Optilead*              www.optilead.co.uk           UK
Switchboard Free       www.switchboardfree.co.uk    UK
Freshegg               www.freshegg.co.uk           UK
Avanser                www.avanser.com.au           AUS
Jet Interactive*       www.jetinteractive.com.au    AUS

* I read up on these or talked to them. These are my picks.
37
Phone to Booking Ratio
38
#9 : Web Analytics Love
• People, process, human problems
• UX of web analytics tools and reports
• Make the UI force decisions!
• Playability and exploration
• Skunkworks project time (5-10%)
• Give it love, time, money and iteration
• How often do you iterate analytics?
• Lastly, spend to automate – gain MORE time
39
2013 Tips roundup!
40
Analytics Genius Tips
#1 Performance tune-ups
#2 Browser money
#3 Keyboard shortcuts
#4 Ranking data
#5 Content engagement
#6 Enhanced in-page
#7 Duplicate transactions
#8 Event tracking
#9 Google API + more
41
#1 : Performance Tune-ups
With thanks to @Danbarker

Add "_gaq.push(['_setSiteSpeedSampleRate', 100]);"

• Amps up sampling for small & medium websites
• Use the distribution report – % of pages < 3 seconds
• DOM timings are vital – let me explain
• Avg. Document Content Loaded Time (sec)
• This data is very accurate and helps conversion – pretty vital for landing pages, where I find lots of stuff
• Make yourself a [pageviews * content load time] report
• This is called a 'Suck Index'
• Work your way down from the top
• Mobile speed doesn't count Safari – please be careful!
• Read more at: http://p.barker.dj/sitespeedtips
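The 'Suck Index' report (pageviews multiplied by average content load time) can be sketched in a few lines; the page names and figures here are illustrative, not real GA output.

```javascript
// "Suck Index" sketch: rank pages by pageviews * avg content loaded time,
// so high-traffic slow pages float to the top of your fix list.
// Page names and numbers are made up for illustration.
function suckIndex(pages) {
  return pages
    .map(function (p) {
      return { page: p.page, score: p.pageviews * p.avgContentLoadedSec };
    })
    .sort(function (a, b) { return b.score - a.score; });
}

var report = suckIndex([
  { page: '/home', pageviews: 50000, avgContentLoadedSec: 1.2 },
  { page: '/checkout', pageviews: 8000, avgContentLoadedSec: 6.5 },
  { page: '/landing/ppc', pageviews: 20000, avgContentLoadedSec: 4.0 }
]);
// → '/landing/ppc' (80000) first, then '/home' (60000), then '/checkout' (52000)
```

Then work your way down from the top of that list, exactly as the slide says.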
42
#2 : Browser money
• Create a desktop-only segment [exclude "mobile including tablets"]
• Create a mobile-only segment [for GA, include Mobile (including tablet) & exclude Tablet = yes]. Three segments: Desktop, Tablet, Mobile only
• Now start segmenting like this:

Browser                   Segment         Conv rate
Safari                    Mobile Traffic  0.79%
Internet Explorer         Desktop only    1.34%
Chrome                    Desktop only    1.01%
Safari                    Tablet Traffic  1.00%
Safari                    Desktop only    1.28%
Firefox                   Desktop only    1.20%
Android Browser           Mobile Traffic  0.31%
Safari (in-app)           Mobile Traffic  0.69%
Chrome                    Mobile Traffic  0.62%
Safari (in-app)           Tablet Traffic  0.89%
Chrome                    Tablet Traffic  0.84%
Opera                     Desktop only    0.10%
Android Browser           Tablet Traffic  0.71%
Mozilla Compatible Agent  Mobile Traffic  0.51%
Quote the opportunity!
IE8 = 1.41% revenue
IE8 converts at 20% of IE9 and IE10
Some nasty bugs cause the problem
Worth fixing?
That 1.41% problem is worth nearly 6% more checkouts!
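The arithmetic behind that quote can be sketched as follows; it assumes IE8's revenue share would scale in proportion if its conversion rate reached parity with IE9/IE10.

```javascript
// If a browser holds `revenueShare` (in %) but converts at only
// `relativeConv` of its healthy peers, fixing it to parity adds
// roughly revenueShare * (1/relativeConv - 1) percentage points.
function upliftFromFix(revenueShare, relativeConv) {
  return revenueShare * (1 / relativeConv - 1);
}

// IE8: 1.41% of revenue, converting at 20% of IE9/IE10:
var extraCheckouts = upliftFromFix(1.41, 0.2); // ≈ 5.64, i.e. "nearly 6%"
```

This is how you quote the opportunity to the e-com director: in checkouts, not in browser bug reports.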
43
#2 : Browser money• You should see something like this:
44
#3 : GA Keyboard Shortcuts
• Thanks to Farid Alhadi and @fastbloke
d t Set date range to TODAY
d y Set date range to YESTERDAY
d w Set date range to LAST WEEK
d m Set date range to LAST MONTH
d c Toggle date comparison mode (to the previous period of whatever you are looking at.Example, if you’re looking at 6 days, this will compare it to the 6 days before it)
d x Toggle date comparison mode (to the previous year of the period you are looking at)
? Open keyboard shortcut help
h Search help center
a Open account panel
shift + a Go to account list
s / Search reports
shift + d Go to the default dashboard of the current profile
45
#4 : Ranking data
• Someone searches on Google for 'term'
• Clicks on a link to your site
• What was the keyword rank for that term?
• Reverse engineer the actual rankings from users' machines
• More accurate than some SEO tools (IMHO)
• Two articles to show you how:
http://bit.ly/Vaisno
http://bit.ly/13lmYF2
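The trick those articles describe relied on the `cd=` parameter that Google's search referrer URLs carried at the time, which held the position of the clicked result. A minimal sketch (the referrer URL below is illustrative):

```javascript
// Extract the ranking position from a classic Google search referrer.
// The referrer included cd=<position of the result the user clicked>.
function rankFromReferrer(referrer) {
  var match = /[?&]cd=(\d+)/.exec(referrer);
  return match ? parseInt(match[1], 10) : null;
}

rankFromReferrer('http://www.google.com/url?sa=t&source=web&cd=4&q=split+testing'); // → 4
rankFromReferrer('http://www.google.com/url?sa=t&q=split+testing');                 // → null
```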
46
#5 : Content Engagement
• Get a better bounce rate metric
• See more detailed engagement metrics
• Measure scrolling and reading activity
• Came from @fastbloke but originally Justin Cutroni
• Measure your scrolling and exit points from long-form content
• Very nice technique – read more at:
http://bit.ly/13lmYF2
http://goo.gl/1AZZb
47
#6 : In Page Analytics – Enhanced

var _gaq = _gaq || [];
var pluginUrl = '//www.google-analytics.com/plugins/ga/inpage_linkid.js';
_gaq.push(['_require', 'inpage_linkid', pluginUrl]);
_gaq.push(['_setAccount', 'UA-XXXXXX-Y']);
_gaq.push(['_trackPageview']);
• Correct link attribution for in-page Analytics• Very nice
48
#7 : Duplicate transactions in GA
• Thanks to Matt Clarke and @timlb
• Stop skewing the data with duplicates/reloads
• Custom report to check if you're affected: http://techpad.co.uk/content.php?sid=247
• I've seen this in a few places, so it's worth checking, particularly if figures don't tally!
• Read more at: http://bit.ly/13lmYF2
49
#8 : Event tracking
• Thanks to #Measurecamp – check the stream
• A beginner's guide: http://bit.ly/13RFoJs
• Some great ideas here: http://bit.ly/UCcptx
• Don't go for 'Event Blizzard'
• Focus on specific areas where insight is needed
• Choose your naming structure carefully: http://bit.ly/WJ4R4c
• Read this complete guide: http://bit.ly/VmFSJ4
50
#9 : Google API and more
• Use the Google API to get super custom reports
• You can fetch different data types (on the fly as well as pre-calculated)
• Automate a HUGE CHUNK of Excel work
• @timlb recorded the #MeasureCamp session:
• Deck: www.youtube.com/watch?v=JWXg1_4quwU
• Roundup: www.measurecamp.org/aftermath/
51
#10 : Microdata / Rich Snippets
• http://www.seoskeptic.com/beyond-rich-snippets-semantic-web-technologies-for-better-seo/
• http://www.slideshare.net/ismepete/maximising-your-serp-potential-enhance-your-listings-with-rich-snippets

This is funnel step ZERO – work it!
52
#10 : Microdata – SERPS UX
• Reviews – huge increases in CTR and conversion
• People (authors)
• Products
• Businesses and organisations
• Recipes
• Events
• Music
• Local
• Video

Helps to:
• Dominate the page
• Push other stuff down
• Make it more persuasive
• The conversion journey starts here!
53
#11 : Measure viewport size
Thanks to @Beantin and others!

• Measure the viewport size, not the resolution
• Why? Toolbars, chrome and setup vary
• The UK 2011 figure was 2.2 toolbars
• Code example here: http://bit.ly/4xaNYK
• A common conversion issue
• Your desk vs. users = different
• Turn off the wifi, reduce the viewport
• The budget restroom solution
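The measurement itself is a couple of browser properties; here it's written as a pure function of `window`/`document` (my own naming, not the linked code example) so the fallback chain for older browsers is visible, and you'd push the result as a GA event or custom variable.

```javascript
// Viewport (not screen resolution): what the visitor can actually see
// after toolbars and browser chrome have taken their slice.
// Pure function of window/document so it can be exercised with stub
// objects; in a page you'd call getViewport(window, document).
function getViewport(win, doc) {
  var el = doc.documentElement || {};
  var body = doc.body || {};
  return {
    width: win.innerWidth || el.clientWidth || body.clientWidth || 0,
    height: win.innerHeight || el.clientHeight || body.clientHeight || 0
  };
}
```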
54
Best Practice?
• There is no such thing as a 'readily repeatable best practice' in conversion optimisation
• The button colour example
• There are patterns! – but the context varies
• The answer is always "it depends" ;-)
• It starts with your customers, your site, your data, your insights – not an article online!
• It starts and ends with customer knowledge – that's best practice!
55
Top Conversion Questions
• 32 questions, picked by practitioners
• Being recorded on ScreenR.com
• What top stuff did I hear this week?

"How long will my test take?"
"When should I check the results?"
"How do I know if it's ready?"
"What happens if a test fails?"
"What sort of QA testing should I do?"
56
#1 How long will a test take?
• The minimum length
– 2 business cycles
– Always test 'whole', not partial, cycles
– Usually a week, 2 weeks or a month
– Be aware of multiple cycles
• How long after that?
– IMHO you'll need a minimum of 250 outcomes, ideally 350, for each 'creative'
– If you test 4 recipes, that's 1400 outcomes
– Make a note of your minimum 'length' for 350 outcomes
– If you segment, you'll need more data
– It may take longer than that if the response rates are similar*
– Work out how long it might take (or how long you can afford it to take)
http://visualwebsiteoptimizer.com/ab-split-test-duration/
* Stats geeks know I’m glossing over something here. That test time depends on how the two experiments separate in terms of relative performance as well as how volatile the test response is. I’ll talk about this when I record this one! This is why testing similar stuff sux.
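The rule of thumb above (250-350 outcomes per creative, whole business cycles only, minimum two) can be sketched as a back-of-envelope calculation; per the footnote, real duration also depends on how the variations separate, so use a proper calculator like the VWO one linked for the statistics.

```javascript
// Rough minimum test length: outcomes needed per creative times number
// of creatives, divided by outcomes per day, rounded UP to whole
// business cycles (never declare mid-cycle), with a 2-cycle floor.
// A sketch, not a power calculation.
function minTestDays(creatives, outcomesPerCreative, dailyOutcomes, cycleDays) {
  var totalOutcomes = creatives * outcomesPerCreative;
  var cycles = Math.ceil(totalOutcomes / dailyOutcomes / cycleDays);
  return Math.max(2, cycles) * cycleDays;
}

// 4 recipes * 350 outcomes = 1400; at 60 outcomes/day, weekly cycles:
minTestDays(4, 350, 60, 7); // → 28 days (four whole weeks)
```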
57
#2 – Are we there yet? Early test stages…
• Ignore the graphs. Don't draw conclusions. Don't dance. Calm down.
• Get a feel for the test, but don't do anything yet!
• Remember – in A/B, 50% of returning visitors will see a new shiny website!
• Until your test has had at least 1 business cycle and 250-350 outcomes, don't bother drawing conclusions or getting excited!
• You're looking for anything that looks really odd – your analytics person should be checking all the figures until you're satisfied
• All tests move around or show big swings early in the testing cycle. Here is a very high traffic site – it still takes 10 days to start settling. Lower traffic sites will stretch this period further.
58
#3 – What happens when a test flips on me?
• Something like this can happen:
• Check your sample size. If it’s still small, then expect this until the test settles.• If the test does genuinely flip – and quite severely – then something has changed with
the traffic mix, the customer base or your advertising. Maybe the PPC budget ran out? Seriously!
• To analyse a flipped test, you’ll need to check your segmented data. This is why you have a split testing package AND an analytics system.
• The segmented data will help you to identify the source of the shift in response to your test. I rarely get a flipped one and it’s always something changing on me, without being told. The heartless bastards.
59
#4 – What happens if a test is still moving around?
• There are three reasons it is moving around– Your sample size (outcomes) is still too small– The external traffic mix, customers or reaction has
suddenly changed or – Your inbound marketing driven traffic mix is
completely volatile (very rare)
• Check the sample size• Check all your marketing activity• Check the instrumentation• If no reason, check segmentation
60
#5 – How do I know when it’s ready?
• The hallmarks of a cooked test are:– It’s done at least 1 or 2 (preferred) cycles– You have at least 250-350 outcomes for each recipe– It’s not moving around hugely at creative or segment level
performance– The test results are clear – even if the precise values are not– The intervals are not overlapping (much)– If a test is still moving around, you need to investigate– Always declare on a business cycle boundary – not the middle of
a period (this introduces bias)– Don’t declare in the middle of a limited time period advertising
campaign (e.g. TV, print, online)– Always test before and after large marketing campaigns (one
week on, one week off)
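The 'intervals not overlapping (much)' check can be sketched with a normal-approximation 95% interval per recipe (reasonable at the 250+ outcome sizes discussed; 1.96 is the usual 95% z-value, and the figures below are illustrative).

```javascript
// 95% confidence interval for a conversion rate (normal approximation)
// plus a crude overlap check between two recipes' intervals.
function convInterval(conversions, visitors) {
  var p = conversions / visitors;
  var halfWidth = 1.96 * Math.sqrt(p * (1 - p) / visitors);
  return { low: p - halfWidth, high: p + halfWidth };
}

function intervalsOverlap(a, b) {
  return a.low <= b.high && b.low <= a.high;
}

var control = convInterval(300, 10000); // 3.0% ± ~0.33%
var variant = convInterval(380, 10000); // 3.8% ± ~0.37%
intervalsOverlap(control, variant);     // → false: clearly separated
```

If the intervals still overlap heavily after your minimum cycles and outcomes, that's a signal to keep investigating rather than declare.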
61
#6 – What happens if it’s inconclusive?
• Analyse the segmentation• One or more segments may be over and under• They may be cancelling out – the average is a lie• The segment level performance will help you
(beware of small sample sizes)• If you genuinely have a test which failed to move any
segments, it’s a crap test• This usually happens when it isn’t bold or brave
enough in shifting away from the original design, particularly on lower traffic sites
• Get testing again!
62
#7 – What QA testing should I do?
• Cross Browser Testing• Testing from several locations (office, home, elsewhere)• Testing the IP filtering is set up• Test tags are firing correctly (analytics and the test tool)• Test as a repeat visitor and check session timeouts• Cross check figures from 2+ sources • Monitor closely from launch, recheck
63
#8 – What happens if it fails?
• Learn from the failure• If you can’t learn from the failure, you’ve designed a crap test. Next
time you design, imagine all your stuff failing. What would you do? If you don’t know or you’re not sure, get it changed so that a negative becomes useful.
• So : failure itself at a creative or variable level should tell you something.
• On a failed test, always analyse the segmentation• One or more segments will be over and under• Check for varied performance• Now add the failure info to your Knowledge Base:• Look at it carefully – what does the failure tell you? Which element do
you think drove the failure?• If you know what failed (e.g. making the price bigger) then you have
very useful information• You turned the handle the wrong way• Now brainstorm a new test
64
#9 – Should I run an A/A test first?
• No – and this is why:– It’s a waste of time– It’s easier to test and monitor instead– You are eating into test time– Also applies to A/A/B/B testing– A/B/A running at 25%/50%/25% is the best
• Read my post here :http://bit.ly/WcI9EZ
65
#10 – What is a good conversion rate?
Higher than the one you had last month!
66
Twitter: @OptimiseOrDie
LinkedIn: linkd.in/pvrg14
Slideshare: slidesha.re/nlCDm6
More reading: slides and resources on slideshare.net
67
RESOURCE PACK
68
CRO and Testing resources• 101 Landing page tips : slidesha.re/8OnBRh • 544 Optimisation tips : bit.ly/8mkWOB• 108 Optimisation tips : bit.ly/3Z6GrP• 32 CRO tips : bit.ly/4BZjcW• 57 CRO books : bit.ly/dDjDRJ• CRO article list : bit.ly/nEUgui• Smashing Mag article : bit.ly/8X2fLk
69
Optimisation Maturity Model – testing focus, culture, process, analytics focus, insight methods and mission at each level:

Level 1 – Starter Level
• Ad hoc; local heroes; chaotic good
• Guessing, A/B testing, basic tools
• Insight: analytics, surveys, contact centre, low-budget usability
• Outline process; small team; low-hanging fruit
• Mission: get buy-in

Level 2 – Early maturity
• Dedicated team; volume opportunities
• + Multivariate, session replay, no segments
• + Regular usability testing/research; prototyping, session replay, onsite feedback
• + Funnel optimisation, call tracking, some segments, micro testing
• Analytics: bounce rates, big-volume landing pages
• Mission: prove ROI

Level 3 – Serious testing
• Cross-silo team; systematic tests
• + Funnel analysis; low-converting & high-loss pages
• + Offline integration; single-channel picture
• + Funnel fixes, forms analytics, channel switches
• + User Centred Design, layered feedback, mini product tests
• Mission: scale the testing

Level 4 – Core business value
• Well developed, streamlined process
• + Cross-channel testing; integrated CRO and analytics; segmentation
• + Spread tool use
• + Customer sat scores tied to UX; rapid iterative testing and design
• Mission: mine value

Level 5 – You rock, awesomely
• Ninja team; testing in the DNA; company-wide
• Dynamic adaptive targeting, machine learning, realtime
• Multichannel funnels; cross-channel synergy
• + All-channel view of customer; driving offline using online
• All promotion driven by testing
• Mission: continual improvement
70
HOMEWORK 1• I’d like you to look at how unconscious action is part of your life every week. • Several times a day, you’ll use a door. There are many different interfaces for doors like
handles, knobs, buttons, push plates, levers and more. • We all go through every day using these things and don’t consciously think about what
we’re doing. You’ll use them at work, at home, when you travel, shop or use the loo! • There are several things you’ll spot if you keep a door diary for a few days. Let’s give you
3 weeks to finish, to give you time to fit this in. Here is your work: 1. Explore.
See how many different types of door interface you can spot. Take pictures of them and add notes on your phone or using an app. Take notes if you like with a notepad and pencil. Photographs are especially useful for showing examples.
2. Patterns and Groups. Do these door interfaces have a pattern? Do they fit into groups? What would you call these groups?
71
HOMEWORK 23. The Furnishings. Take a look at the door furniture and signs:
• Is there any other stuff apart from the handle that you look at?• What signs or stuff are plastered on the door? • Are there any messages telling you stuff?
4. Error with the door. What happens when it goes wrong?
This is really hard to catch but if you keep trying for a week, you’ll spot a few. What happens when the door interface goes wrong? You get it the wrong way, curse to yourself and then do something different. What do you notice about when this happens?
5. What caused it?
What was it, when it all went wrong, that led you to ‘get the door wrong’ and have to try again. What went wrong that didn’t happen with all the other doors? If you watch the door, does it happen to other people?
6. SummarySo keep a log if you can (scribbled notes, mobile phone app, photos) and look at the five things I’ve listed. There might be more stuff than I’m hinting at so observe closely.
• How many different ‘kinds’ of door interface can you spot?• Are there groups of them – similar kinds? What would you call these?• Catch yourself when it goes wrong• Watch other people when it goes wrong• Why did it go wrong
72
Collecting the Evidence
Apps
• https://itunes.apple.com/au/app/this-is-note-calendar-+-photoalbums/id403746123?mt=8
• https://itunes.apple.com/us/app/awesome-note-+to-do-calendar/id320203391?mt=8
• http://www.blurb.com/mobile

Inbox or stream based
• http://www.memonic.com/tour#web-clipper
• https://launch.unifiedinbox.com/
73
END SLIDES
74
BONUS DECKHope you find this useful – a small bonus here with some slides about conversion optimisation methodologies and how you should try to structure your approach.
75
#2 CRO Project Styles
76
What’s the problem?• #1 User Experience and Conversion
Optimisation are not a checkbox or a step in the process – it needs to be integrated in everything you do.
• #2 This work isn’t a one off exercise either – it’s an ongoing continuous improvement process – like Kaizen
• #3 Usability testing isn’t enough – other UX factors like the visceral and behavioural emotional responses we have to products need tuning too.
• #4 It’s not just about the user!• #5 It’s usually Self Centred Design
driven by Ego, Opinion, Assumption
77
78
Also, the dial won’t turn anymore
With thanks to @morys
PPC SEO
79
Why is this happening?• PPC changes• Advertising models flattening – i.e. mobile Google costs• Comprehensive SEO changes• Competition increasing• Fleetness of foot – Asos in Australia• Entry costs are lower now• New entrants compete without the cruft• Startups are using better ‘build and optimise’ methodologies
than nearly all corporates• The old way of doing things is going to die• The new way of doing things is your only survival ticket
80
So what do people do?• They throw tools at the problem• They try usability testing and research• They generate more data to look at• They make changes without measuring or testing• They hope to randomly create the optimal system• They get an expensive agency to help them• They push their team harder, like galley slaves• They experiment with riskier advertising models• They wonder why they’re burning rubber• They try more random things• Then they call a CRO person and say
“We’ve tried everything. It isn’t working. Help!”
81
Skinner’s Pigeon Experiment• Participants invited into a room with objects• Told to score 100 points within 30 minutes• Participants moved objects around, made
noise, jumped around, tried anything to make a counter increase the points score.
• They got horribly confused• They then created convincing lies for
themselves, to explain what they thought was working. They became superstitious and made rituals.
• The points allocation was made randomly by a goldfish, swimming back and forward in a tank.
• Know any marketing departments like this?• I’ve seen this a lot – and it paralyses companies• We need a better way. A methodology?
82
83
Lean UX
Positive– Lightweight and very fast methods– Realtime or rapid improvements– Documentation light, value high– Low on wastage and frippery– Fast time to market, then optimise– Allows you to pivot into new areas
Negative– Often needs user test feedback to
steer the development, as data not enough
– Bosses distrust stuff where the outcome isn’t known
“The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles.”
84
Agile UX / UCD / Collaborative Design

Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when using Agile iterations)

Negative
– Without quant data, user goals can drive the show – missing the business sweet spot
– Some people find it hard to integrate with siloed teams
– Doesn't work with waterfall IMHO

Cycle: Concept → Research → Wireframe → Prototype → Test → Analyse

"An integration of User Experience Design and Agile* Software Development Methodologies"
*Sometimes
85
CRO
86
Lean Conversion Optimisation
Positive– A blend of several techniques– Multiple sources of Qual and Quant data aids triangulation– CRO analytics focus drives unearned value inside all
products
Negative– Needs a one team approach with a strong PM who is a
Polymath (Commercial, Analytics, UX, Technical)– Only works if your teams can take the pace – you might be
surprised though!
“A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation.”
87
Lean CRO
Cycle: Inspection – Immersion – Identify – Triage & Triangulate – Outcome Streams – Instrument – Measure – Learn
88
Triage and Triangulation
• Starts with the analytics data• Then UX and user journey walkthrough from SERPS -> key paths• Then back to analytics data for a whole range of reports:• Segmented reporting, Traffic sources, Device viewport and
browser, Platform (tablet, mobile, desktop) and many more• We use other tools or insight sources to help form hypotheses• We triangulate with other data where possible• We estimate the potential uplift of fixing/improving something
as well as the difficulty (time/resource/complexity/risk)• A simple quadrant shows the value clusters• We then WORK the highest and easiest scores by…• Turning every opportunity spotted into an OUTCOME
“This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift.”
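The prioritisation step can be sketched as a simple quadrant filter; the 1-10 scores, thresholds and issue names below are illustrative, not a real client backlog.

```javascript
// Quadrant triage sketch: keep the "high estimated uplift, low
// difficulty" cluster and work it in order of (uplift - difficulty).
// Scores and thresholds are illustrative.
function triage(opportunities, minUplift, maxDifficulty) {
  return opportunities
    .filter(function (o) {
      return o.uplift >= minUplift && o.difficulty <= maxDifficulty;
    })
    .sort(function (a, b) {
      return (b.uplift - b.difficulty) - (a.uplift - a.difficulty);
    });
}

var quickWins = triage([
  { issue: 'Broken IE8 checkout', uplift: 9, difficulty: 3 },
  { issue: 'Full redesign', uplift: 8, difficulty: 9 },
  { issue: 'Form error copy', uplift: 6, difficulty: 2 }
], 5, 4);
// → 'Broken IE8 checkout' first, then 'Form error copy'; the redesign is filtered out
```

Every item that survives the filter then becomes an OUTCOME and is streamed into one of the buckets below.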
89
The Bucket Methodology
"Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost."

Test – If there is an obvious opportunity to shift behaviour, expose insight or increase conversion, this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.

Instrument – If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling in the analytics configuration. We instrument both structurally and for insight in the pain points we've found.

Hypothesise – This is where we've found a page, widget or process that's just not working well, but we don't see a clear single solution. Since we need to really shift the behaviour at this crux point, we'll brainstorm hypotheses. Driven by evidence and data, we'll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.

Just Do It – JFDI is a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Stuff in here requires low effort or is a micro-opportunity to increase conversion, and should be fixed.

Investigate – You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
90
How is it working out?• Methodologies are not Real Life ™• It’s mainly about the mindset of the team and
managers, not the tools or methodologies they play with
• Not all my clients have all the working parts• You should not be a methodology slave• Feel free to make your own or flexibly adapt• Use some, any techniques instead of ‘guessing’• Blending lean and agile with conversion optimisation
outcomes is my critical learning of the last 5 years• Doing rapid cycles of this outcome driven work for
Belron:• World Conversion Rate Increase:
2009 +5%, 2010 +10%, 2011 +15%, 2012 +25%
• If you’d like to develop a good one for your company, talk to me first!
• Don’t over complicate it.