Rand Fishkin's presentation on the application of skepticism in web marketing.
Why Great Marketers Must Be Great Skeptics | Rand Fishkin, Wizard of Moz | @randfish | email@example.com
This Presentation Is Online Here: bit.ly/mozskeptics
Defining Great Skepticism
I have some depressing news
Does anyone in this room believe that the Earth doesn't revolve around the Sun?
The Earth (and everything in the solar system, including the Sun) revolves around our system's gravitational barycenter, which is only sometimes near the center of the Sun.
Let's try a more marketing-centric example...
In 2009, Conversion Rate Experts built us a new landing page and increased our subscribers by nearly 25%. What did they do? Via CRE's Case Study
One of the most commonly cited facts about CRE's work is the long landing page.
The Crap Skeptic: "Let's change our landing page to be a long one right now!"
The Good Skeptic: "We should A/B test a long landing page in our conversion funnel."
The Great Skeptic: "How do we know page length was responsible? What else changed?"
The Crap Skeptic: "I do believe, sadly, it's going to take some diseases coming back to realize that we need to change and develop vaccines that are safe."
The Good Skeptic: "Listen, all magic is scientific principles presented like 'mystical hoodoo,' which is fun, but it's sort of irresponsible."
The Great Skeptic: "The good thing about science is that it's true whether or not you believe in it."
In fact, we've changed our landing pages numerous times to shorter versions and seen equal success. Length, it would seem, was not the primary factor in this page's success.
What separates the crap, good, & great?
The Crap Skeptic: Assumes one belief-reinforcing data point is evidence enough. Doesn't question what's truly causal vs. merely correlated. Doesn't seek to validate.
The Good Skeptic: Doesn't make assumptions about why a result occurred. Knows that correlation isn't necessarily causal. Validates assumptions w/ data.
The Great Skeptic: Seeks to discover the reasons underlying the results. Knows that correlation doesn't imply causality. Thoroughly validates, but doesn't let imperfect knowledge stop progress.
Testing: Will more conversion tests lead to better results?
Obviously the more tests we run, the better we can optimize our pages. We need to build a culture of testing around here.
Via Wordstream's What is a Good Conversion Rate?
Via Wordstream's What is a Good Conversion Rate? Do Those Who Test More Really Perform Better?
Hmm... There's no correlation between those who run more tests across more pages and those who have higher conversion rates. Maybe the number of tests isn't the right goal.
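The "no correlation" claim on this slide is the kind of thing a great skeptic checks with a number rather than an eyeball. A minimal sketch of computing a Pearson correlation coefficient, using made-up survey rows (the deck does not include Wordstream's raw data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical survey rows: (tests run per month, conversion rate %)
tests_run = [1, 4, 2, 10, 6, 3, 8, 5]
conv_rate = [2.0, 3.0, 1.5, 2.5, 1.8, 2.8, 2.2, 2.6]

r = pearson_r(tests_run, conv_rate)
# |r| well below ~0.3 suggests no meaningful linear relationship
print(f"r = {r:.2f}")
```

A value of r near zero is consistent with the slide's conclusion: running more tests, by itself, doesn't track with higher conversion rates.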
Via Factors That Drive How Quickly You Can Run New Online Tests
The Conversion Decision (it's a complex process) is shaped by many factors: Trust, Word of Mouth, Likability, Design, Associations, Amount of Pain, CTAs, UX, Effort Required, Process, Historical Experiences, Social Proof, Copywriting, Timing, Discovery Path, Branding, Price.
How do we know where our conversion problems lie?
Ask Smart Questions to the Right People:
Potential Customers Who Didn't Buy (professional, demographic, & psychographic characteristics): What objections did you have to buying? What would have made you overcome them?
Those Who Tried/Bought But Didn't Love It (professional, demographic, & psychographic characteristics): What objections did you have; how did you overcome them? What would have made you stay/love the product?
Customers Who Bought & Loved It (professional, demographic, & psychographic characteristics): What objections did you overcome, and how? What do you love most? Can we share?
We can start by targeting the right kinds of customers. Trying to please everyone is a recipe for disaster.
Our tests should be focused around overcoming the objections of the people who best match our customer profiles.
Testing button colors
Testing headlines, copy, visuals, & form fields
Designing for how customers think about their problems & your solution
Security: Does telling users we encrypt data scare them?
Via Visual Website Optimizer Could this actually HURT conversion?
Via Visual Website Optimizer A/B Test Results: They found that the version without the secure icon had over 400% improvement in conversions compared to the version with the image. [Note: results ARE statistically significant]
We need to remove the security messages on our site ASAP!
We should test this.
Is this the most meaningful test we can perform right now? (I'm not saying it isn't, just that we should prioritize intelligently)
Via Kayak's Most Interesting A/B Test vs.
Via Kayak's Most Interesting A/B Test A/B Test Results: "So we decided to do our own experiment about this, and we actually found the opposite: when we removed the messaging, people tended to book less." - Vinayak Ranade, Director of Engineering for Mobile, KAYAK
Good thing we tested! Good thing we tested! Your evidence is no match for my ignorance!
Social CTR: What should we expect from sharing our content on social media?
Just find the average social CTRs and then try to match them or do better. No-brainer.
Via Signup.to's Analysis of CTR on Twitter
306/701 = 43.6%... WTF??
Phew! We're not alone. Via Chartbeat
Assuming social metrics and engagement correlate was a flawed assumption. We need to find a better way to measure and improve social sharing.
OK. We can create some benchmarks based on these numbers and their averages, then work to improve them over time.
That is an insane amount of variability!
There are other factors at work here. We need to understand them before we can create smart metrics or useful expectations.
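The great skeptic's point about variability can be made concrete: when per-post CTRs swing wildly, the arithmetic mean is dominated by outliers and makes a poor benchmark. A sketch with made-up per-tweet CTRs (the deck doesn't publish Signup.to's underlying numbers):

```python
import statistics

# Hypothetical per-tweet click-through rates (%), illustrative only
ctrs = [0.4, 1.2, 0.1, 43.6, 2.5, 0.8, 7.9, 0.2]

mean = statistics.mean(ctrs)
median = statistics.median(ctrs)
stdev = statistics.stdev(ctrs)

# One viral outlier drags the mean far above the typical tweet,
# so a single "average CTR" benchmark tells you almost nothing.
print(f"mean={mean:.1f}%  median={median:.1f}%  stdev={stdev:.1f}%")
```

With a spread like this, the mean sits several times above the median and the standard deviation exceeds the mean itself, which is exactly why the slide argues for understanding the underlying factors before setting benchmark expectations.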