
Cognitive Rehab - David McRaney and John Romano from SXSW 2015



#thoughtlab @johnwromano @davidmcraney

2 - 4 - 6

10 - 12 - 14

24 - 26 - 28

This animates in the presentation, slowly playing out a game. I’m choosing three numbers at a time using one simple rule. What is the rule? Now you pick three numbers with my rule…

1 - 2 - 3

33 - 3,371 - 99,999

Confirmation Bias

3 - 2 - 1 (devpsy.org)

All of these numbers also follow my rule, which was…any three numbers, each bigger than the last. (original game at devpsy.org) If you searched only for answers that confirmed your hypothesis, got that confirmation, and stopped looking, you fell victim to your own confirmation bias.
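To make the game concrete, here is a minimal Python sketch (my own illustration, not from the talk; the function names are hypothetical). It contrasts the actual rule with the narrower "+2 each time" hypothesis most players form, showing why confirming tests alone can never distinguish the two:

```python
def fits_rule(triple):
    """The actual rule: any three numbers, each bigger than the last."""
    a, b, c = triple
    return a < b < c

def fits_add_two(triple):
    """The narrower hypothesis most players form: each number is +2."""
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Testing only triples that fit your hypothesis always "confirms" it...
for t in [(2, 4, 6), (10, 12, 14), (24, 26, 28)]:
    assert fits_add_two(t) and fits_rule(t)

# ...but a triple chosen to challenge the hypothesis separates the two.
print(fits_rule((1, 2, 3)))          # True  - the real rule still holds
print(fits_add_two((1, 2, 3)))       # False - the "+2" hypothesis breaks
print(fits_rule((33, 3371, 99999)))  # True
```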

Logical Fallacies

Cognitive Biases

Mental Heuristics

The Triangle of Delusion - (the pyramid of stupid) - Heuristics lead to biases, and both, as well as the process itself, are defended by fallacies, and the whole triangle is mostly invisible because the brain covers its tracks.

The brain uses heuristics to make assumptions and move on, speeding up judgments and decision making. Sure, you could test every object in your house to see if it is made of chocolate, but it is easier to just assume none of them are. (this slide animates, revealing the doorknob is, indeed, chocolate)

The Blurriness Heuristic

Clarity Bias

The blurriness heuristic: things in the distance are blurry. This leads to a bias, making you assume that anything blurry is far away and anything sharp is close. That speeds up processing of faraway things like mountains so you don’t actually have to measure, but it can lead to problems: if you think a pool is deep because the bottom seems blurry and you dive head first into shallow water…or if you think the cars in front of you in dense fog must be far away…

Snyder and Cantor (1979)

The Jane Study

Introverted / Extroverted

Group 1: Real Estate Agent? Yes! Librarian? No!

Group 2: Librarian? Yes! Real Estate Agent? No!

This slide explains the Jane study. A group of people heard a story about a week in the life of Jane, in which she was extroverted half the time and introverted half the time. Two days later, the group was divided in two. One half was asked if Jane would be a good candidate for a job as a Real Estate Agent. They mostly said yes, searching their memories through a confirmation bias, seeking to prove their hypothesis (yes) correct. When then asked if she would make a good librarian, they said no, remembering the results of their biased searches instead of repeating the search. The other group was asked the questions in reverse order and came to the opposite conclusions using the same bias. Same story, same people, two realities, thanks to their confirmation biases.

Confirmation Bias

When seeking to verify an estimation, assumption, guess, hypothesis, hunch, or belief, you tend to stop your search after receiving confirmation that you were right all along.

In WWII, the U.S. military created a “Department of War Math” to help with statistical calculations.

One of their analyses was a heat map showing where bombers were getting shot the most. Plans were put in place to put armor in those locations, but statisticians ended those plans. They explained that since those planes made it home, those damaged areas were where the planes must actually be the strongest. The missing planes were probably hit where these planes were not, and that’s where the armor should go.
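The statisticians’ logic is easy to demonstrate with a toy simulation. This is a hedged sketch, not the actual WWII analysis: the plane sections, the uniform hit distribution, and the assumption that engine and cockpit hits down a plane are all invented for illustration:

```python
import random

random.seed(0)  # reproducible illustration

SECTIONS = ["engine", "cockpit", "fuselage", "wings", "tail"]
CRITICAL = {"engine", "cockpit"}  # assumption: a hit here downs the plane

returned_damage = {s: 0 for s in SECTIONS}
for _ in range(10_000):
    hit = random.choice(SECTIONS)  # flak strikes every section equally often
    if hit not in CRITICAL:        # only planes hit elsewhere make it home
        returned_damage[hit] += 1

# Returning planes show zero engine and cockpit damage - not because those
# spots are never hit, but because planes hit there never came back.
# Expect roughly 2,000 hits each on fuselage, wings, and tail; 0 elsewhere.
print(returned_damage)
```

Counting damage only on the survivors makes the most vulnerable spots look like the safest ones, which is exactly the inference the armor plans almost made.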

Photo Credits: Billy Hathorn, Dan Smith

Photo Credit: Mike Johnston, theonlinephotographer.com

Frontier log cabins are sometimes considered amazing works of construction to have survived so long. But as Mike Johnston points out, that’s not true. Most log cabins fell over within their first few years. Only the few that were extraordinarily well built, or lucky enough never to face harsh weather, are still around. But…since you can only take a picture of a still-standing log cabin, those cabins are incorrectly assumed to be examples of what ALL cabins from that time were like.

Super Successes

Typical Failures

“The cemetery of failed restaurants is very silent.”

Source: Nassim Taleb “The Black Swan”

Nassim Taleb points out that people often think restaurants are a great business to get into because all the restaurants they see are doing very well. But the restaurants that fail are removed from view, and most restaurants fail within their first few years, leaving behind only the SUPER successful ones. That is what you must be to survive in the restaurant business: a level of success that is uncommon, very hard to achieve, and…mostly luck.

Photo Sources: Wikimedia Commons (Man: Marg; Woman: Peter van der Sluijs)

This is why advice from old people on how to be old isn’t reliable. You might hear: the secret to my long life is a shot of bourbon before every meal, a pack of cigarettes every day, and a bacon sandwich every afternoon. That lifestyle would actually kill most people, but the only people left to give advice are the ones it hasn’t killed, and they aren’t a great representation of people in general, because they were lucky - genetically lucky - yet they attribute their longevity to other factors. Just like…

…advice from the successful. All those magazines profiling successful people and how they got there, the interviews with them, the books offering examples of how to succeed and survive a hostile environment - all of these people are looking backwards, through hindsight bias. This is advice from people for whom everything worked out; they can’t tell you what you shouldn’t do, what you ought not do…

“A stupid decision that works out well becomes a brilliant decision in hindsight.” - Daniel Kahneman, Thinking Fast and Slow

“If you group successes together and look for what makes them similar, the only real answer will be luck.”

That’s why Daniel Kahneman, the great psychologist who won the Nobel Prize in economics, says these things. He asks us to look at the biographies of mega-successful businesses, find the moment they were most uncertain about the future, and see if anyone in the company had any idea how they would get to where they are today - did they know the decisions they were about to make would do what they did? He says when you do that, they never do; you are seeing certainty in hindsight that in the moment was chaos.

Survivorship Bias

The tendency to focus on survivors instead of whatever you would call a non-survivor, depending on the situation:

Living / Dead, Successes / Failures, Winners / Losers

After any process that leaves behind survivors, the non-survivors are often destroyed, muted, or removed from view.

If failures become invisible, then naturally you will pay more attention to successes.

Not only do you fail to recognize that what is missing might have held important information, you fail to recognize there is any missing information at all.

These are all examples of Survivorship Bias

Prentice and Miller (1993)

The Princeton Drinking Study

In this study, students were asked if they enjoyed binge drinking.

(The slide animates: a thought bubble appears over each student reading “Binge drinking sucks.”)

They said no. So the scientists wondered: why was Princeton one of the most hardcore binge-drinking campuses on Earth if everyone privately hated it?


They learned through research that incoming freshmen observed upperclassmen seemingly enjoying binge drinking - a lot - but on the inside, privately, each freshman didn’t like it.


Still, they assumed they must be alone in that belief, and went along with the crowd to not seem like an outsider or lame. They then became the upperclassmen - each one privately opposed to the norm, but displaying support on the outside.


And so…the cycle continues…keeping alive a norm no one supports.

Pluralistic Ignorance

When a group of people are collectively unsure how to act, they hide their ignorance by mimicking each other’s outward behavior.

Common Results:

• Group support for a norm no one actually supports, which can lead to unwanted action.

• Proceeding with confidence when no one is sure of how to proceed, making everyone even more ignorant in the long run.

• Slowdown of social change because no one speaks his or her mind until it is clearly safe to do so.

When a group privately disagrees with a norm but publicly supports it, leading people to incorrectly assume they are alone in their opposition to the majority opinion.

This is called Pluralistic Ignorance

Photo Credits: Ship: Matt H. Wade; Sub: U.S. Navy

If you stood in line for 10 hours for a waterproof iPhone 9, and then you dropped it in the ocean during a cruise, you COULD hire a submarine to go find it for you, so you wouldn’t feel like you wasted all that time and money - or you could just buy a new phone. Framed like that, this “sunk cost” seems easy to figure out. Don’t throw good money after bad.

Arkes and Blumer (1985)

The Ski Trip Study

$100 $50

But framed differently, it becomes more difficult to make a good decision. In the Ski Trip Study, people imagined buying a ticket for a great-deal ski trip to Michigan that cost $100. Then, the next day, they learned about a dream ski trip to Wisconsin for only $50 and bought a ticket for that too. But then they learned the two trips overlapped, the tickets were non-refundable, and they couldn’t be resold. They had to pick one or the other. Most people picked the more expensive trip, even though the less expensive trip would have been more fun.

Arkes and Blumer (1985)

The Ski Trip Study

$100 $50

They couldn’t get back the money - it was gone forever, but to avoid feeling like they had wasted it, they chose to be less happy.

Sunk Cost Fallacy

In order to avoid the psychological pain of loss or waste, people often refuse to “abandon a failing course of action.”

Escalation of Commitment: The more you invest in something, the harder it becomes to abandon it. You feel like you’ve come too far to “waste the resources already expended.”

This is called the Sunk Cost Fallacy, or sometimes the Concorde Fallacy when describing an escalation of commitment like the one experienced by the designers of the famous airplane that was doomed to lose money before it was even finished.

The Extramission Theory of Vision

There are many superseded scientific theories - ways in which we believed the world worked before new evidence eventually overcame our desire to cling to old models of reality - for instance, the ancient Greeks believed a “gentle fire” escaped the eyes and mingled with objects to feel them and tell us what things looked like.

People used to believe geese grew on trees, until the discovery of migration explained why no one could find baby geese in certain regions.

People used to believe tainted meat would magically become flies.

…that piles of dirty rags would become mice.

…that burning logs would turn into salamanders (because salamanders often ran out of fires)

…that the Earth was the center of the universe…

…that all health was the result of a balance of four fluids - black bile, yellow bile, blood, and phlegm…

Theory-Induced Blindness (disconfirmation bias)

“Once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws.” - Daniel Kahneman

“Adherence to a belief about how the world works that prevents you from seeing how it really works.” - Daniel Kahneman

Kahneman says about such models of reality, “If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing. You give the theory the benefit of the doubt, trusting the community of experts who have accepted it.” Instead of simply saying, “This theory must be wrong,” you work to see how the theory can be right in light of challenging information. People also assume that long-held models of operation must be good, otherwise someone would have changed them by now…

It’s important to remember, very smart people for a very long time believed many things that turned out to be completely incorrect. We invented science to escape the shackles of unbounded philosophical speculation.

G.I. Joe Fallacy

Knowing is not half the battle. Knowing that knowing is not half the battle is half the battle.

Laurie Santos at Yale says we should be careful not to fall prey to the G.I. Joe Fallacy - the belief that “knowing is half the battle.” In reality, knowing that knowing is not half the battle is half the battle. Just knowing about these fallacies and biases will not protect you against their effects; you must have a better plan in place for when they inevitably appear.

Adapt Your Process

• Fail faster. Move laterally.
• Continually study your audience.
• Try to disprove your assumption.

Move away from waterfall

Adapt Your Environment and Culture

Create an environment where people:

• play Devil’s advocate
• challenge each other
• understand these cognitive patterns

Adapt Yourself to the Reality of Being Human

Be:
• open to criticism from all people
• open to lateral thinking
• OK with change
• OK with saying that you were wrong
• honest with clients and stakeholders about expectations

David McRaney John Romano

Sources:

Snyder, M. and Cantor, N. (1979), "Testing Hypotheses about Other People: The Use of Historical Knowledge," Journal of Experimental Social Psychology, 15, 330-342

Nisbett, R. E., and Wilson, T. D. (1977), “Telling more than we can know: Verbal reports on mental processes,” Psychological Review, 84, 231-259.

American Statistical Association (1951), “Resolution in Honor of Abraham Wald,” The American Statistician, February 1951: 19.

Tversky, A. and Kahneman, D. (1974), “Judgment under Uncertainty: Heuristics and Biases,” Science, New Series, Vol. 185, No. 4157. pp. 1124-1131.

Taleb, N. N. (2011), The Black Swan: The Impact of the Highly Improbable, Random House. Kindle Edition.

Johnston, M. (2013), “The Trough of No Value,” http://theonlinephotographer.typepad.com/the_online_photographer/2009/02/the-trough-of-no-value.html

Kahneman, D. (2012), Thinking, Fast and Slow, Farrar, Straus and Giroux.

Rees, M. (1980) “The Mathematical Sciences and World War II,” The American Mathematical Monthly October 1980: 607-621.

Wilson, T. D., et al. (1993), “Introspecting About Reasons Can Reduce Post-Choice Satisfaction,” Personality and Social Psychology Bulletin.

Prentice, D. A. and Miller, D. T. (1993) "Pluralistic Ignorance and Alcohol Use on Campus: Some Consequences of Misperceiving the Social Norm," Journal of Personality and Social Psychology, Vol. 64, No. 2. 243-256

Arkes, H. R. and Ayton, P. (1999), “The Sunk Cost and Concorde Effects: Are Humans Less Rational than Lower Animals?” Psychological Bulletin, 125(5), 591-600.

Shroff, A. (2010) “Are You Making Milkshake Mistakes?” http://arunshroff.com/2010/11/08/are-you-making-milkshake-mistakes/

[email protected]

pointsource.com johnwromano.com

@johnwromano

[email protected]

youarenotsosmart.com soundcloud.com/youarenotsosmart

@davidmcraney @notsmartblog