
Conjoint Adaptive Ranking Database System (CARDS)

Ely Dahan

Michael Yee, John Hauser & Jim Orlin

EXPLOR Award Winning Presentation – September 22, 2004

Good, Fast, Cheap and Easy?

The Problem
• Current methods require many questions for few answers
• Respondents must rate products they don’t like
• Simplifying rules to narrow choices not typically captured
• Respondents make mistakes due to fatigue, causing inconsistency
• Is there a better way?

Example: Smart Phone
• Respondent: Alex Bell
• How does Alex choose a smart phone?
Utility Scores: Alex makes tradeoffs
(Slide graphic: five feature cards, Flip, Small, Service, Phone Brand, and Mini Keyboard, with utility scores 10, 5, 7, 6, 9.)
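To make the tradeoff idea concrete, here is a minimal sketch (mine, not from the presentation) of how additive utility scores imply a complete ranking of 32 phones. The pairing of scores to features and the assumption that the 32 phones are all combinations of five present/absent features are read off the slide graphic, not stated by the authors.

```python
# Sketch only: additive utility scores imply a complete ranking of 32 phones.
# The feature-to-score pairing below is an assumption read off the slide.
from itertools import product

part_worths = {          # hypothetical part-worth utilities for each feature
    "Flip": 10,
    "Small": 5,
    "Service": 7,
    "Phone Brand": 6,
    "Mini Keyboard": 9,
}
features = list(part_worths)

# All 2^5 = 32 profiles: each feature is either present (1) or absent (0).
profiles = [dict(zip(features, bits)) for bits in product([0, 1], repeat=5)]

def utility(profile):
    """Alex's tradeoff rule: add up the part-worths of the features present."""
    return sum(part_worths[f] for f, has in profile.items() if has)

ranked = sorted(profiles, key=utility, reverse=True)   # the implied ranking
for rank, p in enumerate(ranked[:3], start=1):
    present = [f for f, has in p.items() if has] or ["(base phone)"]
    print(rank, utility(p), ", ".join(present))
```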

Example: Smart Phone
• Respondent: Alex Bell
• How does Alex choose a smart phone?
Process of Elimination: Focus on key features
(Slide graphic: the same five feature cards, now scored 16, 1, 4, 2, 8; with powers of two, the top feature outweighs all the others combined.)

Consider this tough task: Rank 32 Smart Phones based on your preferences

There are a billion, billion, billion, billion ways for a respondent to rank 32 smartphones!

Are we surprised that respondents become fatigued and make mistakes!?
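For scale (my arithmetic, not a slide): 32 distinct phones can be ordered in 32! ways, about 2.6 × 10^35, which is roughly the "billion, billion, billion, billion" the slide gestures at. A one-line check in Python:

```python
import math

# Number of possible rankings of 32 smartphones: 32 factorial.
print(math.factorial(32))             # 263130836933693530167218012160000000
print(f"{math.factorial(32):.2e}")    # about 2.63e+35
```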

Prior Research on Adaptive Questioning
• Johnson (1987, 1991) & Orme and King (2002), Sawtooth ACA
• Huber and Zwerina (1996), Aggregate utility balance
• Arora and Huber (2001), Aggregate customization
• Sandor and Wedel (2001), Aggregate + prior beliefs
• Louviere, Hensher, and Swait (2000), Aggregate CBC

Prior Research on Fast & Frugal Rules
• Tversky (1969, 1972), lexicographic semi-order, elimination by aspects
• Dawes and Corrigan (1974), unit weights, linear models
• Montgomery and Svenson (1976), 2-stage processing
• Thorngate (1980), efficient decision heuristics
• Shugan (1980), cost of thinking (pairwise comparisons)
• Johnson, Meyer, et al. (1984, 1989), protocol analysis, choice models can fail
• Roberts and Lattin (1991, 1997), two-stage w/greedy
• Gigerenzer and Goldstein (1996), Take the Best & others
• Bettman, Luce, Payne (1996, 1998), Accuracy vs. effort, lexicography
• Martignon and Hoffrage (2002), fast and frugal is robust

Two new ideas:
• IDEA 1: We can now measure Alex’s process of elimination
• IDEA 2: We can help Alex avoid inconsistent answers

IDEA 1: Customer Insight
Respondents may be using a simple process of elimination to narrow choices for consideration.
(Slide graphic: feature cards for Phone Brand, Mini Keyboard, and Flip.)
“I will only consider flip phones, with mini-keyboards, from Blackberry”
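A minimal sketch of that elimination rule in code (my own illustration; the aspect names and data layout are hypothetical, not from CARDS): keep only the phones that pass every must-have screen.

```python
# Illustration only: apply Alex's stated process-of-elimination rule.
# Aspect names and example profiles are hypothetical.
phones = [
    {"form": "flip", "keyboard": "mini", "brand": "Blackberry", "size": "small"},
    {"form": "candybar", "keyboard": "mini", "brand": "Blackberry", "size": "large"},
    {"form": "flip", "keyboard": "full", "brand": "OtherBrand", "size": "small"},
    # ... the rest of the 32-phone design would go here
]

# "I will only consider flip phones, with mini-keyboards, from Blackberry"
must_have = {"form": "flip", "keyboard": "mini", "brand": "Blackberry"}

consideration_set = [
    phone for phone in phones
    if all(phone.get(aspect) == level for aspect, level in must_have.items())
]
print(consideration_set)   # everything else is eliminated before any tradeoffs
```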

IDEA 1: Customer Insight
Respondents may be using a simple process of elimination.
How hard is it to identify each respondent’s simplifying rule?

IDEA 1: How can we identify each respondent’s process of elimination?
• Tougher than it seems, because they may be using one of a huge number of possible rules
• We solved this problem with a new (speedy) computer technique
• We tested our theory and it works!
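The slides do not spell out that computer technique, so the following is only a toy illustration of what "identifying a rule from data" can look like: score each single aspect by how cleanly a "must have this aspect" rule separates the phones a respondent considered from the ones they rejected. A real method has to search combinations and orderings of aspects, which is where the huge number of possible rules comes from; nothing here is the authors' algorithm.

```python
# Toy illustration only, not the authors' algorithm: rank single aspects by
# how well a "must have this aspect" rule explains an observed consideration set.
from itertools import product

features = ["Flip", "Small", "Service", "Phone Brand", "Mini Keyboard"]
profiles = [dict(zip(features, bits)) for bits in product([0, 1], repeat=5)]

# Hypothetical data: suppose Alex only considered phones with Flip AND Mini Keyboard.
considered = [p for p in profiles if p["Flip"] and p["Mini Keyboard"]]
rejected = [p for p in profiles if not (p["Flip"] and p["Mini Keyboard"])]

def fit(aspect):
    """Share of phones the rule 'must have this aspect' classifies correctly."""
    hits = sum(p[aspect] for p in considered) + sum(1 - p[aspect] for p in rejected)
    return hits / len(profiles)

for aspect in sorted(features, key=fit, reverse=True):
    print(f"{aspect:15s} fit = {fit(aspect):.2f}")
```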

The big benefit of identifying respondents’ process of elimination?
• Good: Accuracy, customer insight
• Fast: 1 minute for them, quick for us
• Cheap: Pack more into the same study
• Easy: Reduce drudgery

Process of elimination benefit: Cheap and Fast
(Chart: RankSome took 2 minutes vs. RankAll 7 minutes.)
Could you use 5 extra minutes of survey time?

Benefit: Easy
(Chart: respondents rated RankSome “kind of fun”, “somewhat interesting”, and “about right” in length; RankAll was rated “okay” and “long”.)

How do we know if the process of elimination idea is Good?
• Holdout sample for rankings: Pretty GOOD
• Holdout sample for first choice: Pretty GOOD

IDEA 2: Avoiding inconsistent answers
The consistency criterion, a new approach:
• Reduce response error by “guiding” respondents towards consistent answers
• Each choice must be 100% consistent with at least one set of utility scores

Keeping people consistent: Conjoint Adaptive Ranking Database System (CARDS)
• Show product features
• Click on favorite cards
• Inconsistent cards just “disappear”
• Get utility scores
• Save lots of clicks
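A sketch of how I read that "disappearing cards" mechanic (the implementation details are my assumptions, not the published CARDS code): keep a database of rankings that some set of additive utility scores could produce, and after each click discard every ranking that would not have picked that card next; a card disappears once no surviving ranking could ever pick it.

```python
# Sketch of the "inconsistent cards disappear" mechanic as described on the
# slide; all implementation details here are assumptions, not CARDS itself.
import random
from itertools import product

features = ["Flip", "Small", "Service", "Phone Brand", "Mini Keyboard"]
cards = [tuple(bits) for bits in product([0, 1], repeat=5)]   # 32 cards

def implied_ranking(weights):
    """Ranking of all cards under one set of additive utility scores."""
    score = lambda c: sum(w * x for w, x in zip(weights, c))
    return tuple(sorted(cards, key=score, reverse=True))

# Database of candidate rankings, approximated here by random utility draws.
random.seed(0)
database = {implied_ranking([random.uniform(-1, 1) for _ in features])
            for _ in range(5000)}

remaining = set(cards)
while len(database) > 1 and remaining:
    # Only cards that some surviving ranking would pick next stay visible.
    visible = {min(remaining, key=r.index) for r in database}
    choice = random.choice(sorted(visible))        # simulate the respondent's click
    database = {r for r in database if min(remaining, key=r.index) == choice}
    remaining.discard(choice)
    print(f"clicked {choice}; consistent rankings left: {len(database)}")
```

Because only consistent cards remain clickable, the respondent can never give an answer that contradicts every possible set of utility scores, which is the consistency criterion described on the previous slide.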

IDEA 2: Consistency
How do we keep people consistent?
(Slide graphic: the feature cards Flip, Small, Service, Phone Brand, and Mini Keyboard with utility scores 10, 5, 7, 6, 9.)
Imagine we knew Alex Bell’s utility scores…
We would know how he would rank all 32 phones.

IDEA 2: Consistency
How do we keep people consistent?
Imagine we knew every possible set of scores…
Each set of utility scores is consistent with a unique ranking of all 32 phones.
Surprise: Consistent rankings are a tiny percentage of the possible answers.
Eliminate rankings not on the consistent list.
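To see why consistent rankings are such a tiny slice, here is a Monte Carlo sketch (my own, under the same assumed 5-feature setup as above): draw many random sets of utility scores, collect the rankings they imply, and compare the count of distinct rankings against 32!.

```python
# Sketch under assumed additive utilities: rankings implied by utility scores
# versus all possible rankings of 32 phones.
import math
import random
from itertools import product

cards = list(product([0, 1], repeat=5))   # 32 phone profiles

def implied_ranking(weights):
    score = lambda c: sum(w * x for w, x in zip(weights, c))
    return tuple(sorted(cards, key=score, reverse=True))

random.seed(0)
consistent = {implied_ranking([random.uniform(-1, 1) for _ in range(5)])
              for _ in range(20_000)}

print(f"utility-consistent rankings found: {len(consistent):,}")
print(f"all possible rankings (32!):       {math.factorial(32):.2e}")
# Even a generous sample of score vectors yields a set of rankings that is a
# vanishing fraction of 32!, so most conceivable answers can be eliminated.
```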

The big benefit of keeping respondents consistent?
• Good: Accuracy, consistent answers
• Fast: Minutes for them, quick for us
• Cheap: Pack more into the same study
• Easy: 50% to 75% effort reduction

Consistency benefit: Easy
Consistency reduced effort 73%!
Without consistency: 17 cards. With consistency: 7 cards.

Extra benefits of consistency
• Scalable
• Utility scores as you go
• Emphasizes likes
• Measures uncertainty

How do we know if consistency is Good?
• Holdout sample for rankings: Pretty GOOD
• Holdout sample for first choice: Just OKAY

Key Takeaways:
• Good: Predictive; customer insight
• Fast: For them and for us
• Cheap: Pack more into one study
• Easy: Reduce drudgery & mistakes

Thank you for this exciting award!

[email protected]

Good, Fast, Cheap, Easy demonstrations:
http://orc-pumba.mit.edu/~myee/CARDS/conjoint.php (use email: nodebug7@)
http://wow.mit.edu