Transcript
Page 1: The Ludic Fallacy Applied to Automated Planning

The Ludic Fallacy

SPG

11th Feb 2011

Page 2: The Ludic Fallacy Applied to Automated Planning

• Ludic - Of or pertaining to games of chance

• Fallacy - An argument which seems to be correct but which contains at least one error.

Page 8: The Ludic Fallacy Applied to Automated Planning

Example

• Suppose you flip a coin. What is the chance it comes up heads?

• 50/50

• Suppose you flip the coin 100 times and the first 99 were tails. What is the chance of the final flip giving heads?

• Independent variables, still 50/50.

• ...or is it?

Page 12: The Ludic Fallacy Applied to Automated Planning

Origins

• Originally postulated by Nassim Nicholas Taleb in "The Black Swan".

• Broadly, the ability to describe the outcomes of events gives an impression of control. It does not give ACTUAL control of the events.

• However complex a model is, if it is inaccurate then the inaccuracy is what matters most.

Page 17: The Ludic Fallacy Applied to Automated Planning

"Gambling With the Wrong Dice"

• Case Study based on Las Vegas casino.

• Extensive and sophisticated systems and models to account for potential cheating.

• Aim was to manage risk.

• But the vast majority of losses came from non-gambling activity: a disgruntled ex-employee, an onstage accident, a failure to file the correct paperwork, and a kidnap ransom.

Page 21: The Ludic Fallacy Applied to Automated Planning

Blinded By Probability

• Because we see numbers as solvable, we focus on solving them.

• Lose sight of the broader picture.

• The "game" becomes our main focus rather than the world it represents.

Page 27: The Ludic Fallacy Applied to Automated Planning

Back to Coins

• We flip 99 times, all tails.

• 0.5^99 ≈ 1.6x10^-30

• Which is more likely: that this highly improbable event is happening, or that the assumptions we used to build the model don't hold true?

• Is the coin fair?

• What actually is the probability of getting heads next?
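That last question can be made concrete with a small Bayesian sketch (not from the slides; the function name and the prior on a trick coin are illustrative choices): compare the fair-coin hypothesis against a double-tailed coin after 99 straight tails.

```python
# A minimal Bayesian sketch, assuming two hypotheses only: a fair coin and a
# (hypothetical) double-tailed coin. The prior of one-in-a-million on the
# trick coin is an illustrative assumption, not anything from the slides.

def posterior_fair(n_tails: int, prior_biased: float = 1e-6) -> float:
    """Posterior probability that the coin is fair, given n_tails straight tails."""
    like_fair = 0.5 ** n_tails      # P(data | fair coin)
    like_biased = 1.0               # P(data | double-tailed coin)
    prior_fair = 1.0 - prior_biased
    evidence = like_fair * prior_fair + like_biased * prior_biased
    return like_fair * prior_fair / evidence

p_fair = posterior_fair(99)
print(f"0.5**99        = {0.5 ** 99:.2e}")   # roughly 1.6e-30
print(f"P(fair | data) = {p_fair:.2e}")      # vanishingly small
# The answer to "probability of heads next" then blends the two hypotheses:
p_heads_next = 0.5 * p_fair + 0.0 * (1.0 - p_fair)
print(f"P(heads next)  = {p_heads_next:.2e}")
```

Even with a one-in-a-million prior on a trick coin, the data make the trick-coin hypothesis overwhelmingly more likely; the "still 50/50" answer is an artefact of trusting the model over the world.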

Page 32: The Ludic Fallacy Applied to Automated Planning

Off-model Consequences

• When we have a model, we risk getting blinkered into thinking about the model instead of the world.

• But models are abstract representations.

• No PDDL model describes the effect of a meteorite hitting a robot, yet it is an (unlikely) possibility.

• The outcomes of actions, or of events, cannot be fully enumerated: there exist "off-model consequences".

Page 38: The Ludic Fallacy Applied to Automated Planning

Coins Again

• We talk about coins as having a heads side and a tails side, with a 50/50 chance of either.

• This isn't strictly true - there's a third possibility we don't model:

• Edge

• This is Taleb's "Black Swan": a highly unlikely but theoretically possible event that gets ignored.

• A true Black Swan must also be "high impact".

Page 39: The Ludic Fallacy Applied to Automated Planning

What Am I Driving At?

Page 43: The Ludic Fallacy Applied to Automated Planning

Probabilistic Planning

• PPDDL is a prime example of "doing it wrong".

• It extends PDDL by attaching probabilities to sets of effects: with probability P(X=i) effect I occurs, with probability P(X=j) effect J occurs, and so on.

• Is the world really so cut and dried? Or is this simply shoehorning probabilities into PDDL in the most obvious way possible?
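A minimal Python sketch of that enumerated-effects view (the action and outcome names are hypothetical, and the probabilities illustrative): the listed outcomes exhaust the distribution, so the model reserves zero probability mass for anything nobody wrote down.

```python
import random

# A sketch of the PPDDL-style view of an action: an exhaustive, enumerated
# outcome distribution. All names and numbers here are hypothetical.
PICKUP_OUTCOMES = {
    "holding-block": 0.8,   # gripper succeeds
    "block-dropped": 0.2,   # the one modelled failure mode
    # No entry for "power cut", "gripper jams", "meteorite", ...
}

def sample_outcome(outcomes: dict, rng: random.Random) -> str:
    """Sample one effect set, as a simulator for such a model would."""
    labels, probs = zip(*outcomes.items())
    return rng.choices(labels, weights=probs, k=1)[0]

# The "cut and dried" assumption: the modelled outcomes sum to exactly 1.
assert abs(sum(PICKUP_OUTCOMES.values()) - 1.0) < 1e-9
print(sample_outcome(PICKUP_OUTCOMES, random.Random(0)))
```

Because the weights sum to one, no simulation over this model can ever surface an off-model consequence, however many rollouts are run.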

Page 49: The Ludic Fallacy Applied to Automated Planning

Summary

• Models are typically incomplete.

• Models are frequently wrong.

• Probabilistic models make even more assumptions!

• We allow ourselves to be deceived by numbers into believing we can quantify the unquantifiable.

• As a result, we get bogged down solving a problem that isn't necessarily reflective of the real world.

Page 50: The Ludic Fallacy Applied to Automated Planning

So What Can We Do?

Page 54: The Ludic Fallacy Applied to Automated Planning

Introduce Noise

• Most basic approach is to add noise to probabilistic models.

• If the model has P(x) = 0.2, test generated plans at, say, P(x) = 0.2 ± 0.05.

• This allows a rudimentary "what happens if these values are not spot on?" check.
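The check described above can be sketched in a few lines of Python (all probabilities here are illustrative, and steps are assumed independent): evaluate a plan's success probability at the modelled values and at perturbed ones.

```python
# A minimal sketch of the noise check, assuming a plan whose steps succeed
# independently with the given modelled probabilities (numbers illustrative).

def plan_success_prob(step_probs: list) -> float:
    """Probability that every step of the plan succeeds (independence assumed)."""
    prob = 1.0
    for p in step_probs:
        prob *= p
    return prob

model = [0.9, 0.8, 0.95]   # modelled per-step success probabilities
for delta in (-0.05, 0.0, +0.05):
    # Perturb each probability, clamping to [0, 1].
    perturbed = [min(1.0, max(0.0, p + delta)) for p in model]
    print(f"delta={delta:+.2f}: success prob = {plan_success_prob(perturbed):.3f}")
```

If the plan's success probability collapses under a small delta, the plan was leaning on the model being exactly right.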

Page 59: The Ludic Fallacy Applied to Automated Planning

Epsilon-separation of states

• A similar concept to the epsilon-separation used between actions in temporal planning.

• In this case epsilon denotes a marginal probability of transitioning between any pair of states.

• Still not ideal, but at least captures the possibility of events changing the state in an undetermined way.

• Somewhat analogous to van der Waals forces.
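One way to realise this idea is to give every state-to-state transition a small floor probability and renormalise; a minimal sketch (the epsilon value, state names, and `smooth` helper are all illustrative assumptions, not anything from the slides):

```python
# A sketch of epsilon-separation of states: mix each modelled transition
# distribution with a uniform epsilon floor so that every state remains
# reachable. Epsilon and the state names are illustrative.

EPSILON = 0.01

def smooth(row: dict, states: list, eps: float = EPSILON) -> dict:
    """Mix a modelled transition distribution with a uniform epsilon floor."""
    n = len(states)
    # Scale the modelled mass down so the row still sums to 1 after flooring.
    return {s: (1 - n * eps) * row.get(s, 0.0) + eps for s in states}

states = ["at-depot", "at-site", "broken"]
modelled = {"at-site": 1.0}   # a deterministic "drive" action in the model
smoothed = smooth(modelled, states)
print(smoothed)
# Every state, including "broken", now has probability >= EPSILON.
assert all(p >= EPSILON for p in smoothed.values())
assert abs(sum(smoothed.values()) - 1.0) < 1e-9
```

The smoothed model still cannot say *what* the unmodelled event is, but it at least stops assigning probability zero to the state it leads to.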

Page 63: The Ludic Fallacy Applied to Automated Planning

State Charts

• In the FSM family, State Charts are frequently used to represent interruptible processes, e.g. in embedded systems.

• One process interrupts the other, acts, and then the first can resume from its previous state.

• Can we use this model to capture the consequences of unmodelled events?

Page 68: The Ludic Fallacy Applied to Automated Planning

Abstract / Anonymous Actions

• In Prolog, _ represents the anonymous variable.

• Nothing analogous to this in PDDL.

• Would introducing this give us flexibility to patch plans when off-model events occur?

• Could abstract actions (perhaps based on DTG clusterings) be useful for this?

Page 69: The Ludic Fallacy Applied to Automated Planning

Brainstorm!

