
What do you mean, “What do I mean?” continued...

Lecture 10-1

November 30th, 1999

CS250


Steps in Building

• Decide what to talk about

• Decide on a vocabulary

• Encode general rules

• Encode an instance

• Pose queries


General Ontologies

• Categories

• Measures

• Composite Objects

• Time, Space and Change

• Events and Processes

• Physical Objects

• Substances

• Mental Objects and Beliefs


Categories

• Categories

• Reification – How many people live on Earth?

• Inheritance

• Creating taxonomies
  – Kentucky Fried Chicken
  – Dewey decimal
  – LoC
  – MeSH


Measures

• Examples: Height, mass, cost

• Measure = Units function + a Number
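For instance (illustrative sentences in the usual measure notation; the object name and numbers are made up, not from the deck):

Length(Zucchini7) = Inches(8)

∀d Centimeters(2.54 × d) = Inches(d)

The units function (Inches, Centimeters) turns a plain number into a measure, and axioms like the second one relate different units for the same quantity.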



Composite Objects

• Not inheritance
  – Difference between subclass and member

• General event descriptions
  – Schema
  – Script


Using Events to Represent Change

• What's the problem?
  – Continuous time
  – Multiple agents
  – Actions of different durations

• Event calculus - Reify events


Event Calculus Vocabulary

• Events are splotches in the space-time continuum

• Events have subevents

• Some events are intervals


Examples

• Suppose we wish to represent facts about market manias

∀f  f ∈ BulbEating ∧ SubEvent(f, TulipMania) ⇒ PartOf(Location(f), Holland)

∀s  s ∈ StockFrenzy ∧ SubEvent(s, USBullMarket) ⇒ PartOf(Location(s), ??)

∀s  s ∈ StockFrenzy ∧ SubEvent(s, USBullMarket) ⇒ TradedOn(Exchange(s), NASDAQ)


Place

• How are places like intervals?

• Relation In holds among places

• Location function: Maps an object to the smallest place that contains it


Processes

• Why do we need processes when we have events?

• How can we say:
  – Barry Sonnenfeld was flying some time yesterday
  – Barry was flying all day yesterday

E(Flying(Barry), Yesterday) – flying at some time during yesterday

T(Flying(Barry), Yesterday) – flying throughout yesterday

Kurt D. Fenstermacher: Sonnenfeld directed Men in Black (1997), Get Shorty (1995), and The Addams Family (1991).


A Logical Blender

Suppose Bill is accused of killing a zucchini, and when the cold, but efficient, Detective Frigerator (known to his pals as simply “Re”) questions the orange juice pitcher in FOPL, the orange juice has no idea how to say:

“Bill was in the kitchen with the tomato all day yesterday”


Composite Events

• Use And to combine two events with the usual semantics:

∀p, q, e  T(And(p, q), e) ⇔ T(p, e) ∧ T(q, e)

• And isn't so bad, but disjunction is a bit more complicated -- how do we say:

"I saw the whole thing, the beef or the broccoli stabbed the zucchini all afternoon."


Time & Intervals

• Time is pretty important
  – Divvy up time into: Moments and ExtendedIntervals
  – Define a couple of handy functions:
    • Start
    • End
    • Time
    • Date


When Intervals Get Together

• Meet

• Before

• After

• During

• Overlap
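One common way to pin these relations down with the Start, End, and Time functions from the previous slide (standard definitions, sketched here rather than copied from the deck):

Meet(i, j) ⇔ Time(End(i)) = Time(Start(j))

Before(i, j) ⇔ Time(End(i)) < Time(Start(j))

After(j, i) ⇔ Before(i, j)

During(i, j) ⇔ Time(Start(j)) ≤ Time(Start(i)) ∧ Time(End(i)) ≤ Time(End(j))

Overlap(i, j) ⇔ ∃k During(k, i) ∧ During(k, j)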


Objects in the Space-Time Continuum

• Remember that events are splotches of space-time

• Some events have coherence through time

• Need to capture the idea of an object existing through time


Roman Empire

• The Roman Empire spread across much of Europe, North Africa, and the Near East, expanding and contracting, from 753 B.C. until the 5th century A.D.


Maps: the Roman Empire at 218 B.C., at 117 A.D., and at 395 A.D.

Fluents

• Roman Empire is an event
  – Subevents include:
    • First, Second and Third Punic Wars
    • One of the first known hammer-and-anvil movements in battle (216 BC at Cannae)

• A fluent allows us to capture the notion of the Roman Empire throughout time

T(Male(Emperor(RomanEmpire)), 1stCenturyAD)

T(In(Gaul, RomanEmpire), AD12)


Fluent Flavors

• A fluent is a function f: Situations → Fvalues
  – The domain is the set of all situations (states of the world)

If Fvalues is {TRUE, FALSE}, then it's a propositional fluent

If Fvalues is the set of all situations, then it's a situational fluent


Substances

• Less vs. fewer

• Intrinsic vs. extrinsic properties

• Substances are those things that are fungible


Going, Like, Totally Mental

• What do other agents know, and what are they thinking?
  – "Everybody's looking at me"
  – "They're trying to kill me"
  – "You look like someone who knows where I can find extra virgin olive oil"

• Start with a Believes predicate: Believes(Agent, x)


Reification & You

• A good first pass: treat Flies(Superman) as a propositional fluent and write

Believes(Agent, Flies(Superman))

• Relationships like Believes, Knows, and Wants between agents and propositions are propositional attitudes

• The problem: Can Clark fly?
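Sketching why this is a problem (the equality below is the usual comic-book assumption, added here only for illustration): suppose the KB also contains Superman = Clark. A referentially transparent language would let us substitute equals for equals inside Believes, so from

Believes(Agent, Flies(Superman)) and Superman = Clark

we could conclude Believes(Agent, Flies(Clark)). But the agent may believe that Superman flies without believing that Clark does, so Believes must be referentially opaque in its second argument.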


“It is clear.”

• Referential transparency
  – Any term can be substituted for an equal term
  – FOL is referentially transparent


Knowing for Action

• Knowing preconditions: What do you need to know to do action a?

• Knowledge effects: What effect does performing action a have on an agent’s knowledge?


Replacing that Zucchini

• Grocery shopping
  – Percepts
  – Actions
  – Goals
  – Environment


You say you wanna resolution?


Chain of Fools

• Forward chaining
  – Start with known sentences and apply Generalized Modus Ponens (GMP) to derive new conclusions
  – Good when adding new facts

• Backward chaining
  – Start from the goal and look for premises that support it
  – Got goal?



Forward Chaining

• Renaming
  – Two sentences are renamings of one another if they are the same except for variable names

for each rule with a premise that p unifies with
  if the rule's other premises are known then
    add the conclusion to the KB
    keep on chainin'


Composition

• Define COMPOSE(θ1, θ2) to apply two substitutions in a row:

SUBST(COMPOSE(θ1, θ2), p) = SUBST(θ2, SUBST(θ1, p))
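A quick worked instance (the substitutions are chosen only for illustration):

With θ1 = {x/John} and θ2 = {y/Elizabeth},

SUBST(COMPOSE(θ1, θ2), Knows(x, y)) = Knows(John, Elizabeth) = SUBST(θ2, SUBST(θ1, Knows(x, y)))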


Forward Chaining in Action

1) American(x) ∧ Weapon(y) ∧ Nation(z) ∧ Hostile(z) ∧ Sells(x, z, y) ⇒ Criminal(x)
2) Owns(Nono, x) ∧ Missile(x) ⇒ Sells(West, Nono, x)
3) Missile(x) ⇒ Weapon(x)
4) Enemy(x, America) ⇒ Hostile(x)

ForwardChain(KB, American(West))
ForwardChain(KB, Nation(Nono))
ForwardChain(KB, Enemy(Nono, America))
ForwardChain(KB, Hostile(Nono))
ForwardChain(KB, Owns(Nono, M1))
ForwardChain(KB, Missile(M1))
ForwardChain(KB, Sells(West, Nono, M1))
ForwardChain(KB, Weapon(M1))
ForwardChain(KB, Criminal(West))


What’s the Problem?

• Willy-nilly inferencing: new facts get derived whether or not they bear on the goal


Backward Chaining

• Start from what you’re trying to prove, and look for support

• When a query q is asked:

if a matching fact q' is known
  return the unifier
for each rule whose consequent q' matches q
  attempt to prove each premise of the rule by backward chaining


Revisiting Unification

• Can we unify: Knows(John, x) and Knows(x, Elizabeth)?
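Not as written: x would have to be both John and Elizabeth. After standardizing apart (renaming the variable in one sentence, say to Knows(z, Elizabeth)), unification succeeds with {x/Elizabeth, z/John}. Below is a minimal Common Lisp sketch of such a unifier, assuming the PAIP-style convention that variables are symbols starting with "?"; the function names and representation are illustrative, not the course's own code, and the occurs check is omitted.

(defun variable-p (x)
  "A variable is a symbol whose name starts with ?, e.g. ?x."
  (and (symbolp x) (char= (char (symbol-name x) 0) #\?)))

(defun unify (x y &optional (bindings '()))
  "Return an alist of bindings that makes X and Y equal, or :FAIL."
  (cond ((eq bindings :fail) :fail)
        ((eql x y) bindings)
        ((variable-p x) (unify-variable x y bindings))
        ((variable-p y) (unify-variable y x bindings))
        ((and (consp x) (consp y))
         (unify (rest x) (rest y)
                (unify (first x) (first y) bindings)))
        (t :fail)))

(defun unify-variable (var val bindings)
  "Bind VAR to VAL, respecting any binding VAR already has."
  (let ((known (assoc var bindings)))
    (if known
        (unify (cdr known) val bindings)
        (cons (cons var val) bindings))))

;; (unify '(Knows John ?x) '(Knows ?x Elizabeth))  => :FAIL
;; (unify '(Knows John ?x) '(Knows ?z Elizabeth))  => ((?X . ELIZABETH) (?Z . JOHN))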


Now what’s wrong?

• Is this complete?

• Inference procedure i is complete iff KB ⊢i α whenever KB ⊨ α

PhD(x) ⇒ HighlyQualified(x)

¬PhD(x) ⇒ EarlyEarnings(x)

HighlyQualified(x) ⇒ Rich(x)

EarlyEarnings(x) ⇒ Rich(x)

Rich(x) is entailed for every x (whether or not x has a PhD), yet Generalized Modus Ponens cannot derive it: the second sentence is not a Horn clause, and GMP cannot reason by cases.


Does a Complete Algorithm Exist?

• Kurt says yes
  – Any sentence that is entailed by another set of sentences can be proved from that set
  – In other words: we can find a complete inference procedure

• What is it?


Resolution

• Remember Chapter 6?

From α ∨ β and ¬β ∨ γ, conclude α ∨ γ

(equivalently, from ¬α ⇒ β and β ⇒ γ, conclude ¬α ⇒ γ)

• Is this an improvement?


Resolution Procedure

• Resolution is a refutation procedure: To prove KB ⊨ α, show that KB ∧ ¬α is unsatisfiable


Canonical Forms

• CNF
  – Start with a bunch of disjunctions
  – Pretend all of them are joined with one big conjunct

• INF
  – Each sentence is an implication with a conjunction of atoms on the left and a disjunction of atoms on the right
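For instance (illustrative sentences, not from the deck):

CNF: ¬P(x) ∨ Q(x) – a disjunction of literals

INF: P(x) ∧ R(x) ⇒ Q(x) ∨ S(x) – atoms conjoined on the left, atoms disjoined on the right

The two forms are interchangeable; the CNF clause ¬P(x) ∨ Q(x) is the same sentence as the INF implication P(x) ⇒ Q(x).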


Out of the Frying Pan?

• Created GMP, needed Horn clauses
  – But can't always transform sentences into Horn clauses!
  – Find another procedure

• Stumble upon resolution, which needs CNF or INF
  – Can we always transform into CNF or INF?


CNF vs. Horn

• The diff
  – In Horn, the RHS must be an atom
  – In CNF, the RHS is a disjunction

• MP can derive atomic conclusions; what about resolution?
  – Recast terms as implications of TRUE


Conversion to CNF

• Can convert any FOL KB into CNF
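The standard conversion sequence, summarized here as a sketch of the usual textbook procedure:

1) Eliminate implications: rewrite α ⇒ β as ¬α ∨ β
2) Move ¬ inwards (De Morgan's laws; ¬∀x p becomes ∃x ¬p, and ¬∃x p becomes ∀x ¬p)
3) Standardize variables apart so each quantifier uses a distinct variable
4) Skolemize (next slide)
5) Drop the universal quantifiers
6) Distribute ∨ over ∧ to leave a conjunction of disjunctions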


Skolemization

• Remove existential quantifiers by elimination
  – Like Existential Elimination (EE), but more general

• Replace existentially quantified variables with unique constants
  – What happens if there's a universal quantification hiding inside?
  – Example: Everyone has a heart
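Spelling out the heart example (the standard treatment; the Skolem function name HeartOf is arbitrary):

∀x Person(x) ⇒ ∃y Heart(y) ∧ Has(x, y)

Replacing y with a single constant H1 gives ∀x Person(x) ⇒ Heart(H1) ∧ Has(x, H1), which wrongly says everyone shares one heart. Because ∃y lies inside the scope of ∀x, y must become a Skolem function of x:

∀x Person(x) ⇒ Heart(HeartOf(x)) ∧ Has(x, HeartOf(x))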


Resolution Proof

• To prove α:
  – Negate it: ¬α
  – Convert it to CNF
  – Add it to a CNF KB
  – Infer a contradiction


Da Proof
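A hedged sketch of how the refutation can run for the crime example from the forward-chaining slide (the clause numbering and step ordering here are mine):

Clauses after conversion to CNF:
C1: ¬American(x) ∨ ¬Weapon(y) ∨ ¬Nation(z) ∨ ¬Hostile(z) ∨ ¬Sells(x, z, y) ∨ Criminal(x)
C2: ¬Owns(Nono, x) ∨ ¬Missile(x) ∨ Sells(West, Nono, x)
C3: ¬Missile(x) ∨ Weapon(x)
C4: ¬Enemy(x, America) ∨ Hostile(x)
Facts: American(West), Nation(Nono), Enemy(Nono, America), Owns(Nono, M1), Missile(M1)
Negated goal: ¬Criminal(West)

1) Resolve ¬Criminal(West) with C1 using {x/West}: ¬American(West) ∨ ¬Weapon(y) ∨ ¬Nation(z) ∨ ¬Hostile(z) ∨ ¬Sells(West, z, y)
2) Resolve away ¬American(West) with the fact American(West)
3) Resolve ¬Weapon(y) with C3, then the resulting ¬Missile(y) with Missile(M1) using {y/M1}
4) Resolve ¬Sells(West, z, M1) with C2 using {x/M1, z/Nono}; the added ¬Owns(Nono, M1) and ¬Missile(M1) are removed by the facts
5) Resolve ¬Nation(Nono) with Nation(Nono), leaving ¬Hostile(Nono)
6) Resolve ¬Hostile(Nono) with C4 using {x/Nono}, then ¬Enemy(Nono, America) with Enemy(Nono, America): the empty clause

KB ∧ ¬Criminal(West) is therefore unsatisfiable, so Criminal(West) follows.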