Artificial Intelligence Slides

A handy outline of some Artificial Intelligence concepts.

By V V L Divakar Allavarapu

Asst. Professor, GITAM University, Visakhapatnam

UNIT II: Predicate Logic

Quantifiers

There are two types of quantifiers

1. Universal quantifier (∀, pronounced "for all")

2. Existential quantifier (∃, pronounced "there exists")

Universal Quantifier

1. All kings are persons: ∀x: king(x) → person(x)

2. All people are literate: ∀x: person(x) → literate(x)

3. All men are people: ∀x: man(x) → person(x)

4. All Pompeians were Romans: ∀x: Pompeian(x) → Roman(x)

Existential Quantifier (∃)

1. There is some person who wrote games: ∃x: person(x) ∧ wrote(x, games)

2. There is a person who wrote chess: ∃x: person(x) ∧ wrote(x, chess)

(Note: with ∃, a conjunction ∧ is used rather than →.)

3. Everyone is loyal to someone: ∀x: ∃y: loyalto(x, y)
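Quantified formulas like these can be checked mechanically over a small finite domain. The sketch below (the sets and the `wrote` relation are invented for illustration) uses Python's `all` and `any` as ∀ and ∃.

```python
# Hypothetical toy domain: sets stand in for predicates.
people = {"Marcus", "Caesar"}
king = {"Caesar"}
person = people                      # everyone in this domain is a person
wrote = {("Marcus", "chess")}        # invented fact for illustration

# For all x: king(x) -> person(x): all() with an implication
all_kings_are_persons = all((x not in king) or (x in person) for x in people)

# There exists x: person(x) and wrote(x, chess): any() with a conjunction
someone_wrote_chess = any(x in person and (x, "chess") in wrote for x in people)

print(all_kings_are_persons, someone_wrote_chess)  # True True
```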

Predicate Sentences

1. Marcus was a man: man(Marcus)

2. Marcus was a Pompeian: Pompeian(Marcus)

3. All Pompeians were Romans: ∀x: Pompeian(x) → Roman(x)

4. Caesar was a ruler: ruler(Caesar)

Predicate Sentences

5. All Romans were either loyal to Caesar or hated him.

∀x: Roman(x) → loyalto(x, Caesar) v hate(x, Caesar)

6. Everyone is loyal to someone: ∀x: ∃y: loyalto(x, y)

Predicate Sentences

7. People only try to assassinate rulers they are not loyal to.
∀x: ∀y: person(x) & ruler(y) & tryassassinate(x, y) → ~loyalto(x, y)

8. Marcus tried to assassinate Caesar.
tryassassinate(Marcus, Caesar)

9. All men are people.
∀x: man(x) → person(x)

Predicate Sentences

Answer the Question

Was Marcus loyal to Caesar? We need to prove either

~loyalto(Marcus, Caesar)

or

loyalto(Marcus, Caesar)

Predicate Sentences

1. Marcus was a man (1)
2. All men are people (9)
3. Conclusion: Marcus was a person
4. Marcus tried to assassinate Caesar (8)
5. Caesar was a ruler (4)
6. People only try to assassinate rulers they are not loyal to (7)

Conclude from (3, 4 and 5): Marcus was not loyal to Caesar.

Predicate Sentences

man(Marcus)                                                        (Predicate 1)
    | (Predicate 9)
person(Marcus)
    | (Predicate 8)
person(Marcus) & tryassassinate(Marcus, Caesar)
    | (Predicate 4)
person(Marcus) & ruler(Caesar) & tryassassinate(Marcus, Caesar)
    | (Predicate 7)
~loyalto(Marcus, Caesar)
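The derivation above can be mimicked by a naive forward chainer. The string encoding of facts and rules below is our own, not part of the slides; rules fire whenever all their premises are present.

```python
# Facts and rules as plain strings (illustrative encoding only).
facts = {"man(Marcus)", "ruler(Caesar)", "tryassassinate(Marcus,Caesar)"}
rules = [
    ({"man(Marcus)"}, "person(Marcus)"),                               # axiom 9
    ({"person(Marcus)", "ruler(Caesar)", "tryassassinate(Marcus,Caesar)"},
     "~loyalto(Marcus,Caesar)"),                                       # axiom 7
]

changed = True
while changed:                        # apply rules until nothing new is derived
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("~loyalto(Marcus,Caesar)" in facts)  # True
```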

instance and isa Relationship

Knowledge can be represented as classes, objects, attributes and Super class and sub class relationships.

Knowledge can be inferred using property inheritance, in which elements of specific classes inherit the attributes and values of their superclasses.

instance and isa Relationship

Attribute instance is used to represent the relationship “Class membership ” (element of the class)

Attribute isa is used to represent the relationship “Class inclusion” (super class, sub class relationship)

instance and isa Relationship

1. man(Marcus)
2. Pompeian(Marcus)
3. ∀x: Pompeian(x) → Roman(x)
4. ruler(Caesar)
5. ∀x: Roman(x) → loyalto(x, Caesar) v hate(x, Caesar)

Using instance Attribute

1. instance(Marcus, man)
2. instance(Marcus, Pompeian)
3. ∀x: instance(x, Pompeian) → instance(x, Roman)
4. instance(Caesar, ruler)
5. ∀x: instance(x, Roman) → loyalto(x, Caesar) v hate(x, Caesar)

Using isa Attribute

1. instance(Marcus, man)
2. instance(Marcus, Pompeian)
3. isa(Pompeian, Roman)
4. instance(Caesar, ruler)
5. ∀x: instance(x, Roman) → loyalto(x, Caesar) v hate(x, Caesar)
6. ∀x: ∀y: ∀z: instance(x, y) & isa(y, z) → instance(x, z)
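Predicate 6, the instance/isa transitivity rule, can be sketched as a closure computation over the isa links. The helper name `classes_of` is hypothetical.

```python
# instance and isa facts as pairs; classes_of closes over isa links,
# implementing instance(x,y) & isa(y,z) -> instance(x,z).
instance = {("Marcus", "Pompeian"), ("Caesar", "Ruler")}
isa = {("Pompeian", "Roman")}

def classes_of(x):
    found = {c for (i, c) in instance if i == x}    # direct class membership
    frontier = set(found)
    while frontier:                                 # follow isa links upward
        frontier = {sup for (sub, sup) in isa if sub in frontier} - found
        found |= frontier
    return found

print(sorted(classes_of("Marcus")))  # ['Pompeian', 'Roman']
```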

Computable functions and Predicates

Computational predicates such as less-than and greater-than are used in knowledge representation. They return true or false for their inputs.

Examples of computable predicates:

gt(1, 0) or lt(0, 1); gt(5, 4) or gt(4, 5)

Computable functions: gt(2+4, 5)

cont...

1. Marcus was a man: man(Marcus)

2. Marcus was a Pompeian: Pompeian(Marcus)

3. Marcus was born in 40 A.D.: born(Marcus, 40)

4. All men are mortal: ∀x: man(x) → mortal(x)

5. All Pompeians died when the volcano erupted in 79 A.D.: erupted(volcano, 79) & ∀x: Pompeian(x) → died(x, 79)

cont...

6. No mortal lives longer than 150 years:
∀x: ∀t1: ∀t2: mortal(x) & born(x, t1) & gt(t2 - t1, 150) → dead(x, t2)

7. It is now 1991: now = 1991

8. Alive means not dead:
∀x: ∀t: [alive(x, t) → ~dead(x, t)] & [~dead(x, t) → alive(x, t)]

9. If someone dies, then he is dead at all later times:
∀x: ∀t1: ∀t2: died(x, t1) & gt(t2, t1) → dead(x, t2)

cont...

1. man(Marcus)
2. Pompeian(Marcus)
3. born(Marcus, 40)
4. ∀x: man(x) → mortal(x)
5. erupted(volcano, 79)
6. ∀x: Pompeian(x) → died(x, 79)
7. ∀x: ∀t1: ∀t2: mortal(x) & born(x, t1) & gt(t2 - t1, 150) → dead(x, t2)
8. now = 1991
9. ∀x: ∀t: [alive(x, t) → ~dead(x, t)] & [~dead(x, t) → alive(x, t)]
10. ∀x: ∀t1: ∀t2: died(x, t1) & gt(t2, t1) → dead(x, t2)

cont...

Is Marcus alive?

cont...

~alive(Marcus, now)                | (9, substitution)
dead(Marcus, now)                  | (10, substitution)
Pompeian(Marcus) & gt(now, t1)     | (6, substitution)
Pompeian(Marcus) & gt(now, 79)     | (2)
gt(now, 79)                        | (8, substitute equals)
gt(1991, 79)                       | True

cont...

Disadvantage:

Many steps are required to prove simple conclusions.

A variety of processes, such as matching and substitution, are used to prove simple conclusions.

Resolution

Resolution is a proof procedure by refutation. To prove a statement, resolution attempts to show that the negation of that statement leads to a contradiction.

Conversion to Conjunctive Normal Form

All Romans who know Marcus either hate Caesar or think that anyone who hates anyone is crazy.
∀x: [Roman(x) & know(x, Marcus)] → [hate(x, Caesar) v (∀y: ∃z: hate(y, z) → thinkcrazy(x, y))]

CNF equivalent: ~Roman(x) v ~know(x, Marcus) v hate(x, Caesar) v ~hate(y, z) v thinkcrazy(x, y)

Algorithm : converting to CNF

1. Eliminate →, using a → b = ~a v b

∀x: ~[Roman(x) & know(x, Marcus)] v [hate(x, Caesar) v (∀y: ~(∃z: hate(y, z)) v thinkcrazy(x, y))]

CNF...

2. Reduce the scope of ~, using ~(~p) = p, ~(a & b) = ~a v ~b, ~(a v b) = ~a & ~b

∀x: [~Roman(x) v ~know(x, Marcus)] v [hate(x, Caesar) v (∀y: ∀z: ~hate(y, z) v thinkcrazy(x, y))]

CNF...

3. Make each quantifier bind a unique variable

∀x: P(x) v ∀x: Q(x)

becomes, by renaming variables,

∀x: P(x) v ∀y: Q(y)

CNF...

4. Move all quantifiers to the left of the formula

∀x: ∀y: ∀z: [~Roman(x) v ~know(x, Marcus)] v [hate(x, Caesar) v (~hate(y, z) v thinkcrazy(x, y))]

CNF...

5. Eliminate existential quantifiers (∃) by substituting a Skolem function of the enclosing universally quantified variables:

∃y: president(y) becomes president(F1)

∀x: ∃y: father-of(y, x) becomes ∀x: father-of(F(x), x)

CNF...

6. Drop the quantifier prefix:
[~Roman(x) v ~know(x, Marcus)] v [hate(x, Caesar) v (~hate(y, z) v thinkcrazy(x, y))]

7. Convert the statement into a conjunction of disjuncts:
(a & b) v c = (a v c) & (b v c)
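Step 7, distributing v over &, can be sketched on a tiny tuple-based formula representation (our own encoding, for illustration only):

```python
# Formulas as nested tuples: ('and', l, r), ('or', l, r), or a literal string.
def distribute(f):
    """Push 'or' inside 'and': (a & b) v c  =>  (a v c) & (b v c)."""
    if isinstance(f, str):
        return f
    op, l, r = f
    l, r = distribute(l), distribute(r)
    if op == 'or' and isinstance(l, tuple) and l[0] == 'and':
        return ('and', distribute(('or', l[1], r)), distribute(('or', l[2], r)))
    if op == 'or' and isinstance(r, tuple) and r[0] == 'and':
        return ('and', distribute(('or', l, r[1])), distribute(('or', l, r[2])))
    return (op, l, r)

print(distribute(('or', ('and', 'a', 'b'), 'c')))
# ('and', ('or', 'a', 'c'), ('or', 'b', 'c'))
```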

Resolution in Propositional Logic

Given axioms:
1. P
2. (P & Q) → R
3. (S v T) → Q
4. T

Step 1: Convert all axioms into clause form
1. P
2. ~P v ~Q v R
3. (a) ~S v Q  (b) ~T v Q
4. T

Propositional Resolution

Step 2: Negate the proposition we want to prove and add it to the existing clauses

Example: from the above, we want to prove R, so we add ~R to the clauses.

Propositional Resolution...

Step 3: Select clauses and resolve them, trying to show that the assumption ~R leads to a contradiction.

(~R) resolved with (~P v ~Q v R) [clause 2] gives: ~P v ~Q

(~P v ~Q) resolved with P [clause 1] gives: ~Q

(~Q) resolved with (~T v Q) [clause 3(b)] gives: ~T

(~T) resolved with T [clause 4] gives the empty clause: Contradiction
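The refutation above can be mechanized. In this sketch, clauses are frozensets of (atom, polarity) literals, and we saturate with binary resolution until the empty clause appears.

```python
def resolvents(c1, c2):
    """All binary resolvents of two clauses (frozensets of (atom, polarity))."""
    out = []
    for atom, pol in c1:
        if (atom, not pol) in c2:                   # complementary pair found
            out.append((c1 - {(atom, pol)}) | (c2 - {(atom, not pol)}))
    return out

clauses = {
    frozenset({("P", True)}),                               # 1. P
    frozenset({("P", False), ("Q", False), ("R", True)}),   # 2. ~P v ~Q v R
    frozenset({("S", False), ("Q", True)}),                 # 3a. ~S v Q
    frozenset({("T", False), ("Q", True)}),                 # 3b. ~T v Q
    frozenset({("T", True)}),                               # 4. T
    frozenset({("R", False)}),                              # negated goal: ~R
}

pool = set(clauses)
changed = True
while changed and frozenset() not in pool:   # stop once the empty clause appears
    changed = False
    for c1 in list(pool):
        for c2 in list(pool):
            for res in resolvents(c1, c2):
                if res not in pool:
                    pool.add(res)
                    changed = True

print(frozenset() in pool)  # True: contradiction found, so R is proved
```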

Unification

Unification is the process of finding substitutions that make different logical expressions look identical

Proposition Logic: R & ~R

Predicate Logic: man(Marcus) & ~man(Marcus)

man(Marcus) & ~man(Spot)

Cont…

The solution to this problem is matching and substitution.

Example: Unify

P(x, x) and P(y, z)

Here x, y, z are variables.
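A compact sketch of Robinson-style unification (the occur check is omitted for brevity). In this encoding, lowercase strings are variables and compound terms are tuples.

```python
def is_var(t):
    return isinstance(t, str) and t[:1].islower()   # lowercase names are variables

def walk(t, subst):
    while is_var(t) and t in subst:                 # follow the substitution chain
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution making a and b identical, or None on failure."""
    subst = {} if subst is None else subst
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b) and a[0] == b[0]:
        for x, y in zip(a[1:], b[1:]):              # unify arguments pairwise
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(("P", "x", "x"), ("P", "y", "z")))  # {'x': 'y', 'y': 'z'}
```

The resulting substitution chains x to y and y to z, so all three variables denote the same term.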

UNIT III: Symbolic Reasoning Under Uncertainty

Non-Monotonic Reasoning

Example

•ABC Murder Story: Let Abbott (A), Babbitt (B), and Cabot (C) be suspects in a murder case.

•1. A has an alibi, in the register of a respectable hotel.
•2. B also has an alibi, for his brother-in-law testified that B was visiting him at that time.
•3. C pleads an alibi too, claiming to have been watching a live match at the ground (but we have only his word for it).

Example...

•So We can believe:

•1. That A did not commit the crime •2. That B did not commit the crime•3. That A or B or C did

•Conclusion ?

Example...

•But C has been caught on live television. So the new belief is:

•4. That C did not commit the crime.

Monotonic Reasoning

•1. It is complete with respect to the domain of interest.
•2. It is consistent.
•3. Knowledge increases monotonically as new facts are added.
• Ex: KB1 = KBL
• KB2 = KBL ∪ F (F is some set of facts)
• then KB1 is a subset of KB2

Non Monotonic Reasoning

•1. It may not be complete; it allows inferences to be made on the basis of a lack of knowledge.

•2. It can be inconsistent.

•3. Knowledge may decrease when new facts are added.

Approaches

•Approaches to handle these problems

•1. Non Monotonic Reasoning (Belief)

•2. Statistical Reasoning (Certainty)

Logics for Non Monotonic Reasoning

Deferent Reasonings

•1. Default Reasoning• a) Non Monotonic Logic(NML)• b) Default Logic(DL)

•2. Minimalist Reasoning• a) Closed World Assumption (CWA)

Non Monotonic Logic

•This is predicate logic augmented with the modal operator M, which can be read as

•"is consistent"

NML Example

•∀x: ∀y: Related(x, y) & M GetAlong(x, y) → WillDefend(x, y)

•For all x and y: if x and y are related, and if the fact that x gets along with y is consistent with everything else that is believed, then conclude that x will defend y.

NML Example...

•1. ∀x: Republican(x) & M ~Pacifist(x) → ~Pacifist(x)
•2. ∀x: Quaker(x) & M Pacifist(x) → Pacifist(x)
•3. Republican(Marcus)
•4. Quaker(Marcus)

Default Logic

•It is an alternative nonmonotonic logic. In it, rules are represented in the form

• A : M B / C

•If A is provable and it is consistent to assume B, then conclude C.
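The rule A : M B / C can be sketched as: conclude C when A is believed and ~B is not believed. The string encoding below is our own, for illustration.

```python
# A default rule A : M B / C fires when A is believed and ~B is not believed.
def apply_default(beliefs, a, b, c):
    if a in beliefs and ("~" + b) not in beliefs:
        return beliefs | {c}
    return beliefs

rule = ("Quaker(Marcus)", "Pacifist(Marcus)", "Pacifist(Marcus)")

beliefs = {"Quaker(Marcus)"}
print(apply_default(beliefs, *rule))             # Pacifist(Marcus) is concluded

blocked = {"Quaker(Marcus)", "~Pacifist(Marcus)"}
print(apply_default(blocked, *rule))             # rule is blocked: no change
```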

Abductive Reasoning

•Deduction:
• ∀x: A(x) → B(x)
• Given A(Marcus), we conclude B(Marcus)

•Abduction is the reverse process: given B(Marcus), we conclude A(Marcus).

•But this conclusion is sometimes wrong.

Closed World Assumption(CWA)

•It is a simple kind of minimalist reasoning.

•Courses offered: CS 101, CS 203, CS 503

•How many courses will be offered?

•Answer ?

•or ?

CWA...

•The answer may be anywhere from one to infinity.

•The reason is that the course assertions do not deny that unmentioned courses are also offered (incomplete information).

•Courses are assumed to be different from each other.

CWA...

•The assumption is that the provided information is complete.

•So statements not asserted to be true are assumed to be false.

•Example: airline KB application. Is there any flight from Vskp to Hyd? ~Connect(Vskp, Hyd) is asserted when we cannot prove Connect(Vskp, Hyd).
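A minimal sketch of the CWA, with an invented flight relation: a connection that is not derivable from the KB is taken to be false.

```python
# Invented flight data for illustration; only asserted connections exist.
connect = {("Vskp", "Chennai"), ("Chennai", "Hyd")}

def cwa_connect(a, b):
    """CWA: failure to prove Connect(a, b) is treated as ~Connect(a, b)."""
    return (a, b) in connect

print(cwa_connect("Vskp", "Chennai"))  # True: asserted
print(cwa_connect("Vskp", "Hyd"))      # False: ~Connect(Vskp, Hyd) is assumed
```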

Implementation Issues

•1. How should knowledge be updated incrementally?
•2. Many facts are eliminated when new knowledge becomes available. How should this be managed?
•3. The theories are not computationally effective.

•These issues can be handled by search control. Depth-first search? Breadth-first search?

Depth First Search

Chronological Backtracking

It is depth-first search with backtracking.

It makes a guess at something, thus creating a branch in the search space.

If the guess turns out to be wrong, we back up to that point and try an alternative, discarding everything derived after the guess.

Example

We need to know the fact 'F', which can be derived by making some assumption 'A'. From 'F' we also derive additional facts 'G' and 'H'. Later we derive new facts 'M' and 'N', which are independent of 'A' and 'F'.

Example...

(Diagram: assumption A supports F; F supports G and H; M and N stand apart, independent of A and F.)

Example...

At some point a new fact invalidates 'A'.

Chronological backtracking then invalidates all of F, G, H, M, and N, even though M and N do not depend on the assumption.

Example 2

•Problem: Finding a time at which three busy people can all attend a meeting.

•Assumption: The meeting is held on Wednesday.

•Found a fact: All are free at 2:00.

•So choose 2:00 as the meeting time.

Example...

Assume day = Wed.

After many steps, conclude that the only time all people are available is 2:00 PM, and decide on 2:00 PM.

Try to find a room: FAIL (a special conference has all the rooms booked on Wednesday).

Backtracking discards the time choice, so the same time-finding process is repeated and, for the same reasons, again decides on 2:00 PM.

Only when the day assumption itself is revised does the search SUCCEED.

Problem

•Because guesses are undone in the order the search process generated them, rather than by their responsibility for the inconsistency, we may waste a great deal of effort.

Dependency Directed Backtracking

It makes a guess at something and associates with each node in the search space one or more justifications.

Two Approaches for Dependency Directed Backtracking

Justification-based Truth Maintenance Systems (JTMS)

Logic-based Truth Maintenance Systems (LTMS)

Justification based Truth Maintenance Systems (JTMS)

JTMS...

A JTMS provides dependency-directed backtracking and so supports nonmonotonic reasoning.

Example: ABC Murder Story

Initially we believe that A is the primary suspect, because he was a beneficiary and he had no alibi.

contd...

•Using Default Logic:

•Beneficiary(x): M ~Alibi(x) / Suspect(x)

Dependency Network

Suspect A [IN]

Beneficiary A (+, IN-list)        Alibi A (-, OUT-list)

Abbott should be a suspect when it is believed that he is a beneficiary and it is not believed that he has an alibi.

Dependency Network...

•There are three assertions:

•1. Suspect A (A is the primary murder suspect)

•2. Beneficiary A (A is a beneficiary of the victim)

•3. Alibi A (A was at a hotel at the time)

Dependency Network...

Suspect A [OUT]

Beneficiary A [IN] (+)        Alibi A [IN] (+)

Abbott should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.

Alibi A is justified by: Registered A [IN] (+), Far Away [IN] (+), Register Forged A [OUT] (-).

Dependency Network...

Suspect B [OUT]

Beneficiary B [IN] (+)        Alibi B [IN] (+)

B should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.

Alibi B is justified by: Says So B-I-L [IN] (+), Lies B-I-L [OUT] (-).

Dependency Network...

Suspect C [IN]

Beneficiary C [IN] (+)        Alibi C [OUT] (-)

C should be a suspect when it is believed that he is a beneficiary and it is not believed that he has an alibi.

Alibi C depends on: Tells Truth Cabot [OUT] (+).

Dependency Network...

Suspect C [OUT]

Beneficiary C [IN] (+)        Alibi C [IN] (+)

C should not be a suspect once it is believed that he has an alibi.

Alibi C is now justified by: Tells Truth Cabot [IN] (+), which rests on C Seen on TV [IN] (+) and TV Forgery [OUT] (-).

Dependency Network...

Contradiction [IN]

(Diagram: a Contradiction node justified, through - links, by Suspect A, Suspect B, Suspect C, and Suspect Other; it becomes IN when no suspect is believed.)

Logical Based Truth Maintainance System (LTMS)

It is similar to the JTMS, but in a JTMS the nodes in the network are treated as atoms, which assumes no relationships among them except the ones explicitly stated in the justifications.

Example: we could represent both Lies B-I-L and not Lies B-I-L and label both of them IN. No contradiction would be detected automatically.

LTMS...

•In an LTMS such a contradiction is detected automatically; we need not create an explicit contradiction node.

Breadth First Search

Statistical Reasoning

Basic Probability

•1. 0 <= P(a) <= 1

•2. P(a) + P(~a) = 1

•3. P(a v b) = P(a) + P(b) - P(a & b)

Prior Probability

The prior probability is the degree of belief in a proposition in the absence of any other information, e.g. P(A) = 0.3, P(Cavity) = 0.1.

It is used only when no other related information is available.

Conditional Probability

•Once we have obtained some evidence concerning a previously unknown random variable, conditional probabilities should be used.

• P(a|b) = 0.2
• The probability of a given known evidence b

• P(Cavity|Toothache) = 0.8

Conditional Probability...

•Product Rule:

P(a & b)= P(a|b) P(b) P(a & b)= P(b|a) P(a)

•P(a|b)=P(a&b) / P(b)

Bayes Theorem

•Bayes' rule states: the probability of the hypothesis H being true given known observations E is

•P(H|E) = P(H & E) / P(E)

•=> P(H|E) = P(E|H) P(H) / P(E)

•For n events, if P(A1) + P(A2) + ... + P(An) = 1:

•P(Ai|B) = P(B|Ai) P(Ai) / [P(B|A1) P(A1) + ... + P(B|An) P(An)]

Bayes Theorem...

•A doctor knows that a cavity causes the patient to have a toothache, say, 50% of the time. The prior probability that any patient has a toothache is 1/20, and a cavity 1/1000.

•P(Toothache|Cavity) = 0.5
•P(Cavity) = 0.001
•P(Toothache) = 0.05

•Finding P(Cavity|Toothache) = 0.5 * 0.001 / 0.05 = 0.01
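The same computation, verified in code:

```python
p_t_given_c = 0.5    # P(Toothache | Cavity)
p_c = 0.001          # P(Cavity)
p_t = 0.05           # P(Toothache)

p_c_given_t = p_t_given_c * p_c / p_t   # Bayes' rule
print(round(p_c_given_t, 4))            # 0.01
```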

Bayes Network

•S: Sprinkler was on last night
•W: Grass is wet
•R: It rained last night

(Network: Sprinkler -> Wet <- Rain)

P(Wet|Sprinkler) = 0.9    P(Rain|Wet) = 0.7

Bayes Network

(Network: Cloudy -> Sprinkler, Cloudy -> Rain, Sprinkler -> Wet <- Rain)

P(C) = 0.5

C | P(S)      C | P(R)      S R | P(W)
t | .10       t | .80       t t | .99
f | .50       f | .20       t f | .90
                            f t | .90
                            f f | .00
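The CPTs above determine the full joint P(C,S,R,W) = P(C) P(S|C) P(R|C) P(W|S,R). The sketch below marginalizes it to get P(W = true).

```python
from itertools import product

P_C = {True: 0.5, False: 0.5}
P_S = {True: 0.10, False: 0.50}          # P(S=true | C)
P_R = {True: 0.80, False: 0.20}          # P(R=true | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.00}  # P(W=true | S, R)

def joint(c, s, r, w):
    """P(C,S,R,W) factored along the network structure."""
    ps = P_S[c] if s else 1 - P_S[c]
    pr = P_R[c] if r else 1 - P_R[c]
    pw = P_W[(s, r)] if w else 1 - P_W[(s, r)]
    return P_C[c] * ps * pr * pw

# marginalize out C, S, R to get P(W = true)
p_wet = sum(joint(c, s, r, True) for c, s, r in product([True, False], repeat=3))
print(round(p_wet, 4))  # 0.6471
```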

Bayes Theorem

•Disadvantages:

•1. Too many probabilities need to be provided
•2. Space is needed to store all the probabilities
•3. Time is required to compute the probabilities
•4. The theory is good for well-structured situations in which all the data are available and the assumptions are satisfied. Unfortunately these conditions may not hold in reality.

Certainty Factors and

Rule-based systems

cont...

•In the MYCIN expert system each rule is associated with a certainty factor, which is a measure of how strongly the evidence is believed.

•A MYCIN rule looks like:
•If
•1. the stain of the organism is gram-positive, and
•2. the morphology is coccus, and
•3. the growth is in clumps,
•then there is suggestive evidence (0.7) that it is staphylococcus.

Certainty Factor

•Certainty factor CF[h,e] is defined in two components:

MB[h,e] is a measure (between 0 and 1) of belief in hypothesis h given evidence e. MD[h,e] is a measure (between 0 and 1) of disbelief in hypothesis h given evidence e.

•CF[h,e] = MB[h,e] - MD[h,e]

Cont…

• In the MYCIN model, for evidence composed of two pieces e1 and e2 bearing on hypothesis h, the measures of belief and disbelief combine as:

• MB[h, e1&e2] = MB[h,e1] + MB[h,e2] * (1 - MB[h,e1])

• MD[h, e1&e2] = MD[h,e1] + MD[h,e2] * (1 - MD[h,e1])

Cont…

• If MB[h, e1&e2] = 1 and MD[h, e1&e2] = 0, all the evidence (e1 and e2) confirms the hypothesis h.

• If MD[h, e1&e2] = 1 and MB[h, e1&e2] = 0, all the evidence (e1 and e2) disproves the hypothesis h.

Example for CF

• A set of rules r1, r2, ..., r7 gives supporting evidence for the hypothesis.
• h is the conclusion that it is an elephant.

• e1: r1: It has a tail 0.3
• e2: r2: It has a trunk 0.8
• e3: r3: It has a heavy body 0.4
• e4: r4: It has four legs 0.2
• e5: r5: It has black colour 0.1
• e6: r6: It has stripes 0.6
• e7: r7: It has long flat ears 0.6

Cont…

• For rule 1: MB = 0.3 and MD = 0
• Including the effect of rule 2:
• MB = 0.3 + 0.8 * (1 - 0.3) = 0.86 and MD = 0
• Including rule 3:
• MB = 0.86 + 0.4 * (1 - 0.86) = 0.916 and MD = 0
• Including rule 4:
• MB = 0.916 + 0.2 * (1 - 0.916) = 0.9328 and MD = 0

Cont…

• Including rule 5:
• MB = 0.9328 + 0.1 * (1 - 0.9328) = 0.93952 and MD = 0
• Including rule 6:
• MB = 0.93952 and MD = 0.6
• Including rule 7:
• MB = 0.93952 + 0.6 * (1 - 0.93952) = 0.975808 and MD = 0.6

• So CF[h, e1&...&e7] = 0.975808 - 0.6 = 0.375808
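The combination rule can be replayed in code. Note that each update must use 1 minus the accumulated MB; the slide's original trace slips at rule 3 (using 1 - 0.8 instead of 1 - 0.86), so the exact figures here differ from its final 0.98272.

```python
def combine(prev, new):
    """MB[h, e1&e2] = MB[h,e1] + MB[h,e2] * (1 - MB[h,e1]); same form for MD."""
    return prev + new * (1 - prev)

mb = 0.0
for support in [0.3, 0.8, 0.4, 0.2, 0.1, 0.6]:   # belief from r1..r5 and r7
    mb = combine(mb, support)

md = 0.6                                          # disbelief from r6 (stripes)
cf = mb - md                                      # CF[h,e] = MB[h,e] - MD[h,e]
print(round(mb, 6), round(cf, 6))                 # 0.975808 0.375808
```

The accumulation is order-independent, since combining beliefs a and b always yields 1 - (1-a)(1-b).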

Dempster Shafer Theory

DST is designed to deal with the distinction between uncertainty and ignorance.

It is very useful for handling epistemic information as well as ignorance or lack of information.

cont...

Evidence is represented by belief and plausibility. Belief, bel(s), measures the strength of the evidence and ranges from 0 to 1. Plausibility is defined as pl(s) = 1 - bel(~s).

Weak-Slots and Filler Structures

Introduction

Knowledge can be represented in a slot-and-filler system as a set of entities and their attributes.

The structure is useful because, besides supporting inheritance:
it enables attribute values to be retrieved quickly
properties of relations are easy to describe
it allows easy consideration of object-oriented programming.

Introduction…

A slot is, in its simplest form, an attribute-value pair.

A filler is a value that a slot can take -- could be a numeric, string (or any data type) value or a pointer to another slot

A weak slot and filler structure does not consider the content of the representation

Semantic Nets

Semantic nets consist of: nodes, denoting objects; links, denoting relations between objects; and link labels, denoting particular relations.

Example

(Figure: Pee-Wee-Reese -instance-> Person -isa-> Mammal; Pee-Wee-Reese -team-> Brooklyn-Dodgers, -uniform-color-> Blue, -has-part-> Nose.)

Representing Binary predicates

Some of the predicates that can be asserted from the above figure are: isa(Person, Mammal), instance(Pee-Wee-Reese, Person), team(Pee-Wee-Reese, Brooklyn-Dodgers), uniform-color(Pee-Wee-Reese, Blue), has-part(Pee-Wee-Reese, Nose).

Representing Nonbinary predicates

Predicates of three or more places can be represented by creating one new object. Example: score(Cubs, Dodgers, 5-3)

(Figure: G23 -isa-> Game, with visiting-team = Cubs, home-team = Dodgers, score = 5-3.)

Representing a sentence: "John gave the book to Mary"

(Figure: EV7 -instance-> Give, with agent = John, object = BK23 -instance-> Book, beneficiary = Mary.)

Relating Entities

Suppose we want to relate these two entities with the fact that John is taller than Bill:

John -height-> 72
Bill -height-> 52

Relating Entities…

Relate the two entities by creating two height objects, H1 and H2, linked by greater-than:

(Figure: John -height-> H1, Bill -height-> H2, H1 -greater-than-> H2.)

Relating Entities…

Values can then be assigned to John's and Bill's heights:

(Figure: John -height-> H1 -value-> 72; Bill -height-> H2 -value-> 52; H1 -greater-than-> H2.)

Partitioned Semantic Nets

Consider the simple statement "The dog bit the mail carrier", which can be represented:

(Figure: d -isa-> Dogs; b -isa-> Bite; m -isa-> Mail-carrier; b has assailant d and victim m.)

Partitioned Semantic Nets…

If you want to represent quantified expressions in semantic nets, one way is to partition the semantic net into a hierarchical set of spaces.

Consider the sentence "Every dog has bitten a mail carrier":

∀x: dog(x) → ∃y: mail-carrier(y) ∧ bite(x, y)

Partitioned Semantic Nets…

(Figure: within space SA, d -isa-> Dogs, b -isa-> Bite, m -isa-> Mail-carrier, with b's assailant = d and victim = m; outside, g -isa-> GS has form = SA and a ∀ connection to d.)

Partitioned Semantic Nets…

In the figure above, g is an instance of the special class GS of general statements.

Every general statement has two attributes: a form, which states the relation that is being asserted, and one or more universal quantifier (∀) connections.

Partitioned Semantic Nets…

"Every dog in town has bitten the constable" can be represented:

(Figure: within space SA, d -isa-> Town-Dogs, b -isa-> Bite, c -isa-> Constables, with b's assailant = d and victim = c; g -isa-> GS has form = SA and a ∀ connection to d.)

Partitioned Semantic Nets…

"Every dog has bitten every mail carrier" can be represented:

(Figure: within space SA, d -isa-> Dogs, b -isa-> Bite, m -isa-> Mail-carrier, with b's assailant = d and victim = m; g -isa-> GS has form = SA and ∀ connections to both d and m.)

Frames

Semantic nets were initially used to represent labeled connections between objects.

As tasks became more complex, the representation needed to be more structured; for more structured systems it becomes more beneficial to use frames.

Frames…

A frame is a collection of attributes or slots and associated values that describe some real world entity

Each frame represents either a class (a set) or an instance (an element of a class).

Frame system example

Person
  isa: Mammal
  Cardinality: 6,000,000,000
  *handed: Right

Adult-Male
  isa: Person
  Cardinality: 2,000,000,000
  *handed: Right

ML-Baseball-Player
  isa: Adult-Male
  Cardinality: 624
  *height: 6-1
  *bats: equal to handed
  *batting-average: .252
  *team:
  *uniform-colour:

Fielder
  isa: ML-Baseball-Player
  Cardinality: 376
  *batting-average: .262

Frame system example

Pee-Wee-Reese
  instance: Fielder
  height: 5-10
  bats: Right
  batting-average: .309
  team: Brooklyn-Dodgers
  uniform-colour: Blue

ML-Baseball-Team
  isa: Team
  Cardinality: 26
  *team-size: 24
  *manager:

Brooklyn-Dodgers
  instance: ML-Baseball-Team
  team-size: 24
  manager: Leo-Durocher
  players: {Pee-Wee-Reese, ...}

Class of All Teams As a Metaclass

Class
  instance: Class
  isa: Class

Team
  instance: Class
  isa: Class
  Cardinality: {the number of teams that exist}
  *team-size: {each team has a size}

ML-Baseball-Team
  instance: Class
  isa: Team
  Cardinality: 26 {the number of baseball teams that exist}
  *team-size: 24 {the default: 24 players per team}
  *manager:

cont...

Brooklyn-Dodgers
  instance: ML-Baseball-Team
  isa: ML-Baseball-Player
  team-size: 24
  manager: Leo-Durocher
  *uniform-colour: Blue

Pee-Wee-Reese
  instance: Brooklyn-Dodgers
  instance: Fielder
  uniform-colour: Blue
  batting-average: .309

(Figure, Classes and Metaclasses: Class (the set of sets) has Team and ML-Baseball-Player among its instances; Brooklyn-Dodgers is an instance of ML-Baseball-Team; Pee-Wee-Reese is an instance of Brooklyn-Dodgers and of ML-Baseball-Player.)

Representing Relationships among classes

Classes can be related to each other:

Class1 can be a subset of Class2.

Mutually-disjoint-with relates a class to one or more other classes that are guaranteed to have no elements in common with it.

Is-covered-by relates a class to a set S of classes; if the classes in S are mutually disjoint, S is called a partition of the class.

Representing Relationships among classes

(Figure: Pitcher, Catcher, Fielder, National-Leaguer, and American-Leaguer are all isa ML-Baseball-Player; Three-Finger-Brown is an instance of both Pitcher and National-Leaguer.)

Cont…

ML-Baseball-Player
  is-covered-by: {Pitcher, Catcher, Fielder}, {National-Leaguer, American-Leaguer}

Pitcher
  isa: ML-Baseball-Player
  mutually-disjoint-with: {Catcher, Fielder}

Fielder
  isa: ML-Baseball-Player
  mutually-disjoint-with: {Pitcher, Catcher}

Catcher
  isa: ML-Baseball-Player
  mutually-disjoint-with: {Pitcher, Fielder}

National-Leaguer
  isa: ML-Baseball-Player

Three-Finger-Brown
  instance: Pitcher
  instance: National-Leaguer

Slot-Values as Objects

We can compare slots by making the slots themselves into objects.

We use lambda (λ) notation for such constraints.

John
  height: 72

Bill
  height:

Cont…

John
  height: 72; λx (x.height > Bill.height)

Bill
  height: λx (x.height < John.height)
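The λ-slots can be mimicked with Python lambdas. The `Frame` class and constraint names below are invented for illustration.

```python
# Minimal frame: slots become attributes on a plain object.
class Frame:
    def __init__(self, **slots):
        self.__dict__.update(slots)

john = Frame(height=72)
bill = Frame(height=52)

# Slot values as objects: constraints attached alongside the stored values.
john_constraint = lambda x: x.height > bill.height
bill_constraint = lambda x: x.height < john.height

print(john_constraint(john), bill_constraint(bill))  # True True
```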

Inheritance

(Figure: Bird [fly: yes]; Ostrich -isa-> Bird [fly: no]; Pet-Bird -isa-> Bird; Fifi -instance-> Ostrich and -instance-> Pet-Bird [fly: ?].)

Cont…

(Figure: Quaker [pacifist: yes]; Republican [pacifist: no]; Dick -instance-> Quaker and -instance-> Republican [pacifist: ?].)

Solution

The solution to this problem is to use inferential distance instead of path length.

Class1 is closer to Class2 than to Class3 if and only if Class1 has an inference path through Class2 to Class3 (Class2 is between Class1 and Class3).

Property inheritance

The set of competing values for a slot S in a frame F contains all those values that can be derived from some frame X that is above F in the isa hierarchy, and that are not contradicted by some frame Y that has a shorter inferential distance to F than X does.

(Figure: Bird [fly: yes]; Ostrich -isa-> Bird [fly: no]; Plumed-Ostrich -isa-> Ostrich; White-Plumed-Ostrich -isa-> Plumed-Ostrich; Pet-Bird -isa-> Bird; Fifi -instance-> White-Plumed-Ostrich and -instance-> Pet-Bird [fly: ?].)

Cont…

(Figure: Quaker [pacifist: yes]; Republican [pacifist: no]; Conservative-Republican -isa-> Republican; Dick -instance-> Quaker and -instance-> Conservative-Republican [pacifist: ?].)

Reasoning Capabilities of Frames

Consistency checking to verify that a slot value added to a frame is legal.

Propagation of definitional values along isa and instance links.

Inheritance of default values along isa and instance links.

Frame Languages

The idea of the frame system as a way to represent declarative knowledge has been encapsulated in a series of frame-oriented knowledge representation languages: KRL [Bobrow and Winograd, 1977], FRL [Roberts and Goldstein, 1977], RLL, KL-ONE, KRYPTON, NIKL, CYCL, Conceptual Graphs, THEO, and FRAMEKIT.

UNIT IV: Strong-Slots and Filler Structures

Conceptual Dependency

Semantic networks and frame systems may have specialized links and inference procedures, but there are no rules about what kinds of objects and links are good in general for knowledge representation.

Conceptual Dependency (CD) is a theory of how to represent the kind of events described in natural language sentences. It facilitates drawing inferences from sentences and is independent of the language in which the sentences were originally stated.

Conceptual Dependency …

CD provides:
a structure into which nodes representing information can be placed
a specific set of primitives at a given level of granularity

Sentences are represented as a series of diagrams. The agent and the objects are represented, and the actions are built up from a set of primitive acts, which can be modified by tense.

Primitive Acts

ATRANS: Transfer of an abstract relationship (e.g., give)
PTRANS: Transfer of the physical location of an object (e.g., go)
PROPEL: Application of physical force to an object (e.g., push)
MOVE: Movement of a body part by its owner (e.g., kick)
GRASP: Grasping of an object by an actor (e.g., clutch)
INGEST: Ingestion of an object by an animal (e.g., eat)
EXPEL: Expulsion of something from the body of an animal (e.g., cry)

Conceptual Dependency …

MTRANS: Transfer of mental information (e.g., tell)
MBUILD: Building new information out of old (e.g., decide)
SPEAK: Production of sounds (e.g., say)
ATTEND: Focusing of a sense organ toward a stimulus (e.g., listen)

Primitive Concepts

Conceptual categories provide the building blocks; they are the set of allowable dependencies among the concepts in a sentence:

PP: real-world objects (picture producers)
ACT: real-world actions
PA: attributes of objects (modifiers of PPs)
AA: attributes of actions (modifiers of ACTs)
T: times
LOC: locations

Example: "Raju gave the man a book"

Raju <=> ATRANS --o--> book --R--> (to: man, from: Raju)

Arrows indicate the direction of dependency; letters above the arrows indicate particular relationships; double arrows (<=>) indicate two-way links between the actor (PP) and the action (ACT).

o: object
R: recipient-donor
I: instrument (e.g., eat with a spoon)
D: destination (e.g., going home)

Modifiers…

The use of tense and mood in describing events is extremely important. The modifiers are:

p: past          f: future          t: transition
ts: start transition      tf: finished transition      k: continuing
delta: timeless      c: conditional      /: negative      ?: interrogative

The absence of any modifier implies the present tense.

Conceptual Dependency …

Arrows indicate the direction of dependency.

The double arrow (<=>) is a two-way link between an actor (PP) and an action (ACT): PP <=> ACT.

The triple arrow is also a two-way link, but between an object (PP) and its attribute (PA): PP <=> PA. It represents isa-type dependencies.

1. PP <=> ACT. John <=> PTRANS ("John ran")
2. PP <=> PA. John <=> height (> average) ("John is tall")
3. PP <=> PA. John <=> doctor ("John is a doctor")
4. PP poss-by PP. dog -poss-by-> John ("John's dog")
5. PP with PA. boy, nice ("A nice boy")
6. ACT --o--> PP. John <=> PROPEL --o--> cart ("John pushed the cart")
7. ACT --R-->. John <=>(p) ATRANS --o--> book --R--> (to: John, from: Mary) ("John took the book from Mary")
8. ACT --I-->. John <=> INGEST --o--> ice cream --I--> (do: spoon) ("John ate ice cream with a spoon")
9. ACT --D-->. John <=> PTRANS --o--> fertilizer --D--> (to: field, from: bag) ("John fertilized the field")
10. PA transition. plants <=> size > x <--(from)-- size = x ("The plants grew")

11. Bill <=>(p) PROPEL --o--> bullet --R--> (to: Bob, from: gun), with result Bob <=> health(-10) ("Bill shot Bob")
12. John <=>(p) PTRANS, time: yesterday ("John ran yesterday")
13. I <=>(p) MTRANS --o--> frog --R--> (to: CP, from: eyes), while I <=> PTRANS --D--> (to: home) ("While going home, I saw a frog")
14. I <=>(p) MTRANS --o--> frog --R--> (to: CP, from: ears), location: woods ("I heard a frog in the woods")
15. one <=> INGEST --o--> smoke --R--> (from: cigarette) can (c) lead from alive to dead; hence I <=>(tfp) INGEST --o--> smoke --I--> cigarette ("Since smoking can kill you, I stopped")
16. Bill <=>(p) MTRANS --o--> [Bill <=>(cf) do1, with result John's nose (poss-by John) <=> broken] --R--> (to: John) ("Bill threatened John with a broken nose")

Advantages with CD

Using these primitives involves fewer inference rules.

Many inference rules are already represented in CD structure.

The holes in the initial structure help to focus on the points still to be established.

Disadvantages with CD

Knowledge must be decomposed into fairly low level primitives.

Impossible or difficult to find correct set of primitives.

A lot of inference may still be required.

Representations can be complex even for relatively simple actions.

Scripts

Scripts are generally used to represent knowledge about common sequences of events.

A script is a structure that describes a stereotyped sequence of events in a particular context.

A script consists of a set of slots, each associated with some information.

Components of Scripts

Entry conditions: Conditions that must, in general, be satisfied before the events described in the script can occur.

Result: Conditions that will, in general, be true after the events described in the script have occurred.

Props: Slots representing objects that are involved in the events described in the script. The presence of these objects can be inferred even if they are not mentioned explicitly.

Roles: Slots representing people who are involved in the events described in the script. The presence of these people, too, can be inferred even if they are not mentioned explicitly. If specific individuals are mentioned, they can be inserted into the appropriate slots.

Track: The specific variation on a more general pattern that is represented by this particular script. Different tracks of the same script will share many but not all components.

Scenes: The actual sequences of events that occur. The events are represented in the conceptual dependency formalism.
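As an illustrative sketch (the restaurant script is the classic example; the slot fillers below are assumptions, not from these slides), the components of a script can be written down as a simple data structure:

```python
# A script as a dictionary of slots: track, props, roles, entry conditions,
# results, and scenes. All fillers here are illustrative.
restaurant_script = {
    "track": "coffee shop",
    "props": ["tables", "menu", "food", "bill", "money"],
    "roles": ["customer", "waiter", "cook", "cashier"],
    "entry_conditions": ["customer is hungry", "customer has money"],
    "results": ["customer is not hungry", "customer has less money"],
    "scenes": [
        "entering",   # customer PTRANSes into the restaurant
        "ordering",   # customer MTRANSes the order to the waiter
        "eating",     # customer INGESTs the food
        "exiting",    # customer ATRANSes money, PTRANSes out
    ],
}

def instantiate(script, bindings):
    """Insert specific individuals into the appropriate role slots."""
    filled = dict(script)
    filled["bindings"] = bindings
    return filled

s = instantiate(restaurant_script, {"customer": "John"})
print(s["bindings"]["customer"])  # John
```

Unfilled roles and props remain as slots whose presence can be inferred even when the individuals are never mentioned.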

Planning

Contents

Introduction to Planning

Blocks World Problem

Components of Planning System

Green's Approach

STRIPS

Goal Stack Planning

Sussman Anomaly

Nonlinear Planning Using Constraint Posting

TWEAK Algorithm

Hierarchical Planning

Planning

Planning problems are hard; they are certainly non-trivial. Methods that focus on ways of decomposing the original problem into appropriate subparts, and on ways of handling interactions among the subparts during the problem-solving process, are often called planning methods.

Planning refers to the process of computing several steps of a problem-solving procedure before executing any of them.

Block World Problem

There are a number of square blocks, all the same size. They can be stacked one upon another.

There is a robot arm that can manipulate the blocks.

The robot arm can hold one block at a time.

Robot Actions

UNSTACK(A,B)-

Pick up block A from its current position on block B. The arm must be empty and block A must have no block on top of it.

STACK(A,B)-

Place block A on block B. The arm must already be holding block A, and the surface of B must be clear.

PICKUP(A)-

Pick up block A from the table and hold it. The arm must be empty and there must be nothing on top of block A.

PUTDOWN(A)-

Put block A down on the table. The arm must have been holding block A.


Set of Predicates

ON(A,B) - Block A is on block B.

ONTABLE(A) - Block A is on the table.

CLEAR(A) - There is nothing on top of block A.

HOLDING(A) - The arm is holding block A.

ARMEMPTY - The arm is holding nothing.

Logical Statements

∃x: HOLDING(x) → ⌐ARMEMPTY

∀x: ONTABLE(x) → ⌐∃y: ON(x,y)

∀x: [⌐∃y: ON(y,x)] → CLEAR(x)
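These statements can be checked mechanically against a concrete state. The set-of-tuples encoding of predicates below is an illustrative assumption, not part of the slides:

```python
# Check the three logical statements against one concrete blocks-world state.
blocks = {"A", "B", "C"}
state = {("ONTABLE", "A"), ("ON", "B", "A"), ("ONTABLE", "C"), ("ARMEMPTY",)}

def clear(s, x):
    # ∀x: [⌐∃y: ON(y,x)] → CLEAR(x): x is clear when nothing is on it
    return not any(("ON", y, x) in s for y in blocks)

# ∃x: HOLDING(x) → ⌐ARMEMPTY
holding = any(f[0] == "HOLDING" for f in state)
assert not (holding and ("ARMEMPTY",) in state)

# ∀x: ONTABLE(x) → ⌐∃y: ON(x,y)
for x in blocks:
    if ("ONTABLE", x) in state:
        assert not any(("ON", x, y) in state for y in blocks)

print(clear(state, "B"), clear(state, "A"))  # True False
```

Nothing is on B, so it is clear; B is on A, so A is not.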

Components of Planning System


1. Choosing the best rule to apply
2. Applying the chosen rule
3. Detecting when a solution has been found
4. Detecting dead ends
5. Repairing an almost correct solution

1. Choose the Best Rule

Isolate a set of differences between the desired goal state and the current state.

Identify those rules that are relevant to reducing those differences (Means-Ends Analysis).

If several rules are found, choose the best using heuristic information.

2. Apply Rules

In simple systems, applying rules is easy. Each rule simply specifies the problem state that would result from its application.

In complex systems, we must be able to deal with rules that specify only a small part of the complete problem state.

One way is to describe, for each action, each of the changes it makes to the state description.

Green's Approach (Applying Rules)

In Green's approach, the changes to a state produced by the application of a rule are described by attaching a state variable to each predicate.

Green's Approach (Applying Rules)

UNSTACK(x,y): [CLEAR(x,s) ∧ ON(x,y,s)] → [HOLDING(x, DO(UNSTACK(x,y),s)) ∧ CLEAR(y, DO(UNSTACK(x,y),s))]

The initial state of the problem is S0. If we execute UNSTACK(A,B) in state S0, the state that results from the unstacking operation is S1:

HOLDING(A,S1) ∧ CLEAR(B,S1)

Green's Approach (Applying Rules)

Advantages:

Resolution can be applied to state descriptions.

Disadvantages:

Many rules are required to represent a problem.

It is difficult to represent complex problems.

STRIPS(Applying Rules)

In the STRIPS approach, each operator is described by three lists of predicates: ADD, DELETE, and PRECONDITION.

ADD: a list of things that become TRUE.

DELETE: a list of things that become FALSE.

PRECONDITION: a set of prerequisites that must be true before the operator can be applied.

STRIPS (Applying Rules)

STACK(x,y)
P: CLEAR(y) ∧ HOLDING(x)
D: CLEAR(y) ∧ HOLDING(x)
A: ARMEMPTY ∧ ON(x,y)

UNSTACK(x,y)
P: ON(x,y) ∧ CLEAR(x) ∧ ARMEMPTY
D: ON(x,y) ∧ ARMEMPTY
A: HOLDING(x) ∧ CLEAR(y)

STRIPS (Applying Rules)

PICKUP(x)
P: CLEAR(x) ∧ ONTABLE(x) ∧ ARMEMPTY
D: ONTABLE(x) ∧ ARMEMPTY
A: HOLDING(x)

PUTDOWN(x)
P: HOLDING(x)
D: HOLDING(x)
A: ONTABLE(x) ∧ ARMEMPTY
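The four operator descriptions above can be written down directly as data. This Python encoding (tuples for predicates, sets for the P/D/A lists) is an illustrative assumption:

```python
# The four STRIPS operators as precondition (P), delete (D), and add (A) sets.
def stack(x, y):
    return {"P": {("CLEAR", y), ("HOLDING", x)},
            "D": {("CLEAR", y), ("HOLDING", x)},
            "A": {("ARMEMPTY",), ("ON", x, y)}}

def unstack(x, y):
    return {"P": {("ON", x, y), ("CLEAR", x), ("ARMEMPTY",)},
            "D": {("ON", x, y), ("ARMEMPTY",)},
            "A": {("HOLDING", x), ("CLEAR", y)}}

def pickup(x):
    return {"P": {("CLEAR", x), ("ONTABLE", x), ("ARMEMPTY",)},
            "D": {("ONTABLE", x), ("ARMEMPTY",)},
            "A": {("HOLDING", x)}}

def putdown(x):
    return {"P": {("HOLDING", x)},
            "D": {("HOLDING", x)},
            "A": {("ONTABLE", x), ("ARMEMPTY",)}}

def apply_op(state, op):
    """Check the preconditions, then remove the delete list and add the add list."""
    assert op["P"] <= state, "preconditions not satisfied"
    return (state - op["D"]) | op["A"]

# ONTABLE(B) ∧ ON(A,B) ∧ CLEAR(A), plus ARMEMPTY so UNSTACK can fire:
s0 = {("ONTABLE", "B"), ("ON", "A", "B"), ("CLEAR", "A"), ("ARMEMPTY",)}
s1 = apply_op(s0, unstack("A", "B"))
print(("HOLDING", "A") in s1, ("CLEAR", "B") in s1)  # True True
```

Note that `apply_op` is exactly the database update described in the next slide: delete the DELETE list, then assert the ADD list.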

STRIPS (Applying Rules)

In STRIPS, if a new attribute is introduced, we do not need to add new axioms for existing operators. Unlike Green's method, we remove the state indicator and use a database of predicates to indicate the current state.

Thus if the last state was:

ONTABLE(B) ∧ ON(A,B) ∧ CLEAR(A)

after the unstack operation the new state is

ONTABLE(B) ∧ CLEAR(B) ∧ HOLDING(A) ∧ CLEAR(A)

3. Detecting a Solution

A planning system has succeeded in finding a solution to a problem when it has found a sequence of operators that transforms the initial problem state into the goal state.

In simple problem-solving systems we know the solution by a straightforward match of the state description

But in complex problems, more sophisticated reasoning mechanisms may be needed to describe the problem states, and those same mechanisms can be used to discover when a solution has been found.

4. Detecting Dead Ends

A planning system must be able to detect when it is exploring a path that can never lead to a solution. The same reasoning mechanisms can be used to detect dead ends.

If the search process is reasoning forward from the initial state, it can prune any path that leads to a state from which the goal state cannot be reached.

Similarly, in backward reasoning, some states can be pruned from the search space.

5. Repairing an Almost Correct Solution

In completely decomposable problems, we can solve the subproblems separately and combine the subsolutions to yield a solution to the original problem.

When solving nearly decomposable problems, one way is to use the Means-Ends Analysis technique to minimize the difference between the initial state and the goal state.

A better way is to represent knowledge about what went wrong and then apply a direct patch.

Goal Stack Planning


Goal stack planning uses a goal stack for solving compound goals, taking advantage of the STRIPS method.

The problem solver makes use of a single stack that contains both goals and operators; a database describes the current situation.

Operators are described by PRECONDITION, ADD, and DELETE lists.

Simple Blocks world problem

Goal Stack Planning…

We can describe the start state as:

ON(B,A) ∧ ONTABLE(A) ∧ ONTABLE(C) ∧ ONTABLE(D) ∧ ARMEMPTY

We can describe the goal state as:

ON(C,A) ∧ ON(B,D) ∧ ONTABLE(A) ∧ ONTABLE(D)

Example

Example…

Decompose the problem into four different sub problems in the goal stack

1. ON(C,A)

2. ON(B,D)

3.ONTABLE(A)

4.ONTABLE(D)

ONTABLE(A) and ONTABLE(D) are already true in the initial state.

Example…

Depending on the order in which we solve the subproblems, there are two alternative goal stacks:

(1)                          (2)
ON(C,A)                      ON(B,D)
ON(B,D)                      ON(C,A)
ON(C,A) ∧ ON(B,D) ∧ OTAD     ON(C,A) ∧ ON(B,D) ∧ OTAD

OTAD is an abbreviation of ONTABLE(A) ∧ ONTABLE(D).

Example…

At each step of the problem-solving process, the top goal on the stack is attacked, until the goal stack is empty.

As one last check, the original goal is compared to the final state derived from the application of the chosen operators.

Choosing the first alternative, the predicate on top of the goal stack is ON(C,A).

Example…

First check to see whether ON(C,A) is true in the current state.

It is not, so find an operator that could cause it to be true

Applying the STACK(C,A) operator will make ON(C,A) true, so replace it on the stack:

STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

In order to apply the STACK(C,A) operator, its preconditions must hold, so we stack those subgoals:

CLEAR(A) ∧ HOLDING(C)

The resultant goal stack is:

CLEAR(A)
HOLDING(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

Check whether CLEAR(A) is true. It is not. The only operator that could make it true is UNSTACK(B,A), so push it onto the goal stack:

UNSTACK(B,A)
HOLDING(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

The preconditions that must be satisfied when applying the UNSTACK(B,A) operator are ON(B,A) ∧ CLEAR(B) ∧ ARMEMPTY.

Example…

So the goal stack is:

ON(B, A)

CLEAR(B)

ARMEMPTY

ON(B, A) ∧ CLEAR(B) ∧ ARMEMPTY

UNSTACK(B,A)

HOLDING(C)

CLEAR(A) ∧ HOLDING(C)

STACK(C,A)

ON(B,D)

ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

Compare the top element of the goal stack, ON(B,A), with the database. It is satisfied, so pop it off.

Consider the next goal CLEAR(B), it is also satisfied. So pop it off.

Consider the next goal ARMEMPTY, it is also satisfied. So pop it off.

Now the top element of the goal stack is the UNSTACK(B,A) operator; apply it and pop it off.

Example…

The database corresponding to the world model at this point is:

ONTABLE(A) ∧ ONTABLE(C) ∧ ONTABLE(D) ∧ HOLDING(B) ∧ CLEAR(A)

The goal stack now is:

HOLDING(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

Now attempt to satisfy the goal HOLDING(C). Two operators might make this true: PICKUP(C) and UNSTACK(C,x). Considering only the first operator, the goal stack is:

PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

The preconditions for PICKUP(C) are ONTABLE(C) ∧ CLEAR(C) ∧ ARMEMPTY, so the goal stack becomes:

ONTABLE(C)
CLEAR(C)
ARMEMPTY
ONTABLE(C) ∧ CLEAR(C) ∧ ARMEMPTY
PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

The top element of the goal stack, ONTABLE(C), is satisfied. The next element, CLEAR(C), is also satisfied, so pop them both from the goal stack.

The next element, ARMEMPTY, is not satisfied, since HOLDING(B) is true. The goal stack is:

ARMEMPTY
ONTABLE(C) ∧ CLEAR(C) ∧ ARMEMPTY
PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

There are two operators that could make ARMEMPTY true: STACK(B,x) and PUTDOWN(B).

Which operator should we choose? Looking ahead in the goal stack, we will eventually need block B on block D.

So we choose to apply STACK(B,D), binding D to x.

Example…

So the goal stack now is:

CLEAR(D)
HOLDING(B)
CLEAR(D) ∧ HOLDING(B)
STACK(B,D)
ONTABLE(C) ∧ CLEAR(C) ∧ ARMEMPTY

PICKUP(C)

CLEAR(A) ∧ HOLDING(C)

STACK(C,A)

ON(B,D)

ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

Both CLEAR(D) and HOLDING(B) are satisfied, so pop them from the goal stack and apply STACK(B,D).

The resultant database is:

ONTABLE(A) ∧ ONTABLE(C) ∧ ONTABLE(D) ∧ ON(B,D) ∧ ARMEMPTY

The goal stack now is:

PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example…

Now all the preconditions for PICKUP(C) are satisfied, so it can be executed.

Then all the preconditions for STACK(C,A) are also satisfied, so execute it. Pop both operators.

The next predicate, ON(B,D), is already satisfied, so pop it off.

One last check of the combined goal ON(C,A) ∧ ON(B,D) ∧ OTAD shows that it, too, is satisfied.

Example…

The problem solver now halts and returns the plan:

1. UNSTACK(B,A)
2. STACK(B,D)
3. PICKUP(C)
4. STACK(C,A)
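As a quick mechanical check (an illustrative sketch, not part of the slides), the four-step plan can be replayed by applying each operator's DELETE and ADD lists to the start state:

```python
# Replay the plan on the start state and verify the goal is reached.
# Preconditions are omitted here for brevity; only delete/add lists are applied.
OPS = {
    "UNSTACK": lambda x, y: ({("ON", x, y), ("ARMEMPTY",)},
                             {("HOLDING", x), ("CLEAR", y)}),
    "STACK":   lambda x, y: ({("CLEAR", y), ("HOLDING", x)},
                             {("ARMEMPTY",), ("ON", x, y)}),
    "PICKUP":  lambda x: ({("ONTABLE", x), ("ARMEMPTY",)}, {("HOLDING", x)}),
    "PUTDOWN": lambda x: ({("HOLDING", x)}, {("ONTABLE", x), ("ARMEMPTY",)}),
}

state = {("ON", "B", "A"), ("ONTABLE", "A"), ("ONTABLE", "C"),
         ("ONTABLE", "D"), ("ARMEMPTY",)}
plan = [("UNSTACK", "B", "A"), ("STACK", "B", "D"),
        ("PICKUP", "C"), ("STACK", "C", "A")]

for name, *args in plan:
    delete, add = OPS[name](*args)
    state = (state - delete) | add      # STRIPS database update

goal = {("ON", "C", "A"), ("ON", "B", "D"),
        ("ONTABLE", "A"), ("ONTABLE", "D")}
print(goal <= state)  # True
```

The final database contains ON(C,A), ON(B,D), ONTABLE(A), and ONTABLE(D), so the combined goal holds.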

SUSSMAN ANOMALY

Try to solve the below problem

Example…

There are two orders in which we could try to solve the above problem:

(1)                    (2)
ON(A,B)                ON(B,C)
ON(B,C)                ON(A,B)
ON(A,B) ∧ ON(B,C)      ON(A,B) ∧ ON(B,C)

Choose alternative 1

Example…

ON(C,A)
CLEAR(C)
ARMEMPTY
ON(C,A) ∧ CLEAR(C) ∧ ARMEMPTY
UNSTACK(C,A)
ARMEMPTY
CLEAR(A) ∧ ARMEMPTY
PICKUP(A)
CLEAR(B) ∧ HOLDING(A)
STACK(A,B)
ON(B,C)
ON(A,B) ∧ ON(B,C)

Example…

All the preconditions of UNSTACK(C,A) are satisfied, so pop it off and apply this operator.

So the goal stack now is:

ARMEMPTY
CLEAR(A) ∧ ARMEMPTY
PICKUP(A)
CLEAR(B) ∧ HOLDING(A)
STACK(A,B)
ON(B,C)
ON(A,B) ∧ ON(B,C)

Example…

To satisfy the ARMEMPTY precondition of PICKUP(A), simply apply the operator PUTDOWN(C), and pop all the conditions until the stack contains only:

ON(B,C)
ON(A,B) ∧ ON(B,C)

The current state is:

ONTABLE(B) ∧ ON(A,B) ∧ ONTABLE(C) ∧ ARMEMPTY

Example…

The sequence of operators applied so far is:

1. UNSTACK(C,A)
2. PUTDOWN(C)
3. PICKUP(A)
4. STACK(A,B)

Example…

Then try to achieve the other goal, ON(B,C). The goal stack becomes:

ON(A,B)
CLEAR(A)
ARMEMPTY
ON(A,B) ∧ CLEAR(A) ∧ ARMEMPTY
UNSTACK(A,B)
ARMEMPTY
CLEAR(B) ∧ ARMEMPTY
PICKUP(B)
CLEAR(C) ∧ HOLDING(B)
STACK(B,C)
ON(A,B) ∧ ON(B,C)

Example…

All the preconditions of UNSTACK(A,B) are satisfied, so pop it off and apply this operator.

So the goal stack now is:

ARMEMPTY
CLEAR(B) ∧ ARMEMPTY
PICKUP(B)
CLEAR(C) ∧ HOLDING(B)
STACK(B,C)
ON(A,B) ∧ ON(B,C)

Example…

To satisfy the ARMEMPTY precondition of PICKUP(B), simply apply the operator PUTDOWN(A), and pop all the conditions until the stack contains only:

ON(A,B) ∧ ON(B,C)

The current state is:

ON(B,C) ∧ ONTABLE(A) ∧ ONTABLE(C) ∧ ARMEMPTY

Example…

But a check of the remaining goal ON(A,B) ∧ ON(B,C) shows that it is not satisfied.

The difference between the current state and the goal state is ON(A,B).

The sequence of operators to be added to the goal stack is:

9. PICKUP(A)
10. STACK(A,B)

Example…

Now combine the operators and check that the goal is satisfied:

1. UNSTACK(C,A)    6. PUTDOWN(A)
2. PUTDOWN(C)      7. PICKUP(B)
3. PICKUP(A)       8. STACK(B,C)
4. STACK(A,B)      9. PICKUP(A)
5. UNSTACK(A,B)    10. STACK(A,B)

Example…

But the same goal can be achieved with a better plan:

1. UNSTACK(C,A)    4. STACK(B,C)
2. PUTDOWN(C)      5. PICKUP(A)
3. PICKUP(B)       6. STACK(A,B)

Nonlinear Planning Using Constraint Posting

The goal stack planning method solves subgoals one at a time, in order.

But difficult problems involve interactions between subproblems.

A nonlinear plan is one that is not composed of a linear sequence of complete subplans.

Generally, heuristic methods are used to construct nonlinear plans.


Nonlinear Planning Using Constraint Posting…

To solve the above problem:

1. Begin work on the goal ON(A,B) by clearing block A, putting block C on the table.

2. Achieve ON(B,C) by stacking block B on block C.

3. Complete ON(A,B) by stacking block A on block B.

Nonlinear Planning Systems

HACKER was an automatic programming system (it introduced the basic idea).

NOAH was the first nonlinear planning system; the goal stack algorithm of STRIPS was transformed into a goal set algorithm.

TWEAK used constraint posting as a central technique.

In constraint posting, a plan is built up incrementally by suggesting operators, partially ordering them, and binding the variables within the operators.

Example…

Try to solve Sussman's anomaly using nonlinear planning.

Constraint Posting

Constraint posting involves suggesting operators, trying to order them, and producing bindings between variables in the operators and actual blocks.

The initial plan consists of no steps; there is no order or detail at this stage.

Gradually, more detailed constraints about the order of subsets of the steps are introduced until a completely ordered sequence is created.

Heuristics for Planning (Constraint Posting)

1. Step Addition: creating new steps (GPS).

2. Promotion: constraining a step to come before another step (Sussman's HACKER).

3. Declobbering: placing a new step between two steps to reassert a precondition (NOAH, NONLIN).

4. Simple Establishment: assigning a value to a variable to ensure the preconditions of some step (TWEAK).

5. Separation: preventing the assignment of certain values to a variable (TWEAK).

Step Addition

Introducing new steps to achieve goals or preconditions is called step addition.

For our problem, we generate a nonlinear plan incrementally, starting with a plan that contains no steps.

Using means-ends analysis, we choose two steps, one to achieve ON(A,B) and one to achieve ON(B,C).

To achieve the goal, add these new steps to the plan.

Step Addition

CLEAR(B)         CLEAR(C)
*HOLDING(A)      *HOLDING(B)
-----------      -----------
STACK(A,B)       STACK(B,C)
-----------      -----------
ARMEMPTY         ARMEMPTY
ON(A,B)          ON(B,C)
⌐CLEAR(B)        ⌐CLEAR(C)
⌐HOLDING(A)      ⌐HOLDING(B)

Each step is shown with its preconditions above it and its postconditions below it. Deleted postconditions are marked with the ⌐ symbol; unachieved preconditions are marked with the * symbol.

Step Addition…

To achieve the preconditions of the two steps above we use step addition again

*CLEAR(A) *CLEAR(B)

ONTABLE(A) ONTABLE(B)

*ARMEMPTY *ARMEMPTY

------------------ -----------------

PICKUP(A) PICKUP(B)

------------------ ------------------

⌐ONTABLE(A) ⌐ONTABLE(B)

⌐ARMEMPTY ⌐ARMEMPTY

HOLDING(A) HOLDING(B)

Promotion

Promotion was first used by Sussman in his HACKER program.

Promotion means posting a constraint that one step must precede another.

Adding the PICKUP steps does not by itself satisfy the *HOLDING preconditions of the STACK steps, because there are as yet no ordering constraints among the steps.

The notation S1 ≺ S2 means that step S1 must precede step S2.

Promotion…

The PICKUP steps should precede the corresponding STACK steps, so:

PICKUP(A) ≺ STACK(A,B)
PICKUP(B) ≺ STACK(B,C)

In the step addition example above, *CLEAR(A) is unachieved because block A is not clear in the initial state.

*CLEAR(B) is unachieved even though B is clear in the initial state, because there exists a step, STACK(A,B), with postcondition ⌐CLEAR(B).

Promotion…

So we can achieve CLEAR(B) by stating that the PICKUP(B) step must come before the STACK(A,B) step:

PICKUP(B) ≺ STACK(A,B)

Now turn to the two unachieved preconditions *ARMEMPTY and *CLEAR(A). Try to achieve *ARMEMPTY first.

Promotion…

The initial state has an empty arm, but each of the operators PICKUP(A) and PICKUP(B) has the postcondition ⌐ARMEMPTY.

Either operator could prevent the other from executing, so order them:

PICKUP(B) ≺ PICKUP(A)

The ordering so far:

PICKUP(B) ≺ PICKUP(A) ≺ STACK(A,B)

Declobbering

Declobbering means placing a new step in between two old steps.

The initial state contains an empty arm, so all preconditions of PICKUP(B) are satisfied, but the result of PICKUP(B) asserts ⌐ARMEMPTY.

This can be solved by inserting another step between PICKUP(B) and PICKUP(A) to reassert ARMEMPTY.

STACK(B,C) can achieve this (chosen heuristically):

PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A)

Simple Establishment

Now try to solve the unachieved precondition *CLEAR(A) of PICKUP(A) by step addition:

*ON(x,A)

*CLEAR(x)

*ARMEMPTY

-------------------

UNSTACK(x,A)

--------------------

⌐ARMEMPTY

CLEAR(A)

HOLDING(A)

⌐ON(x,A)

Simple Establishment…

Simple establishment means assigning a value to a variable.

We introduced the variable x because the only postcondition we are interested in is CLEAR(A).

Bind x = C in the step UNSTACK(x,A).

Simple Establishment…

The other preconditions of UNSTACK(x,A), CLEAR(C) and ARMEMPTY, must also be satisfied; we use promotion to impose an order:

UNSTACK(x,A) ≺ STACK(B,C)
UNSTACK(x,A) ≺ PICKUP(A)
UNSTACK(x,A) ≺ PICKUP(B)

The ordering so far:

UNSTACK(C,A) ≺ PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A) ≺ STACK(A,B)

Example…

The step PICKUP(B) requires ARMEMPTY, but this is denied by the new UNSTACK(C,A) step.

Use declobbering to place the step PUTDOWN(C) between the two:

HOLDING(C)
----------------
PUTDOWN(C)
----------------
⌐HOLDING(C)
ONTABLE(C)
ARMEMPTY

Example…

The ordering so far is UNSTACK(C,A) ≺ PUTDOWN(C) ≺ PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A) ≺ STACK(A,B), giving the final plan:

1. UNSTACK(C,A)

2. PUTDOWN(C)

3. PICKUP(B)

4. STACK(B,C)

5. PICKUP(A)

6. STACK(A,B)

Example…

In the nonlinear planning example above, we used four of the heuristics: step addition, promotion, declobbering, and simple establishment.

The remaining heuristic, separation, prevents the assignment of certain values to variables.

TWEAK Algorithm

1. Initialize S to be the set of propositions in the goal state. 

2. Repeat 

I. Remove some unachieved proposition P from S. 

II. Achieve P by using one of the heuristics. 

III. Review all the steps, including any newly added steps, to find all unachieved preconditions; add these to S, the set of unachieved preconditions. 

until the set S is empty. 

3. Complete the plan by converting the partial order into a total order and performing all necessary instantiations (bindings of the variables). 
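The final step, converting the posted partial order into a total order, amounts to a topological sort. The constraints below are the ones derived in the Sussman-anomaly walkthrough; the Python encoding is illustrative:

```python
# Convert TWEAK's partial-order constraints into a total order.
from graphlib import TopologicalSorter  # standard library, Python 3.9+

before = {  # step -> the steps it must precede (from promotion/declobbering)
    "UNSTACK(C,A)": ["PUTDOWN(C)", "PICKUP(B)", "PICKUP(A)", "STACK(B,C)"],
    "PUTDOWN(C)":   ["PICKUP(B)"],
    "PICKUP(B)":    ["STACK(B,C)", "PICKUP(A)"],
    "STACK(B,C)":   ["PICKUP(A)"],
    "PICKUP(A)":    ["STACK(A,B)"],
}

# TopologicalSorter expects a map from node to its predecessors, so invert.
preds = {}
for step, succs in before.items():
    preds.setdefault(step, set())
    for s in succs:
        preds.setdefault(s, set()).add(step)

print(list(TopologicalSorter(preds).static_order()))
# ['UNSTACK(C,A)', 'PUTDOWN(C)', 'PICKUP(B)', 'STACK(B,C)', 'PICKUP(A)', 'STACK(A,B)']
```

These particular constraints happen to force a unique total order, which is exactly the six-step plan found above.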

Hierarchical Planning


The main difficulty in STRIPS-like planning is complexity. One reason for this complexity is the lack of structure:

There is no distinction between important and unimportant properties.

There is no distinction between important and unimportant operators.

This observation gives rise to two different kinds of abstraction in planning:

abstraction of situations

abstraction of operators

Cont…

It is important to be able to eliminate some of the details of the problem until a solution that addresses the main issues is found

Early attempts to do this involved the use of macro operators.

But in this approach, no details were eliminated from actual descriptions of the operators.

Cont…

Consider the example, you want to visit a friend in Europe but you have a limited amount of cash to spend.

The first preference will be to find the airfares, since finding an affordable flight will be the most difficult part of the task.

You should not worry about getting out of your driveway, planning a route to the airport etc, until you are sure you have a flight.

ABSTRIPS (Hierarchical Planning)

ABSTRIPS actually planned in a hierarchy of abstraction spaces, in each of which preconditions at a lower level of abstraction were ignored. The ABSTRIPS approach is as follows:

First solve the problem completely, considering only preconditions whose criticality value is the highest possible.

These values reflect the expected difficulty of satisfying the precondition.

To do this, do exactly what STRIPS did, but simply ignore the preconditions of lower than peak criticality.

Once this is done, use the constructed plan as the outline of a complete plan and consider preconditions at the next-lowest criticality level.

Because this approach explores entire plans at one level of detail before it looks at the lower-level details of any one of them, it has been called the length-first approach.

Other Planning Systems

Triangle tables

Metaplanning

Macro-operators

Case-based planning

UNIT V
Natural Language Processing

Introduction

Language is meant for Communicating about the world.

By studying language, we can come to understand more about the world.

We look at how we can exploit knowledge about the world, in combination with linguistic facts, to build computational natural language systems.

Introduction

The NLP problem can be divided into two tasks:

– Processing written text, using lexical, syntactic and semantic knowledge of the language as well as the required real-world information.

– Processing spoken language, using all the information needed above plus additional knowledge about phonology as well as enough added information to handle the further ambiguities that arise in speech.

Steps in NLP

Morphological Analysis: Individual words are analyzed into their components and non word tokens such as punctuation are separated from the words.

Syntactic Analysis: Linear sequences of words are transformed into structures that show how the words relate to each other.

Semantic Analysis: The structures created by the syntactic analyzer are assigned meanings.

Steps in NLP…

Discourse integration: The meaning of an individual sentence may depend on the sentences that precede it and may influence the meanings of the sentences that follow it.

Pragmatic Analysis: The structure representing what was said is reinterpreted to determine what was actually meant. For example, the sentence “Do you know what time it is?” should be interpreted as a request to tell the time.

Morphological Analysis

Suppose we have an English interface to an operating system and the following sentence is typed:

I want to print Bill’s .init file.

Morphological analysis must do the following things:

Pull apart the word “Bill’s” into the proper noun “Bill” and the possessive suffix “’s”.

Recognize the sequence “.init” as a file extension that is functioning as an adjective in the sentence.

Morphological Analysis

This process will also assign syntactic categories to all the words in the sentence.

Consider the word “prints”. This word is either a plural noun or a third person singular verb ( he prints ).

Syntactic Analysis

Syntactic analysis must exploit the results of morphological analysis to build a structural description of the sentence.

The goal of this process, called parsing, is to convert the flat list of words that forms the sentence into a structure that defines the units that are represented by that flat list.

Syntactic Analysis

The important thing here is that a flat sentence has been converted into a hierarchical structure and that the structure correspond to meaning units when semantic analysis is performed.

Reference markers are shown in the parenthesis in the parse tree.

Each one corresponds to some entity that has been mentioned in the sentence.

Syntactic Analysis

Parse tree for “I want to print Bill’s .init file.” (bracketed form; RMn are reference markers):

(S (RM1)
  (NP (PRO I (RM2)))
  (VP (V want)
      (S (RM3)
         (NP (PRO I (RM2)))
         (VP (V print)
             (NP (RM4)
                 (ADJS Bill’s (RM5))
                 (NP (ADJS .init) (N file)))))))

Semantic Analysis

Semantic analysis must do two important things:

– It must map individual words into appropriate objects in the knowledge base or database.

– It must create the correct structures to correspond to the way the meanings of the individual words combine with each other.

Discourse Integration

Specifically we do not know whom the pronoun “I” or the proper noun “Bill” refers to.

To pin down these references requires an appeal to a model of the current discourse context, from which we can learn that the current user is USER068 and that the only person named “Bill” about whom we could be talking is USER073.

Once the correct referent for Bill is known, we can also determine exactly which file is being referred to.

Pragmatic Analysis

The final step toward effective understanding is to decide what to do as a result.

One possible thing to do is to record what was said as a fact and be done with it.

For some sentences, whose intended effect is clearly declarative, that is precisely the correct thing to do.

But for other sentences, including this one, the intended effect is different.

Pragmatic Analysis

We can discover this intended effect by applying a set of rules that characterize cooperative dialogues.

The final step in pragmatic processing is to translate, from the knowledge based representation to a command to be executed by the system.

The results of the understanding process is

lpr /wsmith/stuff.init

Syntactic Processing


Syntactic processing is the step in which a flat input sentence is converted into a hierarchical structure; this conversion is called parsing.

It plays an important role in natural language understanding systems for two reasons:

Semantic processing must operate on sentence constituents. If there is no syntactic parsing step, then the semantic system must decide on its own constituents.

Syntactic parsing is computationally less expensive than semantic processing, so it can play a significant role in reducing overall system complexity.

Syntactic Processing

Although it is often possible to extract the meaning of a sentence without using grammatical facts, it is not always possible to do so. Consider the examples:

The satellite orbited MarsMars orbited the satellite

In the second sentence, syntactic facts demand an interpretation in which a planet revolves around a satellite, despite the apparent improbability of such a scenario.

Syntactic Processing

Almost all the systems that are actually used have two main components:

– A declarative representation, called a grammar, of the syntactic facts about the language.

– A procedure, called parser, that compares the grammar against input sentences to produce parsed structures.

Grammars and Parsers

The most common way to represent grammars is as a set of production rules.

The first rule below can be read as “A sentence is composed of a noun phrase followed by a verb phrase”. The vertical bar means OR; ε represents the empty string.

Symbols that are further expanded by rules are called non terminal symbols.

Symbols that correspond directly to strings that must be found in an input sentence are called terminal symbols.

Grammars and Parsers

A simple context-free phrase structure grammar for English:

S → NP VP
NP → the NP1
NP → PRO
NP → PN
NP → NP1
NP1 → ADJS N
ADJS → ε | ADJ ADJS
VP → V
VP → V NP
N → file | printer
PN → Bill
PRO → I
ADJ → short | long | fast
V → printed | created | want

A Parse tree for a sentence

Parse tree for “Bill printed the file” (bracketed form):

(S (NP (PN Bill))
   (VP (V printed)
       (NP the
           (NP1 (ADJS ε) (N file)))))

A Parse tree for a sentence

John ate the apple.

1. S → NP VP
2. VP → V NP
3. NP → NAME
4. NP → ART N
5. NAME → John
6. V → ate
7. ART → the
8. N → apple

Parse tree (bracketed form):

(S (NP (NAME John))
   (VP (V ate)
       (NP (ART the) (N apple))))

Top-down Vs Bottom-Up parsing

There are two ways this can be done:

Top-down Parsing: Begin with the start symbol and apply the grammar rules forward until the symbols at the terminals of the tree correspond to the components of the sentence being parsed.

Bottom-up Parsing: Begin with the sentence to be parsed and apply the grammar rules backward until a single tree, whose terminals are the words of the sentence and whose top node is the start symbol, has been produced.

Top-down Vs Bottom-Up parsing

The choice between these two approaches is similar to the choice between forward and backward reasoning in other problem-solving tasks.

The most important consideration is the branching factor. Is it greater going backward or forward?

Sometimes these two approaches are combined to a single method called “bottom-up parsing with top-down filtering”.
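The grammar for “John ate the apple.” above can be parsed top-down with a small backtracking recursive-descent parser. This sketch (the Python encoding is an illustrative assumption, not from the slides) tries each production in turn and backtracks when one fails:

```python
# Top-down parsing of the toy grammar S -> NP VP, VP -> V NP, NP -> NAME | ART N.
GRAMMAR = {
    "S":    [["NP", "VP"]],
    "VP":   [["V", "NP"]],
    "NP":   [["NAME"], ["ART", "N"]],
    "NAME": [["John"]],
    "V":    [["ate"]],
    "ART":  [["the"]],
    "N":    [["apple"]],
}

def parse(symbol, words, i):
    """Try to expand `symbol` at position i; return (tree, next_i) or None."""
    if symbol not in GRAMMAR:                      # terminal: match the word itself
        if i < len(words) and words[i] == symbol:
            return symbol, i + 1
        return None
    for rule in GRAMMAR[symbol]:                   # try each production in turn
        children, j = [], i
        for sym in rule:
            result = parse(sym, words, j)
            if result is None:
                break                              # this production fails; backtrack
            child, j = result
            children.append(child)
        else:
            return (symbol, children), j           # all symbols matched
    return None

tree, end = parse("S", "John ate the apple".split(), 0)
print(tree)
```

Parsing “the apple” exercises the backtracking: the NAME alternative for NP fails on “the”, so the parser falls through to ART N.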

Finding One Interpretation or Finding Many

Four ways of handling sentences:

All Paths: Follow all possible paths and build all the possible intermediate components.

Best Path with Backtracking: Follow only one path at a time, but record, at every choice point, the information that is necessary to make another choice if the chosen path fails to lead to a complete interpretation of the sentence.

Best Path with Patchup: Follow only one path at a time, but when an error is detected, explicitly shuffle around the components that have already been formed.

Wait and See: Follow only one path, but rather than making decisions about the function of each component as it is encountered, procrastinate the decision until enough information is available to make the decision correctly.

Augmented Transition Networks

ATN is a top-down parsing procedure that allows various kinds of knowledge to be incorporated into the parsing system so it can operate efficiently.

An ATN is similar to a finite state machine in which the class of labels that can be attached to the arcs defining transitions between states has been augmented.

Consider the sentence “The long file has printed”.

Augmented Transition Network…

(ATN network diagram omitted. The S network runs through states Q1 to Q5, with Q4/F and Q5/F as final states and arcs labeled NP, AUX, V, and PP; the NP network runs through states Q6 to Q8, with Q7/F and Q8/F as final states and arcs labeled Det, Adj, N, NPR, and PP; the PP network runs through states Q9 and Q10/F with arcs labeled Prep and NP.)

Parsing the sentence “The long file has printed”, execution proceeds as follows:

1. Begin in state S.
2. Push to NP.
3. Do a category test to see if “the” is a determiner.
4. This test succeeds, so set the DETERMINER register to DEFINITE and go to state Q6.
5. Do a category test to see if “long” is an adjective.
6. This test succeeds, so append “long” to the list contained in the ADJS register. Stay in state Q6.
7. Do a category test to see if “file” is an adjective. This test fails.
8. Do a category test to see if “file” is a noun. This test succeeds, so set the NOUN register to “file” and go to state Q7.
9. Push to PP.
10. Do a category test to see if “has” is a preposition. This test fails, so the PP network pops and returns failure.
11. There is nothing else that can be done from state Q7, so pop and return the structure.
12. The return causes the machine to be in state Q1, with the SUBJ register set to the structure just returned and the TYPE register set to DCL.
13. Do a category test to see if “has” is a verb. This test succeeds, so set the AUX register to NIL and set the V register to “has”. Go to state Q4.
14. Push to NP. Since the next word, “printed”, is not a determiner or a proper noun, NP will pop and return failure.
15. The only other thing to do in state Q4 is to halt. But more input remains, so a complete parse has not been found. Backtracking is now required.
16. The last choice point was at state Q1, so return there. The registers AUX and V must be unset.
17. Do a category test to see if “has” is an auxiliary. This test succeeds, so set the AUX register to “has”.
18. Do a category test to see if “printed” is a verb. This test succeeds, so set the V register to “printed”. Go to state Q4.
19. Now, since the input is exhausted, Q4 is an acceptable final state. Pop and return the structure:

(S DCL (NP (FILE (LONG) DEFINITE)) HAS (VP PRINTED))

This structure is the output of the parse.

ATNs can also be used in a variety of ways:

The contents of registers can be swapped. If the network were expanded to recognize passive sentences, then at the point the passive was detected, the current contents of the SUBJ register would be transferred to an OBJ register, and the object of the preposition “by” would be placed in the SUBJ register.

Bill printed the file.
The file was printed by Bill.

Arbitrary tests can be placed on the arcs. In each of the arcs above, the test is specified simply as T, but this need not be the case. Suppose that when the first NP is found, its number is determined and recorded in a register called NUMBER. Then the arcs labeled V could have an additional test placed on them that checks that the number of the particular verb that was found is equal to the value stored in NUMBER.

Unification Grammar

Unification grammars limit the procedurality of grammar formalisms, which is useful in:
Speech processing
Understanding and generating from the same grammar

The major operations a parser performs while applying a grammar are:
Matching (of sentence constituents to grammar rules)
Building structure (corresponding to the result of combining constituents)

Unification Grammar

A DAG (directed acyclic graph) can be used to define the unification operator. Each DAG represents a set of attribute-value pairs.

Ex:
[CAT: DET        [CAT: N
 LEX: the]        LEX: file
                  NUMBER: SING]

Result of combining these two words:
[NP: [DET: the
      HEAD: file
      NUMBER: SING]]

We describe the NP rule as:
NP -> DET N

The rule as a graph:
[CONSTITUENT1: [CAT: DET
                LEX: {1}]
 CONSTITUENT2: [CAT: N
                LEX: {2}
                NUMBER: {3}]
 BUILD: [NP: [DET: {1}
              HEAD: {2}
              NUMBER: {3}]]]

Note that the order in which attribute-value pairs are stated does not matter.
Ex:
[CAT: DET     should match a constituent such as     [LEX: the
 LEX: {1}]                                            CAT: DET]

Algorithm: Graph-Unify(G1, G2)

1. If either G1 or G2 is an attribute value that is not itself an attribute-value pair, then:
   (a) If the attributes conflict, then fail.
   (b) If either is a variable, then bind it to the value of the other and return that value.
   (c) Otherwise, return the most general value that is consistent with both of the original values. Specifically, if disjunction is allowed, then return the intersection of the values.
2. Otherwise, do:
   (a) Set the variable NEW to empty.
   (b) For each attribute A that is present (at the top level) in either G1 or G2:
       If A is not present at the top level in the other input, then add A and its value to NEW.
       If it is, then call Graph-Unify with the two values for A. If that fails, then fail. Otherwise, take the new value of A to be the result of that unification and add A with its value to NEW.
   (c) If there are any labels attached to G1 or G2, then bind them to NEW and return NEW.
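The algorithm above can be sketched over plain Python dicts. This is a minimal sketch under stated assumptions: atomic values unify only if equal, `None` stands in for an unbound variable, and the full algorithm's labels ({1}, {2}, …) and disjunction are not modelled.

```python
# Minimal Graph-Unify sketch over nested dicts. Assumptions: atomic
# values unify iff equal; None plays the role of an unbound variable;
# labels and disjunction from the full algorithm are omitted.

def graph_unify(g1, g2):
    if not (isinstance(g1, dict) and isinstance(g2, dict)):
        if g1 is None:                 # variable: bind to the other value
            return g2
        if g2 is None:
            return g1
        if g1 == g2:
            return g1
        raise ValueError(f"conflicting values: {g1!r} vs {g2!r}")
    new = {}                           # step 2: build NEW attribute by attribute
    for attr in set(g1) | set(g2):
        if attr not in g2:
            new[attr] = g1[attr]       # present only in G1
        elif attr not in g1:
            new[attr] = g2[attr]       # present only in G2
        else:                          # present in both: unify recursively
            new[attr] = graph_unify(g1[attr], g2[attr])
    return new

noun = {"CAT": "N", "LEX": "file", "NUMBER": "SING"}
pattern = {"CAT": "N", "NUMBER": None}   # rule constituent with a variable NUMBER
print(graph_unify(noun, pattern))
```

Unifying the word's structure with the rule's constituent pattern binds the variable NUMBER to SING and keeps LEX, which is present in only one input; a DET structure would fail against this pattern because CAT conflicts.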

Semantic Analysis

Producing a syntactic parse of a sentence is only the first step toward understanding it. We must still produce a representation of the meaning of the sentence.

Because understanding is a mapping process, we must first define the language into which we are trying to map. There is no single definitive language in which all sentence meanings can be described.

The choice of a target language for any particular natural language understanding program must depend on what is to be done with the meanings once they are constructed.

Choice of target language in semantic analysis

There are two broad families of target languages used in NL systems, depending on the role the natural language system plays in a larger system:

When natural language is being considered as a phenomenon on its own, as for example when one builds a program whose goal is to read text and then answer questions about it, a target language can be designed specifically to support language processing.

When natural language is being used as an interface language to another program (such as a database query system or an expert system), the target language must be legal input to that other program. Thus the design of the target language is driven by the backend program.

Lexical processing

The first step in any semantic processing system is to look up the individual words in a dictionary (or lexicon) and extract their meanings.

Many words have several meanings, and it may not be possible to choose the correct one just by looking at the word itself.

The process of determining the correct meaning of an individual word is called word sense disambiguation or lexical disambiguation. It is done by associating, with each word in the lexicon, information about the contexts in which each of the word’s senses may appear.

Lexical processing…

For example, the word “diamond” might have the following set of meanings:
A geometrical shape with four equal sides.
A baseball field.
An extremely strong and valuable gemstone.

To select the correct meaning of “diamond” in the sentence

Joan saw Susan’s diamond shining from across the room.

it suffices to know that neither geometrical shapes nor baseball fields shine, but gemstones do.

Lexical processing…


The baseball-field interpretation of “diamond”, for example, can be marked with the semantic marker LOCATION.

Some useful semantic markers are:
PHYSICAL-OBJECT
ANIMATE-OBJECT
ABSTRACT-OBJECT
TIME
LOCATION
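Marker-based disambiguation can be sketched as follows. This is a toy illustration: the sense list for “diamond” and the idea that the context (here, “shining”) supplies a required marker are assumptions made for the example.

```python
# Toy lexicon for marker-based disambiguation: each sense of a word
# carries one of the semantic markers listed above (illustrative data).
SENSES = {
    "diamond": [
        ("geometrical shape", "ABSTRACT-OBJECT"),
        ("baseball field", "LOCATION"),
        ("gemstone", "PHYSICAL-OBJECT"),
    ],
}

def disambiguate(word, required_marker):
    """Return the first sense whose semantic marker satisfies the context."""
    for gloss, marker in SENSES.get(word, []):
        if marker == required_marker:
            return gloss
    return None

# "shining" selects for something physical, so the gemstone sense wins:
print(disambiguate("diamond", "PHYSICAL-OBJECT"))
```

A verb like “shining” would contribute the requirement PHYSICAL-OBJECT, which rules out the shape and baseball-field senses.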

Sentence-Level Processing

Several approaches to the problem of creating a semantic representation of a sentence have been developed, including the following:

Semantic grammars, which combine syntactic, semantic, and pragmatic knowledge into a single set of rules in the form of a grammar.

Case grammars, in which the structure that is built by the parser contains some semantic information, although further interpretation may also be necessary.

Conceptual parsing, in which syntactic and semantic knowledge are combined into a single interpretation system that is driven by the semantic knowledge.

Approximately compositional semantic interpretation, in which semantic processing is applied to the result of performing a syntactic parse.

Semantic grammars

A semantic grammar is a context-free grammar in which the choice of nonterminals and production rules is governed by semantic as well as syntactic function.

There is usually a semantic action associated with each grammar rule.

The result of parsing and applying all the associated semantic actions is the meaning of the sentence.

Example

S -> what is FILE-PROPERTY of FILE    {query FILE.FILE-PROPERTY}
S -> I want to ACTION                 {command ACTION}
FILE-PROPERTY -> the FILE-PROP        {FILE-PROP}
FILE-PROP -> extension | protection | creation date | owner    {value}
FILE -> FILE-NAME | FILE1             {value}
FILE1 -> USER's FILE2                 {FILE2.owner: USER}
FILE1 -> FILE2                        {FILE2}
FILE2 -> EXT file                     {instance: file-struct, extension: EXT}
EXT -> .init | .txt | .lsp | .for | .ps | .mss    {value}
ACTION -> print FILE                  {instance: printing, object: FILE}
ACTION -> print FILE on PRINTER       {instance: printing, object: FILE, printer: PRINTER}
USER -> Bill | Susan                  {value}

Parsing the sentence

I want to print Bill's .init file

with this grammar produces the structure:

S {command: {instance: printing
             object: {instance: file-struct
                      extension: .init
                      owner: Bill}}}

fig: The Result of Parsing with a Semantic Grammar
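For this one sentence pattern, the effect of the rules and their semantic actions can be approximated by a hypothetical matcher. The function name is invented, and a regular expression stands in for the derivation S -> I want to ACTION, ACTION -> print FILE, FILE1 -> USER's FILE2, FILE2 -> EXT file; a real semantic-grammar parser applies the rules and actions compositionally.

```python
import re

def parse_command(sentence):
    """Hypothetical matcher approximating one derivation of the grammar
    above; the regex stands in for the rule sequence, and the returned
    dict is the frame built by the semantic actions."""
    m = re.fullmatch(r"I want to print (\w+)'s (\.\w+) file", sentence)
    if m is None:
        return None
    user, ext = m.groups()
    return {"command": {"instance": "printing",
                        "object": {"instance": "file-struct",
                                   "extension": ext,
                                   "owner": user}}}

print(parse_command("I want to print Bill's .init file"))
```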

Semantic grammars

The advantages of semantic grammars are:
When the parse is complete, the result can be used immediately, without an additional stage of processing.
Many ambiguities that would arise during a strictly syntactic parse can be avoided.
Syntactic issues that do not affect the semantics can be ignored.

The drawbacks of semantic grammars are:
The number of rules required can become very large, since many syntactic generalizations are missed.
Because the number of grammar rules may be very large, the parsing process may be expensive.

Case grammars

Case grammars provide a different approach to the problem of how syntactic and semantic interpretation can be combined. Grammar rules are written to describe syntactic rather than semantic regularities, but the structures the rules produce correspond to semantic relations rather than to strictly syntactic ones.

Consider two sentences:
Susan printed the file.
The file was printed by Susan.

Case grammars

The case grammar interpretation of both sentences would be:

(printed (agent Susan)
         (object file))
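This mapping from two surface forms to one frame can be sketched with a small, invented helper: in a passive sentence the syntactic subject fills the object case and the “by” phrase fills the agent case.

```python
# Sketch (hypothetical helper, not a full case-grammar parser): active
# and passive surface forms map to the same case frame, because in the
# passive the subject fills OBJECT and the "by" phrase fills AGENT.

def build_frame(subject, verb, obj=None, by_phrase=None, passive=False):
    agent, patient = (by_phrase, subject) if passive else (subject, obj)
    frame = {"verb": verb}
    if agent is not None:
        frame["agent"] = agent
    if patient is not None:
        frame["object"] = patient
    return frame

active = build_frame("Susan", "printed", obj="the file")
passive = build_frame("the file", "printed", by_phrase="Susan", passive=True)
print(active == passive)
```

The same helper also covers agentless frames such as (baked (agent Mother)), where no object is supplied.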

fig: Syntactic parse trees (S, NP, VP, V, N, P nodes) for “Susan printed the file” and “The file was printed by Susan”

Case grammars

Mother baked for three hours.
(baked (agent Mother) (timeperiod 3-hours))

The pie baked for three hours.
(baked (object pie) (timeperiod 3-hours))

fig: Syntactic parse trees (S, NP, VP, V, PP nodes) for “The pie baked for three hours” and “Mother baked for three hours”

Conceptual Parsing

Conceptual parsing is a strategy for finding both the structure and meaning of a sentence in one step. It is driven by a dictionary that describes the meanings of words as conceptual dependency (CD) structures.

The parsing is similar to case grammar parsing, but CD usually provides a greater degree of predictive power.
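A CD dictionary entry can be represented as a slot-filling structure built on a primitive act. The sketch below is hypothetical (the slot names and the helper are invented for illustration); ATRANS, the transfer of an abstract relationship such as possession, is one of the CD primitive acts that also appears in the Verb-ACT dictionary.

```python
# Hypothetical representation of one CD dictionary entry: the verb
# "give" maps to the primitive act ATRANS, whose slots (actor, object,
# from, to) the parser predicts and fills. Slot names are illustrative.

def cd_give(actor, obj, recipient):
    """CD structure for 'actor gave recipient obj', built on ATRANS."""
    return {"act": "ATRANS", "actor": actor, "object": obj,
            "from": actor, "to": recipient}

print(cd_give("John", "book", "Mary"))
```

The predictive power of CD comes from these slots: having seen “gave”, the parser expects a human recipient and a transferable object before reading the rest of the sentence.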

Conceptual Parsing

fig: The Verb-ACT Dictionary — CD senses of the verb “want” (a stative concept, with a main noun object, whose conceptual frame is “x pleased”), with transitive and intransitive readings for human subjects expanded using the primitive acts ATRANS and PTRANS.

Discourse and Pragmatic Processing

There are a number of important relationships that may hold between phrases and parts of their discourse contexts, including:

Identical entities. Consider the text:
Bill had a red balloon.
John wanted it.

Parts of entities. Consider the text:
Sue opened the book she just bought.
The title page was torn.

Parts of actions. Consider the text:
John went on a business trip to NY.
He left on an early morning flight.

Entities involved in actions. Consider the text:
My house was broken into last week.
They took the TV and the stereo.

Names of individuals. Consider the text:
Dave went to the movies.

Causal chains. Consider the text:
There was a big storm yesterday.
The schools were closed today.

Planning sequences. Consider the text:
Sally wanted a new car.
She decided to get a job.

Implicit presuppositions. Consider the text:
Did Joe fail CS101?

Discourse and Pragmatic Processing

The kinds of knowledge used:
The current focus of the dialogue
A model of each participant’s current beliefs
The goal-driven character of dialogue
The rules of conversation shared by all participants

In using focus in understanding, there are two important parts of using knowledge to facilitate understanding:
Focus on the relevant part(s) of the available knowledge base
Use that knowledge to resolve ambiguities and to make connections among things that were said

The End

Reference:
1. Elaine Rich and Kevin Knight, Artificial Intelligence
2. Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach
