
Learning Language Semantics from Ambiguous Supervision


Page 1: Learning Language Semantics from Ambiguous Supervision

Machine Learning Group, Department of Computer Sciences

University of Texas at Austin

Learning Language Semantics from Ambiguous Supervision

Rohit J. Kate Raymond J. Mooney

Page 2: Learning Language Semantics from Ambiguous Supervision

2

Semantic Parsing

• Involves learning language semantics to transform natural language (NL) sentences into computer-executable, complete meaning representations (MRs) for some application

• Geoquery: An example database query application

Which rivers run through the states bordering Texas?

↓ Semantic Parsing

Query: answer(traverse(next_to(stateid(‘texas’))))

Answer: Arkansas, Canadian, Cimarron, Gila, Mississippi, Rio Grande …
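To make "computer executable" concrete, here is a minimal toy sketch that evaluates this query directly as Python. The data and function names are illustrative assumptions only; the actual Geoquery system executes such queries against a Prolog database of U.S. geography.

```python
# Toy stand-in for the Geoquery database and its functional query language.
# All data and names here are illustrative, not the real Geoquery backend.
BORDERS = {"texas": ["new mexico", "oklahoma", "arkansas", "louisiana"]}
RIVERS = {                       # river -> states it runs through
    "arkansas":   ["colorado", "kansas", "oklahoma", "arkansas"],
    "canadian":   ["new mexico", "texas", "oklahoma"],
    "rio grande": ["colorado", "new mexico", "texas"],
}

def stateid(name):               # STATEID -> 'texas'
    return name

def next_to(state):              # states bordering the given state
    return BORDERS.get(state, [])

def traverse(states):            # rivers that run through any of the states
    return sorted(r for r, through in RIVERS.items()
                  if any(s in through for s in states))

def answer(result):              # top-level wrapper
    return result

# The MR from the slide, executed directly:
print(answer(traverse(next_to(stateid("texas")))))
# -> ['arkansas', 'canadian', 'rio grande']
```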

Page 3: Learning Language Semantics from Ambiguous Supervision

3

Learning for Semantic Parsing

• Learning for semantic parsing consists of inducing, from training data, a semantic parser that can map novel sentences into their meaning representations

• Many accurate learning systems for semantic parsing have been recently developed: [Ge & Mooney, 2005], [Zettlemoyer & Collins, 2005], [Wong & Mooney, 2006], [Kate & Mooney, 2006], [Nguyen, Shimazu & Phan, 2006]

Page 4: Learning Language Semantics from Ambiguous Supervision

4

Unambiguous Supervision for Learning Semantic Parsers

• The training data for semantic parsing consists of hundreds of natural language sentences unambiguously paired with their meaning representations

Page 5: Learning Language Semantics from Ambiguous Supervision

5

Unambiguous Supervision for Learning Semantic Parsers

• The training data for semantic parsing consists of hundreds of natural language sentences unambiguously paired with their meaning representations

Which rivers run through the states bordering Texas?

answer(traverse(next_to(stateid(‘texas’))))

What is the lowest point of the state with the largest area?

answer(lowest(place(loc(largest_one(area(state(all)))))))

What is the largest city in states that border California?

answer(largest(city(loc(next_to(stateid('california'))))))

……

Page 6: Learning Language Semantics from Ambiguous Supervision

6

Shortcomings of Unambiguous Supervision

• It requires considerable human effort to annotate each sentence with its correct meaning representation

• Does not model the type of supervision children receive when they are learning a language
  – Children are not taught meanings of individual sentences
  – They learn to identify the correct meaning of a sentence from several meanings possible in their perceptual context

Page 7: Learning Language Semantics from Ambiguous Supervision

7

???

“Mary is on the phone”

Page 8: Learning Language Semantics from Ambiguous Supervision

8

Ambiguous Supervision for Learning Semantic Parsers

• A computer system simultaneously exposed to perceptual contexts and natural language utterances should be able to learn the underlying language semantics

• We consider ambiguous training data of sentences associated with multiple potential meaning representations
  – Siskind (1996) uses this type of "referentially uncertain" training data to learn meanings of words

• Capturing meaning representations from perceptual contexts is a difficult, unsolved problem
  – Our system works directly with symbolic meaning representations

Page 9: Learning Language Semantics from Ambiguous Supervision

9

“Mary is on the phone”

???

Page 10: Learning Language Semantics from Ambiguous Supervision

10

“Mary is on the phone”

???

Page 11: Learning Language Semantics from Ambiguous Supervision

11

Ironing(Mommy, Shirt)

“Mary is on the phone”

???

Page 12: Learning Language Semantics from Ambiguous Supervision

12

Ironing(Mommy, Shirt)

Working(Sister, Computer)

“Mary is on the phone”

???

Page 13: Learning Language Semantics from Ambiguous Supervision

13

Ironing(Mommy, Shirt)

Working(Sister, Computer)

Carrying(Daddy, Bag)

“Mary is on the phone”

???

Page 14: Learning Language Semantics from Ambiguous Supervision

14

Ironing(Mommy, Shirt)

Working(Sister, Computer)

Carrying(Daddy, Bag)

Talking(Mary, Phone)

Sitting(Mary, Chair)

“Mary is on the phone”

Ambiguous Training Example

???

Page 15: Learning Language Semantics from Ambiguous Supervision

15

Ironing(Mommy, Shirt)

Working(Sister, Computer)

Talking(Mary, Phone)

Sitting(Mary, Chair)

“Mommy is ironing shirt”

Next Ambiguous Training Example

???

Page 16: Learning Language Semantics from Ambiguous Supervision

16

Ambiguous Supervision for Learning Semantic Parsers contd.

• Our model of ambiguous supervision corresponds to the type of data that will be gathered from a temporal sequence of perceptual contexts with occasional language commentary

• We assume each sentence has exactly one meaning in a perceptual context

• Each meaning is associated with at most one sentence in a perceptual context

Page 17: Learning Language Semantics from Ambiguous Supervision

17

Sample Ambiguous Corpus

Sentences:

Daisy gave the clock to the mouse.

Mommy saw that Mary gave the hammer to the dog.

The dog broke the box.

John gave the bag to the mouse.

The dog threw the ball.

Candidate meaning representations:

ate(mouse, orange)

gave(daisy, clock, mouse)

ate(dog, apple)

saw(mother, gave(mary, dog, hammer))

broke(dog, box)

gave(woman, toy, mouse)

gave(john, bag, mouse)

threw(dog, ball)

runs(dog)

saw(john, walks(man, dog))

Forms a bipartite graph
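A minimal sketch of how such a corpus can be held in code: each sentence is linked to the candidate MRs from its perceptual context, and the links are the edges of the bipartite graph. The candidate sets below are made up for illustration (the actual edges are defined by the slide's figure), and MRs are kept as plain strings for simplicity.

```python
# Ambiguous corpus: (sentence, candidate MRs) pairs; candidate sets are
# illustrative only, the real edges come from the perceptual contexts.
ambiguous_corpus = [
    ("Daisy gave the clock to the mouse.",
     ["ate(mouse, orange)", "gave(daisy, clock, mouse)", "ate(dog, apple)"]),
    ("The dog broke the box.",
     ["saw(mother, gave(mary, dog, hammer))", "broke(dog, box)",
      "gave(woman, toy, mouse)"]),
    ("The dog threw the ball.",
     ["threw(dog, ball)", "runs(dog)", "saw(john, walks(man, dog))"]),
]

# The corpus forms a bipartite graph: sentence nodes, MR nodes, and an edge
# for every candidate pairing.
edges = [(sentence, mr) for sentence, mrs in ambiguous_corpus for mr in mrs]
```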

Page 18: Learning Language Semantics from Ambiguous Supervision

18

Rest of the Talk

• Brief background on KRISP, a semantic parser learner for unambiguous supervision

• KRISPER: Extended system to handle ambiguous supervision

• Corpus construction

• Experiments

Page 19: Learning Language Semantics from Ambiguous Supervision

19

KRISP: Semantic Parser Learner for Unambiguous Supervision

• KRISP: Kernel-based Robust Interpretation for Semantic Parsing [Kate & Mooney 2006]

• Takes NL sentences unambiguously paired with their MRs as training data

• Treats the formal MR language grammar’s productions as semantic concepts

• Trains an SVM classifier for each production with string subsequence kernel [Lodhi et al. 2002]
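To make the last bullet concrete, here is a rough sketch of training one classifier per production: a binary SVM with a precomputed string kernel over word sequences. The kernel below is a simplified common-n-gram (spectrum) kernel rather than Lodhi et al.'s gapped subsequence kernel, and scikit-learn plus the training-data format are assumptions for illustration, not KRISP's actual implementation.

```python
import numpy as np
from sklearn.svm import SVC

def ngram_kernel(a, b, max_n=3):
    """Simplified word-level string kernel: count of shared n-grams (n <= max_n).
    A stand-in for the gapped subsequence kernel of Lodhi et al. (2002)."""
    wa, wb = a.lower().split(), b.lower().split()
    score = 0.0
    for n in range(1, max_n + 1):
        grams_a = [tuple(wa[i:i + n]) for i in range(len(wa) - n + 1)]
        grams_b = [tuple(wb[i:i + n]) for i in range(len(wb) - n + 1)]
        score += sum(grams_a.count(g) * grams_b.count(g)
                     for g in set(grams_a) & set(grams_b))
    return score

def train_production_classifiers(examples):
    """examples: {production: [(phrase, label), ...]}, label 1 if the phrase
    expresses the production's semantic concept.  One SVM per production."""
    classifiers = {}
    for production, data in examples.items():
        phrases, labels = zip(*data)
        gram = np.array([[ngram_kernel(x, y) for y in phrases] for x in phrases])
        clf = SVC(kernel="precomputed", probability=True).fit(gram, labels)
        classifiers[production] = (clf, phrases)
    return classifiers

def production_probability(classifiers, production, phrase):
    """Probability that `phrase` expresses the production's semantic concept."""
    clf, train_phrases = classifiers[production]
    k = np.array([[ngram_kernel(phrase, t) for t in train_phrases]])
    return clf.predict_proba(k)[0, 1]
```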

Page 20: Learning Language Semantics from Ambiguous Supervision

20

MR: answer(traverse(next_to(stateid(‘texas’))))

Parse tree of the MR (figure): each internal node corresponds to one production of the MR grammar.

Meaning Representation Language productions used in the parse:

ANSWER → answer(RIVER)
RIVER → TRAVERSE(STATE)
STATE → NEXT_TO(STATE)
STATE → STATEID
TRAVERSE → traverse
NEXT_TO → next_to
STATEID → ‘texas’
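Since these productions are the semantic concepts to be learned, a small sketch of holding them as data follows, with the example MR's parse as nested (production, children) tuples. This representation is an assumption for illustration, not KRISP's internal one.

```python
# Productions of the MR grammar used in the example (left-hand side, right-hand side).
PRODUCTIONS = [
    ("ANSWER", "answer(RIVER)"),
    ("RIVER", "TRAVERSE(STATE)"),
    ("TRAVERSE", "traverse"),
    ("STATE", "NEXT_TO(STATE)"),
    ("NEXT_TO", "next_to"),
    ("STATE", "STATEID"),
    ("STATEID", "'texas'"),
]

# Parse tree of answer(traverse(next_to(stateid('texas')))) as nested
# (production, children) tuples, one node per production applied.
mr_parse = ("ANSWER -> answer(RIVER)", [
    ("RIVER -> TRAVERSE(STATE)", [
        ("TRAVERSE -> traverse", []),
        ("STATE -> NEXT_TO(STATE)", [
            ("NEXT_TO -> next_to", []),
            ("STATE -> STATEID", [
                ("STATEID -> 'texas'", []),
            ]),
        ]),
    ]),
])
```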

Page 21: Learning Language Semantics from Ambiguous Supervision

21

Semantic Parsing by KRISP

• SVM classifier for each production gives the probability that a substring represents the semantic concept of the production

Which rivers run through the states bordering Texas?

(Figure: the classifier for NEXT_TO → next_to applied to different substrings of the sentence, giving probabilities 0.02, 0.01, and 0.95.)

Page 22: Learning Language Semantics from Ambiguous Supervision

22

Semantic Parsing by KRISP

• SVM classifier for each production gives the probability that a substring represents the semantic concept of the production

Which rivers run through the states bordering Texas?

(Figure: the classifier for TRAVERSE → traverse applied to different substrings, giving probabilities 0.21 and 0.91.)

Page 23: Learning Language Semantics from Ambiguous Supervision

23

Semantic Parsing by KRISP

• Semantic parsing is done by finding the most probable derivation of the sentence [Kate & Mooney 2006]

Which rivers run through the states bordering Texas?

ANSWER → answer(RIVER)
RIVER → TRAVERSE(STATE)
TRAVERSE → traverse
STATE → NEXT_TO(STATE)
NEXT_TO → next_to
STATE → STATEID
STATEID → ‘texas’

(Figure: the derivation tree over these productions, with a classifier probability at each node: 0.91, 0.95, 0.89, 0.92, 0.81, 0.98, 0.99.)

Probability of the derivation is the product of the probabilities at the nodes.
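Because the derivation's probability is just the product of its node probabilities, a tiny sketch makes the computation explicit. The values below reuse the slide's numbers, but their assignment to particular productions is not recoverable from the figure, so the pairing is an assumption.

```python
import math

# (production, node probability, children); the pairing of probabilities to
# productions is illustrative only.
derivation = ("ANSWER -> answer(RIVER)", 0.98, [
    ("RIVER -> TRAVERSE(STATE)", 0.95, [
        ("TRAVERSE -> traverse", 0.91, []),
        ("STATE -> NEXT_TO(STATE)", 0.92, [
            ("NEXT_TO -> next_to", 0.89, []),
            ("STATE -> STATEID", 0.99, [
                ("STATEID -> 'texas'", 0.81, []),
            ]),
        ]),
    ]),
])

def derivation_probability(node):
    """Product of the probabilities at every node of the derivation tree."""
    _production, p, children = node
    return p * math.prod(derivation_probability(child) for child in children)

print(round(derivation_probability(derivation), 3))   # -> 0.556
```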

Page 24: Learning Language Semantics from Ambiguous Supervision

24

Semantic Parsing by KRISP

• Given a sentence and a meaning representation, KRISP can also find the probability that it is the correct meaning representation for the sentence

Page 25: Learning Language Semantics from Ambiguous Supervision

25

KRISPER: KRISP with EM-like Retraining

• Extension of KRISP that learns from ambiguous supervision

• Uses an iterative EM-like method to gradually converge on a correct meaning for each sentence

Page 26: Learning Language Semantics from Ambiguous Supervision

26

KRISPER’s Training Algorithm

Daisy gave the clock to the mouse.

Mommy saw that Mary gave the hammer to the dog.

The dog broke the box.

John gave the bag to the mouse.

The dog threw the ball.

ate(mouse, orange)

gave(daisy, clock, mouse)

ate(dog, apple)

saw(mother, gave(mary, dog, hammer))

broke(dog, box)

gave(woman, toy, mouse)

gave(john, bag, mouse)

threw(dog, ball)

runs(dog)

saw(john, walks(man, dog))

1. Assume every possible meaning for a sentence is correct

Page 27: Learning Language Semantics from Ambiguous Supervision

27

KRISPER’s Training Algorithm

(Same sentences and candidate MRs as on the previous slide, now with every sentence linked to each of its candidate MRs.)

1. Assume every possible meaning for a sentence is correct

Page 28: Learning Language Semantics from Ambiguous Supervision

28

KRISPER’s Training Algorithm contd.

(Same sentences and candidate MRs as on the previous slide.)

2. Resulting NL-MR pairs are weighted and given to KRISP

(Figure: each sentence's candidate MRs share its weight equally, e.g. 1/2, 1/4, 1/5, or 1/3 depending on the number of candidates.)

Page 29: Learning Language Semantics from Ambiguous Supervision

29

KRISPER’s Training Algorithm contd.

(Same sentences and candidate MRs as on the previous slide.)

3. Estimate the confidence of each NL-MR pair using the resulting parser

Page 30: Learning Language Semantics from Ambiguous Supervision

30

KRISPER’s Training Algorithm contd.

(Same sentences and candidate MRs as on the previous slide.)

3. Estimate the confidence of each NL-MR pair using the resulting parser

(Figure: the parser's estimated confidence on each sentence-MR edge; the values shown range from 0.11 to 0.97.)

Page 31: Learning Language Semantics from Ambiguous Supervision

31

KRISPER’s Training Algorithm contd.

(Same sentences and candidate MRs as on the previous slide.)

4. Use maximum weighted matching on a bipartite graph to find the best NL-MR pairs [Munkres, 1957]

(Figure: the estimated edge confidences from step 3, repeated for reference.)

Page 32: Learning Language Semantics from Ambiguous Supervision

32

KRISPER’s Training Algorithm contd.

(Same sentences and candidate MRs as on the previous slide.)

4. Use maximum weighted matching on a bipartite graph to find the best NL-MR pairs [Munkres, 1957]

(Figure: the same edge confidences, with the best matching of sentences to MRs selected.)

Page 33: Learning Language Semantics from Ambiguous Supervision

33

KRISPER’s Training Algorithm contd.

(Same sentences and candidate MRs as on the previous slide.)

5. Give the best pairs to KRISP in the next iteration; continue until convergence
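Putting steps 1-5 together, here is a compact sketch of the EM-like loop. The `train(weighted_pairs)` and `confidence(sentence, mr)` interfaces are assumptions standing in for a KRISP-style learner, a fixed iteration count stands in for "until convergence", one global matching is used for simplicity, and SciPy (whose linear_sum_assignment implements the Hungarian/Munkres algorithm) is an assumed dependency.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def krisper_train(corpus, learner, iterations=5):
    """EM-like retraining sketch.  corpus: list of (sentence, candidate_mrs).
    learner: assumed KRISP-like object with train(weighted_pairs) and
    confidence(sentence, mr); these interfaces are illustrative only."""
    sentences = [s for s, _ in corpus]
    all_mrs = sorted({mr for _, mrs in corpus for mr in mrs})

    # Steps 1-2: assume every candidate MR is correct, weight each pair 1/k.
    pairs = [(s, mr, 1.0 / len(mrs)) for s, mrs in corpus for mr in mrs]
    learner.train(pairs)

    for _ in range(iterations):
        # Step 3: score every sentence / candidate-MR edge with the parser.
        cost = np.full((len(sentences), len(all_mrs)), 1e6)   # non-edges: huge cost
        for i, (s, mrs) in enumerate(corpus):
            for mr in mrs:
                cost[i, all_mrs.index(mr)] = -learner.confidence(s, mr)

        # Step 4: maximum weighted bipartite matching (Hungarian / Munkres).
        rows, cols = linear_sum_assignment(cost)
        best = [(sentences[i], all_mrs[j], 1.0)
                for i, j in zip(rows, cols) if cost[i, j] < 1e6]

        # Step 5: retrain on the best pairs and repeat.
        learner.train(best)
    return learner
```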

Page 34: Learning Language Semantics from Ambiguous Supervision

34

Corpus Construction

• To our knowledge, no real-world ambiguous corpus is yet available for semantic parsing

• We artificially obfuscated the real-world unambiguous corpus by adding extra distracter MRs to each training pair (Ambig-Geoquery)

• We also created an artificial ambiguous corpus (Ambig-ChildWorld) which more accurately models real-world ambiguities in which potential candidate MRs are often related

Page 35: Learning Language Semantics from Ambiguous Supervision

35

Ambig-Geoquery Corpus

Start with the unambiguous Geoquery corpus.

(Figure: a sequence of NL sentences, each paired with its single correct MR.)

Page 36: Learning Language Semantics from Ambiguous Supervision

36

Ambig-Geoquery Corpus

Insert between zero and a fixed maximum number of random MRs from the corpus between each consecutive pair.

(Figure: the original NL-MR pairs interleaved with the inserted distracter MRs.)

Page 37: Learning Language Semantics from Ambiguous Supervision

37

Ambig-Geoquery Corpus

Form a window of width from zero up to a fixed maximum in either direction around each NL sentence's position in the MR sequence.

(Figure: a window over the MR sequence around each sentence's own MR.)

Page 38: Learning Language Semantics from Ambiguous Supervision

38

Ambig-Geoquery Corpus

Form the ambiguous corpus: each NL sentence is paired with every MR that falls within its window (a code sketch of this construction follows below).

(Figure: the resulting links from each sentence to its candidate MRs.)
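A hedged sketch of the whole construction, under the assumptions above: distracter MRs are inserted between consecutive pairs and each sentence is then paired with the MRs inside a random window around its own MR. The parameter names (max_insert, max_window) stand in for the slide's symbols, which were lost in extraction.

```python
import random

def make_ambiguous(pairs, max_insert=2, max_window=2, seed=0):
    """Sketch of the Ambig-Geoquery construction.  pairs: [(NL, MR)] from the
    unambiguous corpus.  Returns [(NL, [candidate MRs])]."""
    rng = random.Random(seed)
    all_mrs = [mr for _, mr in pairs]

    # Build the MR stream: each true MR preceded by 0..max_insert distracters.
    stream, positions = [], {}             # positions: sentence index -> MR index
    for i, (nl, mr) in enumerate(pairs):
        stream += rng.sample(all_mrs, rng.randint(0, max_insert))
        positions[i] = len(stream)
        stream.append(mr)

    # Pair each sentence with every MR inside a window around its own MR.
    ambiguous = []
    for i, (nl, _) in enumerate(pairs):
        w = rng.randint(0, max_window)
        lo, hi = max(0, positions[i] - w), positions[i] + w + 1
        ambiguous.append((nl, stream[lo:hi]))
    return ambiguous
```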

Page 39: Learning Language Semantics from Ambiguous Supervision

39

Ambig-ChildWorld Corpus

• Although the Ambig-Geoquery corpus uses real-world NL-MR pairs, it does not model the relatedness among a sentence's potential MRs that is common in perceptual contexts

• Constructed a synchronous grammar [Aho & Ullman, 1972] to simultaneously generate artificial NL-MR pairs (a toy sketch follows after this list)

• Uses 15 verbs and 37 nouns (people, animals, things); MRs are in predicate logic without quantifiers
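A toy synchronous grammar in the spirit of the slide: each rule pairs an NL template with an MR template, and a nonterminal is expanded with the same choice on both sides. The specific rules and vocabulary below are made up for illustration; the real grammar covers 15 verbs and 37 nouns.

```python
import random

# Each rule rewrites a nonterminal into a paired (NL template, MR template).
RULES = {
    "S": [("PERSON is ironing THING",        "ironing(PERSON, THING)"),
          ("PERSON threw THING",             "threw(PERSON, THING)"),
          ("PERSON gave THING to PERSON2",   "gave(PERSON, THING, PERSON2)")],
    "PERSON":  [("Mommy", "mommy"), ("Daddy", "daddy"), ("Mary", "mary")],
    "PERSON2": [("the dog", "dog"), ("Mary", "mary")],
    "THING":   [("the shirt", "shirt"), ("the ball", "ball"), ("the bag", "bag")],
}

def generate(symbol="S", rng=random):
    """Expand a symbol synchronously, returning an (NL sentence, MR) pair.
    Replacing the first occurrence on both sides keeps the templates aligned."""
    nl, mr = rng.choice(RULES[symbol])
    for nt in ("PERSON2", "PERSON", "THING"):      # longer names first
        while nt in nl:
            sub_nl, sub_mr = generate(nt, rng)
            nl, mr = nl.replace(nt, sub_nl, 1), mr.replace(nt, sub_mr, 1)
    return nl, mr

print(generate())   # e.g. ('Mary threw the ball', 'threw(mary, ball)')
```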

Page 40: Learning Language Semantics from Ambiguous Supervision

40

Ambig-ChildWorld Corpus contd.

• Different perceptual contexts were modeled by choosing subsets of productions of the synchronous grammar

• This leads to subsets of verbs and nouns (e.g. only Mommy, Daddy, and Mary), causing more relatedness among the potential MRs

• For each such perceptual context, data was generated in a way similar to the Ambig-Geoquery corpus

Page 41: Learning Language Semantics from Ambiguous Supervision

41

Ambiguity in Corpora

• Three levels of ambiguity were created by varying the two corpus-construction parameters (the maximum number of inserted MRs and the maximum window width)

MRs per NL    1     2     3     4     5     6     7
Level 1      25%   50%   25%
Level 2      11%   22%   34%   22%   11%
Level 3       6%   13%   19%   26%   18%   12%    6%
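For illustration only, a minimal sketch of drawing the number of candidate MRs per sentence from these level distributions; how the corpora actually realize them (via the insertion and window parameters) is not shown here.

```python
import random

# Distribution over the number of candidate MRs per sentence (from the table).
LEVELS = {
    1: {1: 0.25, 2: 0.50, 3: 0.25},
    2: {1: 0.11, 2: 0.22, 3: 0.34, 4: 0.22, 5: 0.11},
    3: {1: 0.06, 2: 0.13, 3: 0.19, 4: 0.26, 5: 0.18, 6: 0.12, 7: 0.06},
}

def sample_num_mrs(level, rng=random):
    """Sample how many candidate MRs a sentence gets at the given ambiguity level."""
    dist = LEVELS[level]
    return rng.choices(list(dist), weights=list(dist.values()), k=1)[0]
```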

Page 42: Learning Language Semantics from Ambiguous Supervision

42

Methodology

• Performed 10-fold cross validation

• Metrics:

  Precision = Number of correct MRs / Number of test sentences with complete output MRs

  Recall = Number of correct MRs / Number of test sentences

  F-measure = 2 * Precision * Recall / (Precision + Recall)

• Measured best F-measure across the precision-recall curve obtained using output confidence thresholds
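A small sketch of computing the best F-measure by sweeping the output-confidence threshold over the precision-recall curve; the input format is an assumption for illustration.

```python
def best_f_measure(predictions, num_test_sentences):
    """predictions: (confidence, is_correct) for each test sentence on which a
    complete MR was produced; num_test_sentences: size of the whole test set.
    Sweeps the confidence threshold and returns the best F-measure found."""
    best = 0.0
    correct = produced = 0
    for conf, is_correct in sorted(predictions, reverse=True):   # high conf first
        produced += 1                       # threshold set just below this conf
        correct += int(is_correct)
        precision = correct / produced
        recall = correct / num_test_sentences
        if precision + recall > 0:
            best = max(best, 2 * precision * recall / (precision + recall))
    return best
```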

Page 43: Learning Language Semantics from Ambiguous Supervision

43

Results on Ambig-Geoquery Corpus

(Plot: best F-measure, 0-100, versus number of training examples: 225, 450, 675, 900. Curves: no ambiguity, Level 1 ambiguity, Level 2 ambiguity, Level 3 ambiguity.)

Page 44: Learning Language Semantics from Ambiguous Supervision

44

Results on Ambig-ChildWorld Corpus

(Plot: best F-measure, 0-100, versus number of training examples: 225, 450, 675, 900. Curves: no ambiguity, Level 1 ambiguity, Level 2 ambiguity, Level 3 ambiguity.)

Page 45: Learning Language Semantics from Ambiguous Supervision

45

Future Work

• Construct a real-world ambiguous corpus and test this approach

• Combine this system with a vision-based system that extracts MRs from perceptual contexts

Page 46: Learning Language Semantics from Ambiguous Supervision

46

Conclusions

• We presented the problem of learning language semantics from ambiguous supervision

• This form of supervision is more representative of the natural training environment for a language learning system

• We presented an approach that learns from ambiguous supervision by iteratively re-training a system for unambiguous supervision

• Experimental results on two artificial corpora showed that this approach is able to cope with ambiguity and learn accurate semantic parsers

Page 47: Learning Language Semantics from Ambiguous Supervision

47

Thank you!

Questions??