CPSC 503 Computational Linguistics, Lecture 9. Giuseppe Carenini. CPSC503 Winter 2008.


Page 1

CPSC 503Computational Linguistics

Lecture 9Giuseppe Carenini

Page 2

Knowledge-Formalisms Map (diagram relating formalisms to linguistic levels)

Formalisms:
• State machines (and prob. versions): finite-state automata, finite-state transducers, Markov models
• Rule systems (and prob. versions), e.g., (prob.) context-free grammars
• Logical formalisms (first-order logics)
• AI planners

Linguistic levels: morphology, syntax, semantics, pragmatics (discourse and dialogue)

Page 3

Today 6/10

• Finish: The Earley Algorithm
• Partial Parsing: Chunking
• Dependency Grammars / Parsing
• Treebanks

Page 4

Earley Parsing Procedure

• Sweep through the table from 0 to n in order, applying one of three operators to each state:
– predictor: add top-down predictions to the chart
– scanner: read input and add the corresponding state to the chart
– completer: move the dot to the right when a new constituent is found
• Results (new states) are added to the current or next set of states in the chart
• No backtracking, and no states are removed

Page 5

Predictor

• Intuition: new states represent top-down expectations
• Applied when non-part-of-speech non-terminals are to the right of a dot:
S --> • VP [0,0]
• Adds new states to the end of the current chart
– One new state for each expansion of the non-terminal in the grammar:
VP --> • Verb [0,0]
VP --> • Verb NP [0,0]

Page 6

Scanner (part of speech)

• New states for the predicted part of speech
• Applicable when a part of speech is to the right of a dot:
VP --> • Verb NP [0,0] ( 0 “Book…” 1 )
• Looks at the current word in the input
• If it matches, adds state(s) to the next chart:
Verb --> book • [0,1]

Page 7

Completer

• Intuition: we’ve found a constituent, so tell everyone waiting for it
• Applied when the dot has reached the right end of a rule:
NP --> Det Nom • [1,3]
• Find all states with a dot at 1 that are expecting an NP:
VP --> Verb • NP [0,1]
• Adds new (completed) state(s) to the current chart:
VP --> Verb NP • [0,3]
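The three operators above can be sketched together as a toy recognizer. The grammar, lexicon, and state encoding below are illustrative choices, not the lecture's exact notation: a state is a (lhs, rhs, dot, start) tuple mirroring the dotted rules on these slides.

```python
GRAMMAR = {                      # toy grammar in the spirit of the slides
    "S":   [["VP"]],
    "VP":  [["Verb"], ["Verb", "NP"]],
    "NP":  [["Det", "Nom"]],
    "Nom": [["Noun"]],
}
LEXICON = {"book": "Verb", "that": "Det", "flight": "Noun"}  # word -> POS

def earley_recognize(words):
    n = len(words)
    chart = [[] for _ in range(n + 1)]         # chart[i]: states ending at position i

    def add(i, state):                         # no state is ever removed
        if state not in chart[i]:
            chart[i].append(state)

    for rhs in GRAMMAR["S"]:                   # seed chart[0] with S -> . rhs [0,0]
        add(0, ("S", tuple(rhs), 0, 0))

    for i in range(n + 1):                     # sweep the table from 0 to n
        j = 0
        while j < len(chart[i]):               # chart[i] can grow while we process it
            lhs, rhs, dot, start = chart[i][j]
            if dot < len(rhs) and rhs[dot] in GRAMMAR:
                for exp in GRAMMAR[rhs[dot]]:  # PREDICTOR: top-down expectations
                    add(i, (rhs[dot], tuple(exp), 0, i))
            elif dot < len(rhs):               # next symbol is a part of speech
                if i < n and LEXICON.get(words[i]) == rhs[dot]:
                    add(i + 1, (lhs, rhs, dot + 1, start))  # SCANNER
            else:                              # COMPLETER: lhs spans [start, i]
                for l2, r2, d2, s2 in list(chart[start]):
                    if d2 < len(r2) and r2[d2] == lhs:
                        add(i, (l2, r2, d2 + 1, s2))
            j += 1

    # accept if some completed S rule spanning [0, n] is in the last column
    return any(lhs == "S" and dot == len(rhs) and start == 0
               for lhs, rhs, dot, start in chart[n])
```

`earley_recognize("book that flight".split())` accepts, while an input the grammar cannot derive is rejected; as the slide says, nothing is ever backtracked or removed.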

Page 8

So far only a recognizer…

To generate all parses:
• When old states waiting for the just-completed constituent are updated => add a pointer from each “updated” state to the “completed” one
• Then simply read off all the backpointers from every complete S in the last column of the table

Chart [0]
…
S5 S -> . VP [0,0] []
S6 VP -> . Verb [0,0] []
S7 VP -> . Verb NP [0,0] []
…

Chart [1]
S8 Verb -> book . [0,1] []
S9 VP -> Verb . [0,1] [S8]
S10 S -> VP . [0,1] [S9]
S11 VP -> Verb . NP [0,1] [??]
…
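Reading a parse off the backpointers can be illustrated with states S8-S10 from Chart [1], hand-encoded here as (lhs, rhs, backpointers) tuples. This is a simplification: real states also carry their [start, end] positions.

```python
# States S8-S10 from Chart [1] above, hand-built for illustration
S8 = ("Verb", ("book",), [])      # Verb -> book .  : lexical rule, no backpointers
S9 = ("VP",  ("Verb",), [S8])     # VP -> Verb .    : points to completed S8
S10 = ("S",  ("VP",),  [S9])      # S -> VP .       : points to completed S9

def read_off(state):
    """Recursively follow backpointers to recover the parse tree."""
    lhs, rhs, back = state
    if not back:                   # lexical rule: the RHS is the word itself
        return (lhs, list(rhs))
    return (lhs, [read_off(b) for b in back])

print(read_off(S10))               # ('S', [('VP', [('Verb', ['book'])])])
```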

Page 9

Error Handling

• What happens when we look at the contents of the last table column and don’t find an S --> … state?
– Is it a total loss? No…
– The chart contains every constituent and combination of constituents possible for the input given the grammar
• Also useful for partial parsing or shallow parsing, used in information extraction

Page 10

Dynamic Programming Approaches

• Earley
– Top-down, no filtering, no restriction on grammar form
• CKY
– Bottom-up, no filtering, grammars restricted to Chomsky Normal Form (CNF) (i.e., ε-free, and each production is either A -> B C or A -> a)
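A quick way to see what the CNF restriction means is a checker over (lhs, rhs) productions. The convention below, that non-terminals start with an uppercase letter, is an assumption of the sketch, not part of CNF itself.

```python
def is_cnf(productions):
    """True iff every production is A -> B C (two non-terminals)
    or A -> a (a single terminal); ε-productions are disallowed."""
    for lhs, rhs in productions:
        if len(rhs) == 2 and all(sym[:1].isupper() for sym in rhs):
            continue                         # A -> B C
        if len(rhs) == 1 and not rhs[0][:1].isupper():
            continue                         # A -> a
        return False                         # ε, long RHS, mixed RHS, ...
    return True
```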

Page 11

Today 6/10

• Finish: The Earley Algorithm
• Partial Parsing: Chunking
• Dependency Grammars / Parsing
• Treebanks

Page 12

Chunking

• Classify only basic non-recursive phrases (NP, VP, AP, PP)
– Find non-overlapping chunks
– Assign labels to chunks
• Chunk: typically includes the headword and pre-head material:
[NP The HD box] that [NP you] [VP ordered] [PP from] [NP Shaw] [VP never arrived]

Page 13

Approaches to Chunking (1): Finite-State Rule-Based

• Set of hand-crafted rules (no recursion!), e.g., NP -> (Det) Noun* Noun
• Implemented as FSTs (unioned / determinized / minimized)
• F-measure 85-92
• To build tree-like structures, several FSTs can be combined [Abney ’96]
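The hand-crafted rule NP -> (Det) Noun* Noun can be applied greedily over a POS-tag sequence. This pure-Python matcher is a sketch of the idea, not an actual FST implementation, and the tag names are the toy ones used above.

```python
def np_chunks(tags):
    """Greedy left-to-right, non-overlapping matches of (Det) Noun* Noun.
    Returns (start, end) index spans over the tag sequence, end exclusive."""
    chunks, i = [], 0
    while i < len(tags):
        j = i + 1 if tags[i] == "Det" else i   # optional determiner
        k = j
        while k < len(tags) and tags[k] == "Noun":
            k += 1                             # swallow Noun* Noun
        if k > j:                              # at least one Noun matched
            chunks.append((i, k))
            i = k                              # chunks must not overlap
        else:
            i += 1
    return chunks
```

For example, `np_chunks(["Det", "Noun", "Verb", "Det", "Noun"])` yields the two non-overlapping spans `[(0, 2), (3, 5)]`.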

Page 14

Approaches to Chunking (1): Finite-State Rule-Based

• … several FSTs can be combined

Page 15

Approaches to Chunking (2): Machine Learning

• A case of sequential classification
• IOB tagging: (I) internal, (O) outside, (B) beginning
• One Internal and one Beginning tag for each chunk type => tagset size is (2n + 1), where n is the number of chunk types
• Find an annotated corpus
• Select a feature set
• Select and train a classifier
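The IOB encoding can be made concrete with a small converter from chunk spans to per-token tags; the span format and the example chunk types are illustrative. With n chunk types this produces exactly the 2n + 1 tags counted above.

```python
def to_iob(n_tokens, chunks):
    """chunks: non-overlapping (start, end, type) spans, end exclusive.
    Returns one tag per token: B-TYPE at a chunk start, I-TYPE inside, O outside."""
    tags = ["O"] * n_tokens
    for start, end, ctype in chunks:
        tags[start] = "B-" + ctype
        for i in range(start + 1, end):
            tags[i] = "I-" + ctype
    return tags

# "The HD box ordered": an NP over tokens 0-2 and a VP over token 3
print(to_iob(4, [(0, 3, "NP"), (3, 4, "VP")]))
# ['B-NP', 'I-NP', 'I-NP', 'B-VP']
```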

Page 16

Context window approach

• Typical features:
– Current / previous / following words
– Current / previous / following POS tags
– Previous chunks
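A feature extractor for one token might look like the sketch below; the padding symbol, feature names, and one-token window width are arbitrary choices for illustration.

```python
def window_features(words, pos, prev_chunk_tags, i):
    """Context-window features for token i: surrounding words, surrounding
    POS tags, and the chunk tag already assigned to the previous token."""
    w = lambda j: words[j] if 0 <= j < len(words) else "<PAD>"
    p = lambda j: pos[j] if 0 <= j < len(pos) else "<PAD>"
    return {
        "w-1": w(i - 1), "w0": w(i), "w+1": w(i + 1),
        "p-1": p(i - 1), "p0": p(i), "p+1": p(i + 1),
        "chunk-1": prev_chunk_tags[i - 1] if i > 0 else "<PAD>",
    }
```

Features like these are what the classifier on the previous slide is trained on, one prediction per token.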

Page 17

Context window approach

• The specific choice of machine learning approach does not seem to matter
• F-measure in the 92-94 range
• Common causes of errors:
– POS-tagger inaccuracies
– Inconsistencies in the training corpus
– Inaccuracies in identifying heads
– Ambiguities involving conjunctions (e.g., “late arrivals and cancellations/departure are common in winter”)

Page 18

Today 6/10

• Finish: The Earley Algorithm
• Partial Parsing: Chunking
• Dependency Grammars / Parsing
• Treebanks

Page 19

Dependency Grammars

• Syntactic structure: binary relations between words
• Links: grammatical functions or very general semantic relations
• Abstract away from word-order variations (simpler grammars)
• Useful features in many NLP applications (for classification, summarization and NLG)

Page 20

Dependency Grammars (more verbose)

• In CFG-style phrase-structure grammars the main focus is on constituents.
• But it turns out you can get a lot done with just binary relations among the words in an utterance.
• In a dependency-grammar framework, a parse is a tree where
– the nodes stand for the words in an utterance
– the links between the words represent dependency relations between pairs of words
• Relations may be typed (labeled) or not.

Page 21

Dependency Relations

Show grammar primer

Page 22

Dependency Parse (ex 1)

They hid the letter on the shelf
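One common machine-readable encoding of such a parse is a head index per token (0 standing for an artificial ROOT). The attachment and label choices below are one plausible analysis added for illustration, not the slide's official answer; in particular, “on the shelf” could equally attach to “letter” rather than “hid”.

```python
words  = ["They", "hid", "the", "letter", "on", "the", "shelf"]
heads  = [2, 0, 4, 2, 2, 7, 5]        # 1-based head index per token; 0 = ROOT
labels = ["subj", "root", "det", "obj", "mod", "det", "pcomp"]  # illustrative labels

def arcs(words, heads, labels):
    """Yield the parse as (label, head_word, dependent_word) triples."""
    for i, (h, lab) in enumerate(zip(heads, labels), start=1):
        head_word = "ROOT" if h == 0 else words[h - 1]
        yield (lab, head_word, words[i - 1])

for triple in arcs(words, heads, labels):
    print(triple)        # e.g. ('subj', 'hid', 'They')
```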

Page 23

Dependency Parse (ex 2)

Page 24

Dependency Parsing (see MINIPAR / Stanford demos)

• Dependency approach vs. CFG parsing:
– Deals well with free-word-order languages, where the constituent structure is quite fluid
– Parsing is much faster than with CFG-based parsers
– The dependency structure often captures all the syntactic relations actually needed by later applications

Page 25

Dependency Parsing

• There are two modern approaches to dependency parsing (supervised learning from treebank data):
– Optimization-based approaches that search a space of trees for the tree that best matches some criteria
– Transition-based approaches that define and learn a transition system (state machine) for mapping a sentence to its dependency graph
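A transition-based parser can be sketched with the arc-standard system (SHIFT, LEFT-ARC, RIGHT-ARC). For brevity this version chooses actions by peeking at the gold heads (a static oracle) instead of calling a learned classifier, which is where the supervised learning would come in; it assumes the gold tree is projective.

```python
def arc_standard_oracle(gold_heads):
    """gold_heads[i] is the head of token i+1 (1-based; 0 = ROOT).
    Returns the (head, dependent) arcs built by the transition sequence."""
    n = len(gold_heads)
    stack, buf, arcs = [0], list(range(1, n + 1)), []

    def deps_done(i):    # have all gold dependents of i already been attached?
        return all(gold_heads[j - 1] != i or (i, j) in arcs
                   for j in range(1, n + 1))

    while buf or len(stack) > 1:
        if len(stack) >= 2:
            top, below = stack[-1], stack[-2]
            if below != 0 and gold_heads[below - 1] == top:
                arcs.append((top, below)); stack.pop(-2); continue   # LEFT-ARC
            if gold_heads[top - 1] == below and deps_done(top):
                arcs.append((below, top)); stack.pop(); continue     # RIGHT-ARC
        stack.append(buf.pop(0))                                     # SHIFT
    return arcs
```

For “They hid the letter” with gold heads [2, 0, 4, 2], the oracle's transition sequence rebuilds exactly the gold arcs; a trained parser replaces the oracle with a classifier over stack/buffer features.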

Page 26

Today 6/10

• Finish: The Earley Algorithm
• Partial Parsing: Chunking
• Dependency Grammars / Parsing
• Treebanks

Page 27

Treebanks

• DEF. Corpora in which each sentence has been paired with a parse tree
• These are generally created by:
– parsing a collection with a parser
– having human annotators revise each parse
• Requires detailed annotation guidelines:
– POS tagset
– grammar
– instructions for how to deal with particular grammatical constructions

Page 28

Penn Treebank

• The Penn Treebank is a widely used treebank.
• The most well-known part is the Wall Street Journal section: 1M words from the 1987-1989 Wall Street Journal.

Page 29

Treebank Grammars

• Treebanks implicitly define a grammar.
• Simply take the local rules that make up the sub-trees in all the trees in the collection.
• With a decent-sized corpus, you’ll have a grammar with decent coverage.
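Extracting that implicit grammar is a one-pass tree walk. Here a tree is encoded as a nested tuple (label, child, …) with string leaves; this encoding is an assumption of the sketch, not the Penn Treebank's bracketed format.

```python
def treebank_rules(tree, acc=None):
    """Collect the local rule (lhs, rhs) at every internal node of a parse tree."""
    if acc is None:
        acc = []
    label, *children = tree
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    acc.append((label, rhs))
    for c in children:
        if not isinstance(c, str):
            treebank_rules(c, acc)     # recurse into sub-trees
    return acc

t = ("S", ("NP", "they"),
          ("VP", ("Verb", "hid"),
                 ("NP", ("Det", "the"), ("Noun", "letter"))))
print(treebank_rules(t))
```

Running this over every tree in a collection, then de-duplicating (and, in practice, counting rule frequencies), yields the flat wide-coverage grammar the slide describes.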

Page 30

Treebank Grammars

• Such grammars tend to be very flat, because they tend to avoid recursion
– to ease the annotators’ burden
• For example, the Penn Treebank has 4500 different rules for VPs! Among them…

Page 31

Heads in Trees

• Finding heads in treebank trees is a task that arises frequently in many applications
– particularly important in statistical parsing
• We can visualize this task by annotating the nodes of a parse tree with the head of each corresponding node.

Page 32

Lexically Decorated Tree

Page 33

Head Finding

• The standard way to do head finding is to use a simple set of tree-traversal rules specific to each non-terminal in the grammar.
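Such traversal rules are usually written as a head-percolation table: for each non-terminal, a search direction plus a priority list of child labels. The table entries below are invented for illustration, in the style of (but not copied from) standard head-rule tables.

```python
# Hypothetical head-percolation table: label -> (search direction, priority labels)
HEAD_RULES = {
    "S":  ("left",  ["VP"]),
    "VP": ("left",  ["Verb", "VP"]),
    "NP": ("right", ["Noun", "NP"]),    # NPs tend to be head-final
    "PP": ("left",  ["Prep"]),
}

def head_child(label, child_labels):
    """Index of the head child among child_labels, per the percolation table."""
    direction, priorities = HEAD_RULES[label]
    order = list(range(len(child_labels)))
    if direction == "right":
        order.reverse()                 # search right-to-left
    for wanted in priorities:           # try each priority label in turn
        for i in order:
            if child_labels[i] == wanted:
                return i
    return order[0]                     # fallback: edge child in search direction
```

For example, `head_child("NP", ["Det", "Adj", "Noun"])` returns 2 (the Noun), and `head_child("VP", ["Verb", "NP"])` returns 0 (the Verb).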

Page 34

Noun Phrases

Page 35

Treebank Uses

• Searching a treebank with TGrep2, e.g.:
NP < PP or NP << PP
• Treebanks (and head finding) are particularly critical to the development of statistical parsers
– Chapter 14
• Also valuable to corpus linguistics
– investigating the empirical details of various constructions in a given language

Page 36

Next time: read Chapter 14

(The Knowledge-Formalisms Map from Page 2 is repeated here.)