Semantic Role Labeling
Haibo Ding
Oct 17, 2016
Outline
• Introduction
• Thematic Roles
• FrameNet
• PropBank
• Semantic Role Labeling System
Introduction to Semantic Representation

Hi, Baymax. Hiro was hit by a man.
Hi, Baymax. A man attacked Hiro.

• (1) Hiro was hit by a man.  → Hiro: subject; a man: prepositional phrase
• (2) A man attacked Hiro.  → A man: subject; Hiro: object

The two sentences describe the same event with different syntax.

Need Semantic Representations!
Thematic Roles

• Thematic roles (or thematic relations) represent the role that a noun phrase plays with respect to an action or state, usually expressed by a verb.
• Modern formulation from Fillmore (1966, 1968) and Gruber (1965).
• There is no universally agreed-upon set of roles, but a relatively small set of thematic roles is commonly used.
• Key idea: abstract away from syntax to represent the conceptual role that a phrase plays with respect to an action or state.
Common Thematic Roles
• Agent
• Theme
• Instrument
• Recipient
• Experiencer
• Beneficiary
• Cause
• Location
• Path
The Agent Role
• An agent is responsible for an action. Usually (though not always) this implies intentionality (a volitional causer of an event).
• The agent is usually ANIMATE. Forces of nature (e.g., a tornado) may be permissible agents if there is no Cause role.

(1) John broke the window.
(2) John intentionally broke the window.
(3) The hammer broke the window.
(4) *The hammer intentionally broke the window.

• A co-agent is another entity that also performed the action.
  John painted the wall with Mary.
The Theme Role

• The theme (or patient) is the thing being affected or acted upon.
• The theme is usually the answer to the question: "What was verb-ed?"

(1) John smelled Rover.
(2) John gave Rover a bath.

• A co-theme is another object/concept that is also affected or acted upon.
(3) John washed Rover along with Snoopy.
The Instrument Role

• An instrument represents a tool, material, or force that is used to perform an action.
• An instrument does not have to be a physical object. It can be anything used to accomplish an action.

(1) The hammer broke the window.
(2) I ate spaghetti with a fork.
(3) I paid for the repair using his credit card.
The Recipient and Beneficiary Roles

• The recipient role is assigned to an entity that receives something.
  John gave a birthday present to Mary.
  John gave Mary a birthday present.

• The beneficiary role is assigned to an entity that benefits from an action (without receiving anything).
  John sang a song for Mary.
  John sang Mary a song.
The Experiencer and Cause Roles

• Some verbs express internal beliefs, emotions, or states. The experiencer role represents an entity who is experiencing something.
  Mary believes that Elvis is still alive.
  The boy fears spiders.

• A cause role is sometimes used for events or situations that produce an effect.
  The economic recession led to many corporate layoffs.
  The tornado damaged thousands of homes.
Location, Times, Measurements

• Location, Time, and Measurement roles may be subdivided into sub-roles representing different types of state changes.
• AT state:
  Yosemite is in California. → AT-LOC
  I woke up at 7am. → AT-TIME
  I have 20 dollars. → AT-VALUE
• From/To change:
  I drove from Utah to California. → FROM-LOC (Origin/Source), TO-LOC (Destination)
  The party starts at 8am and ends at midnight. → FROM-TIME, TO-TIME
• PATH role: a trajectory
  She swam across the channel. → PATH-LOC
Exercise

• For the following sentences, assign each noun phrase a thematic role from: Agent, Theme, Instrument, Recipient, Experiencer, Beneficiary, Cause, Location, Path.
• (1) I paid my landlord the rent.
• (2) An epidemic killed off all of the tomatoes.
• (3) Many people fear snakes.
Answers:

• (1) [I] paid [my landlord] [the rent].  → I: Agent; my landlord: Recipient; the rent: Theme
• (2) [An epidemic] killed off [all of the tomatoes].  → An epidemic: Cause; all of the tomatoes: Theme
• (3) [Many people] fear [snakes].  → Many people: Experiencer; snakes: Theme
Semantic/Case Frames

• A semantic frame (or case frame) is a conceptual structure that represents the semantic arguments of a predicate (usually a verb). They originated from Charles Fillmore's theory of frame semantics.
• A frame represents semantic knowledge about an activity or state that captures its meaning.
• In semantics, predicates take arguments, which are necessary to represent the meaning of an activity or state described in text. Examples: agents, themes, recipients.
• In contrast, adjuncts are optional. They are not needed to complete the meaning of the predicate. Examples: dates, locations, manner.
Case Frame Example

Predicate: Give
Arguments: Agent, Theme, Recipient

Case frames can be defined manually with mappings between syntactic and semantic roles. For example:

Predicate: Give (in active voice)
Arguments:
  Agent = Subject
  Theme = Direct Object
  Recipient = Indirect Object
Example: John gave Mary a book.

Predicate: Give
Arguments:
  Agent = John
  Theme = a book
  Recipient = Mary
Selectional Restrictions

• A selectional restriction is a semantic type constraint that a verb imposes on the kinds of concepts that are allowed to fill its arguments.

GIVE (active)
  Agent = subject [ANIMATE]
  Theme = direct object [PHYSOBJ]
  Recipient = indirect object [ANIMATE]

(Format: thematic role = syntactic role [semantic type constraint])
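A case frame with selectional restrictions can be sketched as a plain data structure. This is a minimal illustration; the `Slot` and `CaseFrame` classes are assumptions for this sketch, not an API from any SRL library.

```python
from dataclasses import dataclass

# A minimal sketch of a case frame with selectional restrictions.
# Slot and CaseFrame are illustrative names, not a standard library API.

@dataclass
class Slot:
    role: str        # thematic role, e.g. "Agent"
    syntax: str      # syntactic position, e.g. "subject"
    sem_type: str    # selectional restriction, e.g. "ANIMATE"

@dataclass
class CaseFrame:
    predicate: str
    voice: str
    slots: list

GIVE_ACTIVE = CaseFrame(
    predicate="give",
    voice="active",
    slots=[
        Slot("Agent", "subject", "ANIMATE"),
        Slot("Theme", "direct object", "PHYSOBJ"),
        Slot("Recipient", "indirect object", "ANIMATE"),
    ],
)

for s in GIVE_ACTIVE.slots:
    print(f"{s.role} = {s.syntax} [{s.sem_type}]")
```

Each printed line mirrors the "thematic role = syntactic role [semantic type constraint]" format above.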
Case Frame Example

Predicate: KILL (passive)
  Theme = subject [ANIMATE]
  Agent = PP(by) [ANIMATE]
  Instrument = PP(by) [WEAPON]
  Instrument = PP(with) [WEAPON]
  Co-Theme = PP(with) [ANIMATE]
(1) John was killed by Mary.
  Kill: Agent = Mary, Theme = John
(2) John was killed by a bomb.
  Kill: Theme = John, Instrument = a bomb
(3) John was killed with Mary by a man with a bomb.
  Kill: Theme = John, Co-Theme = Mary, Agent = a man, Instrument = a bomb
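The examples show how the KILL (passive) frame disambiguates PP(by) and PP(with) by semantic type. This can be sketched as a simple lookup; the `(syntax, semantic type, text)` triple encoding of constituents is an assumption of this sketch, not a standard format.

```python
# Sketch of role assignment with the KILL (passive) case frame.
# Constituents are hand-built (syntactic_slot, semantic_type, text) triples;
# in a real system they would come from a parser plus a semantic lexicon.

KILL_PASSIVE = [
    ("Theme",      "subject",  "ANIMATE"),
    ("Agent",      "PP(by)",   "ANIMATE"),
    ("Instrument", "PP(by)",   "WEAPON"),
    ("Instrument", "PP(with)", "WEAPON"),
    ("Co-Theme",   "PP(with)", "ANIMATE"),
]

def assign_roles(constituents):
    """Match each constituent against the frame by syntax AND semantic type."""
    roles = {}
    for syntax, sem_type, text in constituents:
        for role, f_syntax, f_type in KILL_PASSIVE:
            if syntax == f_syntax and sem_type == f_type:
                roles[role] = text
                break
    return roles

# "John was killed with Mary by a man with a bomb."
sent = [
    ("subject",  "ANIMATE", "John"),
    ("PP(with)", "ANIMATE", "Mary"),
    ("PP(by)",   "ANIMATE", "a man"),
    ("PP(with)", "WEAPON",  "a bomb"),
]
print(assign_roles(sent))
# {'Theme': 'John', 'Co-Theme': 'Mary', 'Agent': 'a man', 'Instrument': 'a bomb'}
```

Because both syntax and semantic type must match, "a man" (PP(by), ANIMATE) becomes the Agent while "a bomb" (PP(with), WEAPON) becomes the Instrument, exactly as in example (3).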
Modern Resources for Semantic Role Labeling

• (1) FrameNet:
  • Mainly based on thematic roles.
  • Defines semantic roles specific to a particular verb or a particular group of semantically related verbs.
• (2) PropBank:
  • Uses both proto-roles (general) and verb-specific roles.
  • Semantic roles are annotated on top of parse trees.
FrameNet

• The Berkeley FrameNet project is an effort to create frame-specific descriptions for English terms.
• A frame in FrameNet is a conceptual structure that defines a set of frame-specific semantic roles (called frame elements),
  • includes a set of predicates that use these roles;
  • each word evokes a frame and profiles some aspect of the frame.
• The FrameNet database includes: a set of frames, frame elements, lexical units (pairs of a lemma and a frame) associated with each frame, and a set of labeled example sentences.
Frame Example: The "Killing" Frame

• Definition: A Killer or Cause causes the death of the Victim.

• Group of target words (predicates):

VERBS: annihilate, asphyxiate, assassinate, behead, butcher, crucify, decapitate, destroy, drown, eliminate, euthanize, exterminate, garrotte, kill, liquidate, lynch, massacre, murder, starve, suffocate, suicide, take out, terminate

NOUNS: annihilation, assassin, assassination, beheading, blood-bath, bloodshed, carnage, crucifixion, decapitation, euthanasia, extermination, fatality, holocaust, homicide, immolation, liquidation, murder, slaughter, suffocation

ADJECTIVES: deadly, fatal, lethal
Roles in the "Killing" Frame

Core Roles:
• Cause: An inanimate entity or process that causes the death of the Victim.
• Instrument: The device used by the Killer to bring about the death of the Victim.
• Killer: The person or sentient entity that causes the death of the Victim.
• Means: The method or action that the Killer or Cause performs resulting in the death of the Victim.
• Victim: The living entity that dies as a result of the killing.

Some Non-Core Roles:
• Beneficiary: This extra-thematic FE applies to participants that derive a benefit from the occurrence of the event specified by the target predicate.
• Place: The location where the death took place.
• Purpose: The state of affairs that the Killer is trying to bring about by killing.
• Time: The point or period of time within which the Victim is killed.
• Frequency: The number of times an event occurs per some unit of time. A Frequency expression answers the question "how often".
Examples of the "Killing" Frame

(1) [Killer John] DROWNED [Victim Martha].
(2) [Cause The rockslide] KILLED [Victim nearly half of the climbers].
(3) [Cause The flood] EXTERMINATED [Victim the rats] [Means by cutting off access to food].
(4) [Killer He] KILLED [Victim her] [Purpose to get the inheritance].
Another Frame Example

• The "change_position_on_a_scale" frame
• This frame consists of words that indicate the change of an ITEM's position on a scale (the ATTRIBUTE) from a starting point (INITIAL VALUE) to an end point (FINAL VALUE).

• Groups of target words:

VERBS: advance, climb, decline, decrease, diminish, dip, double, drop, dwindle, edge, explode, fall, fluctuate, gain, grow, increase, jump, move, mushroom, plummet, reach, rise, rocket, shift, skyrocket, slide, soar, swell, swing, triple, tumble
NOUNS: decline, decrease, escalation, explosion, fall, fluctuation, gain, growth, hike, increase, rise, shift, tumble
ADVERBS: increasingly

Source: Jurafsky and Martin (2016)
Roles in the "change_position_on_a_scale" Frame

Core Roles:
• ATTRIBUTE: A scalar property that the ITEM possesses.
• DIFFERENCE: The distance by which an ITEM changes its position on the scale.
• FINAL STATE: A description that presents the ITEM's state after the change in the ATTRIBUTE's value as an independent predication.
• FINAL VALUE: The position on the scale where the ITEM ends up.
• INITIAL STATE: A description that presents the ITEM's state before the change in the ATTRIBUTE's value as an independent predication.
• INITIAL VALUE: The initial position on the scale from which the ITEM moves away.
• ITEM: The entity that has a position on the scale.
• VALUE RANGE: A portion of the scale, typically identified by its end points, along which the values of the ATTRIBUTE fluctuate.

Some Non-Core Roles:
• DURATION: The length of time over which the change takes place.
• SPEED: The rate of change of the VALUE.
• GROUP: The GROUP in which an ITEM changes the value of an ATTRIBUTE in a specified way.

Figure 22.3: The frame elements in the change_position_on_a_scale frame from FrameNet.
Source: Ruppenhofer et al. (2006)
Annotated Sentences

• The "change_position_on_a_scale" frame

(22.20) [ITEM Oil] rose [ATTRIBUTE in price] [DIFFERENCE by 2%].
(22.21) [ITEM It] has increased [FINAL STATE to having them 1 day a month].
(22.22) [ITEM Microsoft shares] fell [FINAL VALUE to 7 5/8].
(22.23) [ITEM Colon cancer incidence] fell [DIFFERENCE by 50%] [GROUP among men].
(22.24) a steady increase [INITIAL VALUE from 9.5] [FINAL VALUE to 14.3] [ITEM in dividends]
(22.25) a [DIFFERENCE 5%] [ITEM dividend] increase...

Source: Jurafsky and Martin (2016)
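The bracketed annotation style above is easy to read back into a role-to-span mapping. A minimal sketch with the standard library (it only handles single-token role names such as ITEM or DIFFERENCE, not multi-word ones like FINAL STATE):

```python
import re

# Parse the bracketed FrameNet-style annotation used above,
# e.g. "[ITEM Oil] rose [ATTRIBUTE in price] [DIFFERENCE by 2%]".
# Limitation: role names are assumed to be a single token.

def parse_annotation(text):
    return {role: span for role, span in re.findall(r"\[(\w+) ([^\]]+)\]", text)}

print(parse_annotation("[ITEM Oil] rose [ATTRIBUTE in price] [DIFFERENCE by 2%]"))
# {'ITEM': 'Oil', 'ATTRIBUTE': 'in price', 'DIFFERENCE': 'by 2%'}
```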
Exercise

• Annotate the following examples based on the roles in the previous slide (Attribute, Difference, Final_state, Final_value, Initial_state, Initial_value, Item, Value_range, Duration, Speed, Group):

(1) Accidents INCREASED 20% to 345.
(2) Hawke's Bay winery DOUBLED in size over last year.
(3) The population INCREASED fourfold in size to 807.
• Answers:

(1) [Item Accidents] INCREASED [Difference 20%] to [Final_value 345].
(2) [Item Hawke's Bay winery] DOUBLED [Attribute in size] over [Duration last year].
(3) [Item The population] INCREASED [Difference fourfold] in [Attribute size] to [Final_value 807].
PropBank

• The PropBank project produced semantic role annotations on the Wall Street Journal portion of the Penn Treebank (which already had parse tree annotations).
• Palmer, Martha, Daniel Gildea, and Paul Kingsbury. 2005. The Proposition Bank: An Annotated Corpus of Semantic Roles. Computational Linguistics, 31(1):71–106.
• PropBank provides frame files, which define a set of roles (a roleset) for each verb sense.
• Roles in PropBank are specific to a verb sense.

Roles in PropBank

• Each verb sense has numbered arguments: Arg0, Arg1, Arg2, ...
  Arg0: AGENT
  Arg1: PATIENT
  (Arg0 and Arg1 usually have consistent meanings across verb senses.)
  Arg2: usually benefactive, instrument, attribute, or end state
  Arg3: usually start point, benefactive, instrument, or attribute
  Arg4: the end point
  (Arg2–Arg5 are not really that consistent, which causes a problem for labeling.)
Roles in PropBank

• Non-numbered arguments in PropBank are called ArgMs, representing modification or adjunct meanings:

  ArgM-TMP      when?            yesterday evening, now
  ArgM-LOC      where?           at the museum, in San Francisco
  ArgM-DIR      where to/from?   down, to Bangkok
  ArgM-MNR      how?             clearly, with much enthusiasm
  ArgM-PRP/CAU  why?             because ..., in response to the ruling
  ArgM-REC      reciprocal       themselves, each other
  ArgM-ADV      miscellaneous
  ArgM-PRD      secondary predication   ...ate the meat raw
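The adjunct table translates directly into a lookup, which is how a labeler might expose the ArgM inventory. The dictionary layout is an assumption of this sketch; the tags and glosses follow the table above.

```python
# The ArgM adjunct tags as a simple lookup (glosses follow the table above;
# the dict itself is just an illustrative representation).
ARGM_TAGS = {
    "TMP": "when? (yesterday evening, now)",
    "LOC": "where? (at the museum, in San Francisco)",
    "DIR": "where to/from? (down, to Bangkok)",
    "MNR": "how? (clearly, with much enthusiasm)",
    "PRP": "why? (because ..., in response to the ruling)",
    "CAU": "why? (because ..., in response to the ruling)",
    "REC": "reciprocal (themselves, each other)",
    "ADV": "miscellaneous",
    "PRD": "secondary predication (...ate the meat raw)",
}

print(ARGM_TAGS["TMP"])
# when? (yesterday evening, now)
```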
Source: Jurafsky and Martin (2016)
Frame File Examples

(22.11) agree.01
  Arg0: Agreer
  Arg1: Proposition
  Arg2: Other entity agreeing

Ex1: [Arg0 The group] agreed [Arg1 it wouldn't make an offer].
Ex2: [ArgM-TMP Usually] [Arg0 John] agrees [Arg2 with Mary] [Arg1 on everything].
Source: Jurafsky and Martin (2016)
kill.01 (cause to die)
  Arg0: killer
  Arg1: corpse
  Arg2: instrument

Ex1: [Arg0 John] killed [Arg1 Mary] with [Arg2 a lead pipe].
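A frame file entry is essentially a mapping from numbered arguments to role descriptions, so it can be sketched as a dictionary. The data below mirrors the agree.01 and kill.01 entries above; the dictionary layout and the `gloss` helper are assumptions of this sketch, not PropBank's actual file format.

```python
# PropBank frame file entries sketched as dictionaries.
# The data mirrors agree.01 and kill.01 above; the layout is illustrative,
# not the actual PropBank XML frame file format.

FRAME_FILES = {
    "agree.01": {
        "Arg0": "Agreer",
        "Arg1": "Proposition",
        "Arg2": "Other entity agreeing",
    },
    "kill.01": {  # cause to die
        "Arg0": "killer",
        "Arg1": "corpse",
        "Arg2": "instrument",
    },
}

def gloss(roleset, labeled_args):
    """Replace numbered args with their human-readable role names."""
    roles = FRAME_FILES[roleset]
    return {roles[arg]: text for arg, text in labeled_args.items()}

# Ex1: [Arg0 John] killed [Arg1 Mary] with [Arg2 a lead pipe]
print(gloss("kill.01", {"Arg0": "John", "Arg1": "Mary", "Arg2": "a lead pipe"}))
# {'killer': 'John', 'corpse': 'Mary', 'instrument': 'a lead pipe'}
```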
Parse Tree Annotations

• Semantic roles in PropBank are annotated on parse trees.

[S [NP-SBJ = ARG0 The San Francisco Examiner]
   [VP [VBD = TARGET issued]
       [NP = ARG1 a special edition]
       [PP-TMP = ARGM-TMP [IN around] [NP [NN noon] [NP-TMP yesterday]]]]]
Exercise

Frame file:
(22.13) increase.01 "go up incrementally"
  Arg0: causer of increase
  Arg1: thing increasing
  Arg2: amount increased by, EXT, or MNR
  Arg3: start point
  Arg4: end point

(1) Big Fruit Co. increased the price of bananas.
(2) The price of bananas was increased again by Big Fruit Co.
(3) The price of bananas increased 5%.
• Answer:

[Arg0 Big Fruit Co.] increased [Arg1 the price of bananas].
[Arg1 The price of bananas] was increased again [Arg0 by Big Fruit Co.].
[Arg1 The price of bananas] increased [Arg2 5%].
Semantic Role Labeling System

• Semantic Role Labeling (SRL) is an NLP task of assigning semantic roles to each argument of each predicate in a sentence.
• Semantic roles are always relative to a target word (i.e., the predicate), which is usually a verb.
• Current approaches to SRL are mainly based on supervised machine learning algorithms, often using FrameNet or PropBank as training and testing sources.
• FrameNet vs. PropBank:

(1) [Killer John] killed [Victim Mary] [Instrument with a knife].   (FrameNet)
(2) [Arg0 John] killed [Arg1 Mary] [Arg2 with a knife].   (PropBank)
A Simple Labeling Algorithm

function SEMANTICROLELABEL(words) returns labeled tree
  parse ← PARSE(words)
  for each predicate in parse do
    for each node in parse do
      featurevector ← EXTRACTFEATURES(node, predicate, parse)
      CLASSIFYNODE(node, featurevector, parse)

Source: Jurafsky and Martin (2016)

We will use PropBank to build the semantic role labeler, treating every verb as a predicate.
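The pseudocode above can be rendered directly in Python. Here `parse()` and `classify_node()` are toy stubs (pure assumptions for this sketch) standing in for a real syntactic parser and a trained classifier; only the control flow matches the algorithm.

```python
# A direct Python rendering of the SEMANTICROLELABEL pseudocode.
# parse() and classify_node() are toy stubs; real versions would be a
# syntactic parser and a trained classifier.

def parse(words):
    # Stub parse: one flat "constituent" per word; known verbs tagged VBD.
    verbs = {"issued", "killed", "gave"}
    return [{"text": w, "pos": "VBD" if w in verbs else "NN"} for w in words]

def extract_features(node, predicate, tree):
    return {
        "head": node["text"],
        "predicate": predicate["text"],
        "position": "before" if tree.index(node) < tree.index(predicate) else "after",
    }

def classify_node(node, features, tree):
    # Toy stand-in for a learned classifier: position relative to the predicate.
    return "Arg0" if features["position"] == "before" else "Arg1"

def semantic_role_label(words):
    tree = parse(words)
    labels = {}
    for predicate in [n for n in tree if n["pos"].startswith("VB")]:
        for node in tree:
            if node is not predicate:
                fv = extract_features(node, predicate, tree)
                labels[node["text"]] = classify_node(node, fv, tree)
    return labels

print(semantic_role_label(["John", "killed", "Mary"]))
# {'John': 'Arg0', 'Mary': 'Arg1'}
```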
3-Stage Labeling Algorithm

1. Candidate Node Pruning: use simple heuristics to prune unlikely constituents.
2. Argument Identification: a binary classification of each node as an argument to be labeled or NONE.
3. Argument Classification: a 1-of-N classification of all the constituents that were labeled as arguments by the previous stage.
Labeling Steps (Xue and Palmer, 2004)

(Figure omitted: semantic roles for the predicate "walked" — ARG0: NP1, AM-LOC: PP.)
Source: NAACL SRL tutorial (Palmer, Titov and Wu, 2013)
Pruning Algorithm (1st Step)

• Pruning:
  • Step 1: Designate the predicate as the current node and collect its sisters (constituents attached at the same level as the predicate) unless its sisters are coordinated with the predicate. If a sister is a PP, also collect its immediate children.
  • Step 2: Reset the current node to its parent and repeat Step 1 until it reaches the top-level node.
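The two steps can be sketched over a minimal tree type. The `Node` class and the hand-built tree for "The Examiner issued a special edition around noon" are assumptions of this sketch (not a parser API), and the coordination check from Step 1 is omitted for brevity.

```python
# Sketch of the Xue and Palmer (2004) pruning heuristic over a tiny tree type.
# Node is an illustrative class, not a standard parser API; the coordination
# check from Step 1 is omitted in this sketch.

class Node:
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)
        self.parent = None
        for c in self.children:
            c.parent = self

def prune_candidates(predicate):
    """Collect candidate argument nodes by walking up from the predicate."""
    candidates = []
    current = predicate
    while current.parent is not None:
        for sister in current.parent.children:
            if sister is current:
                continue
            candidates.append(sister)           # Step 1: collect sisters
            if sister.label == "PP":            # a PP sister: also take its children
                candidates.extend(sister.children)
        current = current.parent                # Step 2: move up and repeat
    return candidates

# Toy tree for "The Examiner issued a special edition around noon"
vbd = Node("VBD")                              # the predicate "issued"
np_obj = Node("NP")                            # "a special edition"
pp = Node("PP", [Node("IN"), Node("NP")])      # "around noon"
vp = Node("VP", [vbd, np_obj, pp])
s = Node("S", [Node("NP-SBJ"), vp])            # subject NP + VP

print([n.label for n in prune_candidates(vbd)])
# ['NP', 'PP', 'IN', 'NP', 'NP-SBJ']
```

The walk first collects the predicate's VP-level sisters (including the PP's children), then moves up to S and collects the subject NP, matching the candidate set the heuristic is designed to produce.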
Pruning Example

• Predicate-head word combination: issued-examiner
• Distance between constituent and predicate: 3

Final feature vector for the subject NP: [NP↑S↓VP↓VBD, examiner, NNP, issued-NP, issued-examiner, 3]
[S [NP-SBJ = ARG0 The San Francisco Examiner]
   [VP [VBD = TARGET issued]
       [NP = ARG1 a special edition]
       [PP-TMP = ARGM-TMP [IN around] [NP [NN noon] [NP-TMP yesterday]]]]]

Figure 22.5: Parse tree for a PropBank sentence, showing the PropBank labels (Jurafsky and Martin, 2016).
• Path: the minimal path from the constituent being classified to the predicate (NP↑S↓VP↓VBD)
• Head word: the head word of the constituent being classified (examiner)
• Head word POS: NNP
• Predicate-phrase type combination: issued-NP
Features for Argument Identification (2nd Step) and Argument Classification (3rd Step)

• Predicate: the predicate itself (issued)
• Path: the minimal path from the constituent to the predicate (NP↑S↓VP↓VBD)
• Phrase type: the syntactic category (NP, PP, etc.) of the constituent being classified (NP)
• Voice: whether the predicate is active or passive (active)
• Position: the relative position of the constituent being classified with regard to the predicate, before or after (before)
More Features for Argument Classification

• Head word: the head word of the constituent being classified (examiner)
• Sub-categorization: the phrase structure rule expanding the parent of the predicate (VP → VBD NP PP)
• First word of the constituent (the)
• Last word of the constituent (examiner)
• Named entity type of the constituent (Organization)
Final feature vector: [issued, NP↑S↓VP↓VBD, NP, active, before, Head: examiner, VP → VBD NP PP, FW: the, LW: examiner, Organization]
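Putting the classification features together, a feature extractor might look like the sketch below. The hand-built `subject_np` and `issued` records are assumptions standing in for what a parser would supply; the feature values follow the Examiner example above.

```python
# Sketch of feature extraction for argument classification.
# The constituent/predicate records are hand-built for the Examiner example;
# in a real system they would be computed from a parse tree.

def classification_features(constituent, predicate):
    return {
        "predicate": predicate["lemma"],
        "path": constituent["path"],              # e.g. NP↑S↓VP↓VBD
        "phrase_type": constituent["phrase_type"],
        "voice": predicate["voice"],
        "position": constituent["position"],
        "head_word": constituent["head"],
        "subcat": predicate["subcat"],
        "first_word": constituent["first_word"],
        "last_word": constituent["last_word"],
        "ne_type": constituent["ne_type"],
    }

subject_np = {
    "path": "NP↑S↓VP↓VBD", "phrase_type": "NP", "position": "before",
    "head": "examiner", "first_word": "the", "last_word": "examiner",
    "ne_type": "Organization",
}
issued = {"lemma": "issued", "voice": "active", "subcat": "VP -> VBD NP PP"}

print(list(classification_features(subject_np, issued).values()))
```

The printed list corresponds slot-for-slot to the final feature vector shown above.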
Conclusion

• Semantic roles provide a shallow semantic representation for events and states.
• Common thematic roles.
• Two common frameworks:
  • (1) FrameNet: roles are specific to each frame.
  • (2) PropBank: roles are specific to each verb sense.
• Supervised machine learning methods:
  • (1) need to parse the sentence;
  • (2) find the predicate (may need to identify the sense);
  • (3) identify and classify arguments for each predicate.
References

• Ruppenhofer, Josef, et al. "FrameNet II: Extended Theory and Practice." (2006).
• Jurafsky, Dan, and James H. Martin. Speech and Language Processing. Pearson, 2016. http://web.stanford.edu/~jurafsky/slp3/
• Semantic Role Labeling Tutorial (NAACL 2013). http://naacl2013.naacl.org/TutorialsAccepted.aspx
• Xue, Nianwen, and Martha Palmer. "Calibrating Features for Semantic Role Labeling." EMNLP 2004.