The Computation of Language: Syntactic Acquisition Edition
Feb 10, 2016, Department of Linguistics
UCLA
The Computation of Language: Information processing
One way to think about the computation of language is from an information-processing standpoint.
Natural language processing: How do people and machines extract information about the world from the language data they encounter?
[Diagram: Input (“Isn’t that a nice kitty?”) → computation → internal representation → Output (“That… is not a dog.”; persuasion, surprise)]
Recent work on mindprints and writeprints: Linguistic feature-based “fingerprints” in text indicating mental states and identity.
One finding: While shallow linguistic features can mimic human performance at detecting some mental states, more sophisticated syntactic and semantic features in mindprints can allow classifiers to exceed human performance in some cases (Pearl & Steyvers 2010, 2013, Pearl & Enverga 2015; NIAAA, UCI, EU).
Another finding: We can use linguistically sophisticated writeprints to identify who wrote a particular document (Pearl & Steyvers 2012), and even which character written by the same author is currently being voiced in the text (Pearl, Lu, & Haghighi in press), though the writeprint features that matter differ between authors vs. between characters by the same author.
Language acquisition: How do children extract information about language from the language data they encounter?
Lidz & Gagliardi 2015: a sophisticated framework that makes explicit the different components of the acquisition process.
[Diagram: Input (lʊkətðəkɪɾi; “Where’s the kitty?”, “What’s that?”, “Do you see it?”) → computation → internal representation → Output ({look, at, the, kitty})]
Language acquisition: Methods of investigation
Theoretical methods: what knowledge of language is (and what children have to learn).
[Example: “LOOK at the KItty” (lʊkætðəkɪɾi) → look at | the kitty]
Experimental methods: when knowledge is acquired, what the input looks like, & plausible capabilities underlying how acquisition works.
p(H1 | data) ∝ p(data | H1) p(H1)
[Plot: performance as a function of age]
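The hypothesis comparison behind this kind of experimental work is Bayesian updating, p(H1 | data) ∝ p(data | H1) p(H1). A minimal sketch with made-up priors and likelihoods for two competing hypotheses:

```python
# Minimal Bayesian update over two hypotheses, p(H | data) ∝ p(data | H) p(H).
# All numbers are illustrative, not from any experiment.
priors = {"h1": 0.5, "h2": 0.5}          # p(H) before seeing the data point
likelihoods = {"h1": 0.2, "h2": 0.6}     # p(data | H) for one observed data point

unnormalized = {h: likelihoods[h] * priors[h] for h in priors}
total = sum(unnormalized.values())
posteriors = {h: v / total for h, v in unnormalized.items()}

print({h: round(v, 3) for h, v in posteriors.items()})  # {'h1': 0.25, 'h2': 0.75}
```

After one data point, h2 becomes three times as likely as h1; iterating this update over many data points is what the modeled learners below do.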
Computational methods: strategies for how children acquire knowledge, and sophisticated quantitative analysis of children’s input & output.
[Examples: dependencies (“What did…”), structure sequences (XP-YP-ZP…, start-XP-YP…), corpora of utterances (lʊkətðəkɪɾi)]
Language acquisition: Representation & Development
Language acquisition involves complex knowledge that builds on itself over the course of linguistic development, embedded in a developing cognitive system.
This means there’s a natural dependence between theories of knowledge representation and theories of knowledge development. (Lidz & Gagliardi 2015)
Language acquisition: Foundational knowledge
Examples of “foundational” processes that children use for building more sophisticated knowledge: speech segmentation, syntactic categorization. (Lidz & Gagliardi 2015)
[Diagram: phonology (“LOOK at the KItty”, lʊkætðəkɪɾi), syntax (look at | the kitty; VP Det N), semantics (look(me, the kitty))]
A recent finding: When the underlying representation (i.e., assumptions about language structure) is immature, immature processing capabilities may be helpful rather than harmful.
speech segmentation: Pearl, Goldwater and Steyvers 2010, 2011, Phillips and Pearl 2012, 2015b
A recent finding: Developing representations are often “good enough” for scaffolding other acquisition processes even when they don’t match adult representations (Pearl 2014, Pearl & Sprouse 2015, Pearl under review).
speech segmentation: Phillips and Pearl 2012, 2014a, 2014b, 2015a, 2015b, Pearl and Phillips under review, Phillips and Pearl under revision
syntactic categorization: Bar-Sever and Pearl 2016
Language acquisition: More sophisticated knowledge
Examples of more sophisticated knowledge that depends on the foundational knowledge: metrical stress. (Lidz & Gagliardi 2015)
A current finding: Some linguistic representations may be less acquirable from cognitively plausible child-directed input than previously assumed, unless certain learning biases are in place. (Pearl 2007, 2008, 2009, 2011, Pearl, Ho, & Detrano 2014, under review, Pearl under review)
Examples of more sophisticated knowledge that depends on the foundational knowledge: syntactic islands, English anaphoric one, where arguments appear syntactically. (Lidz & Gagliardi 2015)
A current finding: The knowledge needed to create the right acquisitional intake may not necessarily look like we thought it did (e.g., what’s in Universal Grammar).
syntactic islands: Pearl & Sprouse 2013a, 2013b, Pearl 2014, Pearl & Sprouse 2015, Pearl under review
English anaphoric one: Pearl 2007, Pearl & Lidz 2009, Pearl & Mis 2011, Pearl 2014, Pearl & Mis in press
where arguments appear: Pearl & Sprouse in progress
NSF: “Testing the Universal Grammar Hypothesis”, “An Integrated Theory of Syntactic Acquisition”
Today’s Plan
Characterizing learning problems precisely enough to informatively model them
UG modeling forays: investigating Universal Grammar (UG)
[Diagram: Universal Grammar = the innate + domain-specific region of {innate, derived} × {domain-specific, domain-general}]
[Tree: one = N′ in [NP a [N′ red [N′ [N0 bottle]]]]]
Motivating Universal Grammar
The argument from acquisition: one explicit motivation that highlights the natural link between linguistic representation and language acquisition.
Universal Grammar (UG) allows children to acquire knowledge about language as effectively and rapidly as they do (Chomsky 1980, Crain 1991, Hornstein & Lightfoot 1981, Lightfoot 1982b, Legate & Yang 2002, among many others).
What’s so hard about acquiring language? There seem to be induction problems, given the available data. (Poverty of the Stimulus, Logical Problem of Language Acquisition, Plato’s Problem)
[Diagram: data encountered → hypothesis 1 vs. hypothesis 2 → correct hypothesis]
So if the data themselves don’t pick out the right answer (and children all seem to), something internal to children must be guiding them.
If that something is both innate and domain-specific, we consider it part of Universal Grammar (UG) (Chomsky 1965, Chomsky 1975, Pearl & Sprouse 2013).
Motivating the contents of UG
Proposals have traditionally come from characterizing a specific acquisition problem for a particular linguistic phenomenon, and describing the (UG) solution to that specific characterization.
Structure-dependent rules (Chomsky 1980, Anderson & Lightfoot 2000; Fodor & Crowther 2002; Berwick et al. 2011; Anderson 2013)
Pirates who can dance can often fight well. → Can pirates who can dance __ often fight well?
Syntactic islands: Constraints on long-distance dependencies (Chomsky 1973, Huang 1982, Lasnik & Saito 1984, Pearl & Sprouse 2013a, 2013b, 2015)
Where did Jack think Lily bought the necklace from __?
*Where did Jack think the necklace from __ was too expensive?
English anaphoric one representation (Baker 1978, Pearl & Mis 2011, 2016)
Look – a red bottle! Do you see another one?
UG proposals: Generation & evaluation
How to generate a learning theory proposal: Characterize the learning problem precisely and identify a potential solution.
Benefit of computational modeling: We can make sure the learning problem is characterized precisely enough to implement. It’s not always obvious what pieces are missing until you try to build a model of the learning process. (Pearl 2014, Pearl & Sprouse 2015)
How to evaluate a learning theory proposal: See if it’s successful when embedded in a model of the acquisition process for that learning problem.
Recently, in computational modeling, we’ve seen the integration of rich hypothesis spaces with probabilistic/statistical learning mechanisms (Sakas & Fodor 2001, Yang 2004, Pearl 2011, Dillon et al. 2013, Pearl & Sprouse 2013, Pearl et al. 2014, Pearl & Mis 2016, among many others).
We’ve also seen the development of more sophisticated acquisition frameworks that highlight the precise role of UG (Lidz & Gagliardi 2015).
Example: UG determines what data from the perceived input are relevant (the acquisitional intake).
This computational modeling feedback helps us refine our theories about both the knowledge representation the learning theory relies on and the acquisition process that uses that representation.
How to decide if any components of the proposal are UG: Examine the components of the successful learning solution.
Are they necessarily both domain-specific and innate? Note: We may use “innate” as a placeholder until we can determine if it’s impossible to derive the relevant component (Pearl 2014, Pearl & Mis 2016).
UG proposal refinement: Recent successful forays
Syntactic islands (constraints on wh-dependencies): Pearl & Sprouse 2013a, 2013b, 2015
English anaphoric one: Pearl & Mis 2011, 2016
Recurring themes:
(1) Broadening the set of relevant data in the acquisitional intake
(2) Evaluating output by how useful it is
(3) Not necessarily needing the prior knowledge we thought we did
(Lidz & Gagliardi 2015)
Characterizing learning problems (Pearl & Sprouse 2015, Pearl & Mis 2016; Lidz & Gagliardi 2015)
Initial state: the initial knowledge state
ex: syntactic categories exist and can be identified (N, V, Adj, P, …)
ex: phrase structure exists and can be identified
ex: participant roles can be identified (Agent, Patient, Goal, …)
Learning biases & capabilities:
ex: frequency information can be tracked
ex: distributional information can be leveraged (e.g., sequences like start-IP-VP, IP-VP-CP, VP-NP-CPthat)
Initial state = initial knowledge state + learning biases & capabilities
Data intake: encoding + acquisitional intake = data perceived as relevant for learning (Fodor 1998, Lidz & Gagliardi 2015), defined by the knowledge & biases/capabilities in the initial state
ex: all wh-utterances for learning about wh-dependencies
ex: all pronoun data when learning about anaphoric one
ex: syntactic and conceptual data for learning syntactic knowledge that links with conceptual knowledge
Learning period: how long children have to reach the target knowledge state (when inference & iteration happen)
ex: 3 years, ~1,000,000 data points
ex: 4 months, ~36,500 data points
Target state: the knowledge children are trying to attain (as indicated by their behavior)
ex: *Where did Jack think the necklace from __ was too expensive? (z-score ratings)
ex: one is category N′ when it is not NP
ex: expectations of argument roles (The ice melted. The penguin swam.; doer vs. done-to; looking-time preferences)
Initial state: initial knowledge state + learning biases & capabilities
Data intake: data perceived as relevant for learning
Learning period: how long children have to learn
Target state: the knowledge children must attain
Once we have all these pieces specified, we should be able to implement an informative model of the learning process. (Pearl & Sprouse 2015, Pearl & Mis 2016; Lidz & Gagliardi 2015)
Informing UG (+ acquisition theory)
When we identify a successful learning strategy via modeling, this is an existence proof that children could solve that learning problem using the learning biases, knowledge, and capabilities comprising that strategy.
This identifies useful learning strategy components (knowledge, capabilities, and biases in the initial state), which we can then examine to see where they might come from: in Universal Grammar (innate + domain-specific), or derived / domain-general.
Syntactic islands (Pearl & Sprouse 2013a, 2013b, 2015)
• Why? Central to UG-based syntactic theories.
• What? Dependencies can exist between two non-adjacent items. They do not appear to be constrained by length (Chomsky 1965, Ross 1967), but rather by whether the dependency crosses certain structures (called “syntactic islands”).
What does Jack think __?
What does Jack think that Lily said that Sarah heard that Jareth believed __?
Wh … [CN1 … [CN2 … [CN3 … [CN4 … [CN5 … __ ]]
Some example islands:
Complex NP island: *What did you make [the claim that Jack bought __]?
Subject island: *What do you think [the joke about __] offended Jack?
Whether island: *What do you wonder [whether Jack bought __]?
Adjunct island: *What do you worry [if Jack buys __]?
Syntactic islands: Acquisition target
Adult knowledge as measured by acceptability judgment behavior. (Pearl & Sprouse 2013a, 2013b, 2015; Lidz & Gagliardi 2015)
Sprouse et al. (2012) collected magnitude estimation judgments for four different islands, using a factorial definition that controlled for two salient properties of island-crossing dependencies:
- length of dependency (matrix vs. embedded)
- presence of an island structure (non-island vs. island)
Complex NP islands:
Who __ claimed that Lily forgot the necklace? (matrix | non-island)
What did the teacher claim that Lily forgot __? (embedded | non-island)
Who __ made the claim that Lily forgot the necklace? (matrix | island)
*What did the teacher make the claim that Lily forgot __? (embedded | island)
Syntactic island = superadditive interaction of the two factors (additional unacceptability that arises when the two factors are combined, above and beyond the independent contribution of each factor).
[Plots: z-score rating for matrix vs. embedded dependencies, island vs. non-island structure; parallel lines = no island effect, non-parallel lines = island effect]
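The superadditive interaction just defined can be computed as a difference-in-differences over the 2×2 factorial design. A sketch using illustrative z-score means (not the published Sprouse et al. 2012 values):

```python
# Difference-in-differences (DD) score for the 2x2 island design.
# Factors: structure (non-island vs. island) x position (matrix vs. embedded).
# Illustrative z-score means only; not the published values.
z = {
    ("non-island", "matrix"): 1.0,
    ("non-island", "embedded"): 0.5,
    ("island", "matrix"): 0.8,
    ("island", "embedded"): -1.0,
}

# Cost of embedding the gap, inside vs. outside an island structure.
length_cost_non_island = z[("non-island", "matrix")] - z[("non-island", "embedded")]
length_cost_island = z[("island", "matrix")] - z[("island", "embedded")]

# Superadditivity: extra penalty beyond the independent factor contributions.
dd = length_cost_island - length_cost_non_island
print(dd > 0)  # True: non-parallel lines, i.e., an island effect
```

A DD score near 0 means the two lines are parallel (no island effect); a clearly positive DD score is the superadditive signature of an island.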
Sprouse et al. (2012): acceptability judgments from 173 adult subjects.
Superadditivity present for all islands tested = knowledge that dependencies cannot cross these island structures is part of adult knowledge about syntactic islands.
Importance for acquisition: This is one kind of target behavior that we’d like a learner to produce. (Pearl & Sprouse 2013a, 2013b, 2015; Lidz & Gagliardi 2015)
Syntactic islands: Representations
Subjacency (Chomsky 1973, Huang 1982, Lasnik & Saito 1984)
(1) A dependency cannot cross two or more bounding nodes.
Wh … [BN1 … [BN2 … __ ]]
Bounding nodes are language-specific (CP, IP, and/or NP; a learner must discover which ones are relevant for the language). {CP, IP, NP}?
Subjacency-ish (Pearl & Sprouse 2013a, 2013b, 2015)
(2) A dependency cannot cross a very low probability region of structure (represented as a sequence of container nodes).
Wh … [CN1 … [CN2 … [CN3 … [CN4 … [CN5 … __ ]]
Container node: a phrase structure node that contains the dependency, e.g. [CP What do [IP you [VP like __ [PP in this picture?]]]]
Low probability regions are language-specific (defined by sequences of container nodes that must be learned). Low probability?
In common: both representations rely on local structure anomalies (at some level).
Different: the amount of language-specific knowledge built in just for islands.
Subjacency: (i) dependencies defined over bounding nodes, so track those; (ii) bounding node = ?; (iii) 2+ bounding nodes = disallowed
Subjacency-ish: (i) dependencies defined over container node structure, which is tracked already; (ii) container node = ?; (iii) low probability = dispreferred
Pearl & Sprouse (2013a, 2013b, 2015) focused on evaluating the Subjacency-ish representation.
Syntactic islands: Subjacency-ish (Pearl & Sprouse 2013a, 2013b, 2015)
Subjacency-ish implementation: A dependency cannot cross a very low probability region of structure (represented as a sequence of container nodes).
Initial state:
(i) dependencies defined over container node structure
(ii) container nodes recognized
(iii) track the probability of short container node sequences (trigrams)
Subjacency-ish: Initial state implementation
Because wh-dependencies are perceived as sequences of container nodes, local pieces of dependency structure can be characterized by container node trigrams.
[CP Who did [IP she [VP think [CP [IP [NP the gift] [VP was [PP from __ ]]]]]]]? = start-IP-VP-CPnull-IP-VP-PP-end
A child learns about the frequency of container node trigrams:
+1 start-IP-VP
+1 IP-VP-CPnull
…
Subjacency-ish: Developing knowledge
…and at the end of the learning period, the child has a sense of the probability of any given container node trigram, based on its relative frequency (e.g., start-IP-VP, IP-VP-CPnull, start-IP-end, IP-VP-CPif, IP-NP-PP).
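The trigram-frequency tracking described above can be sketched as follows; the toy container-node sequences and counts are illustrative, not the actual child-directed input:

```python
from collections import Counter

# Each wh-dependency is a sequence of container nodes, padded with start/end.
# Toy intake; the real input would be thousands of child-directed wh-dependencies.
dependencies = [
    ["start", "IP", "end"],                                    # What __ happened?
    ["start", "IP", "VP", "end"],                              # What did you see __?
    ["start", "IP", "VP", "end"],
    ["start", "IP", "VP", "CPnull", "IP", "VP", "PP", "end"],  # Who did she think the gift was from __?
]

trigram_counts = Counter()
for dep in dependencies:
    for i in range(len(dep) - 2):
        trigram_counts[tuple(dep[i:i + 3])] += 1

# Relative frequency as a stand-in for trigram probability (no smoothing yet).
total = sum(trigram_counts.values())
trigram_probs = {tri: c / total for tri, c in trigram_counts.items()}

print(trigram_counts[("start", "IP", "VP")])  # 3
```

Frequent local pieces of structure (like start-IP-VP) end up with high probability; pieces the child never hears stay at zero until smoothing is applied.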
Any wh-dependency can then have a probability, based on the product of the smoothed probabilities of its trigrams.
Who did she think the gift was from __?
P(start-IP-VP-CPnull-IP-VP-PP-end) = p(start-IP-VP) × p(IP-VP-CPnull) × p(VP-CPnull-IP) × p(CPnull-IP-VP) × p(IP-VP-PP) × p(VP-PP-end)
This allows the modeled learner to generate judgments about the grammaticality of any dependency: higher probability dependencies are more grammatical, while lower probability dependencies are less grammatical (e.g., start-IP-VP-CPnull-IP-VP-PP-end vs. start-IP-VP-CPif-IP-VP-end). (Pearl & Sprouse 2013a, 2013b, 2015; Lidz & Gagliardi 2015)
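The scoring step can be sketched as a product of smoothed trigram probabilities. The add-alpha smoothing scheme, the vocabulary size, and the counts below are illustrative assumptions, not Pearl & Sprouse's exact implementation:

```python
from collections import Counter
import math

# Toy container-node trigram counts from a learner's intake (illustrative).
counts = Counter({
    ("start", "IP", "VP"): 300, ("IP", "VP", "end"): 250,
    ("IP", "VP", "CPnull"): 40, ("VP", "CPnull", "IP"): 40,
    ("CPnull", "IP", "VP"): 40, ("IP", "VP", "PP"): 30,
    ("VP", "PP", "end"): 30, ("start", "IP", "end"): 120,
})
total = sum(counts.values())
VOCAB = 50  # assumed number of possible trigram types, for smoothing

def p_trigram(tri, alpha=0.5):
    # Add-alpha smoothing: unseen trigrams get a small nonzero probability.
    return (counts[tri] + alpha) / (total + alpha * VOCAB)

def dependency_prob(nodes):
    # Probability of a dependency = product of its trigram probabilities.
    tris = [tuple(nodes[i:i + 3]) for i in range(len(nodes) - 2)]
    return math.prod(p_trigram(t) for t in tris)

licit = dependency_prob(["start", "IP", "VP", "CPnull", "IP", "VP", "PP", "end"])
island = dependency_prob(["start", "IP", "VP", "CPif", "IP", "VP", "end"])
print(licit > island)  # True: the island-crossing dependency scores far lower
```

The island-crossing dependency is penalized because its CPif trigrams were never attested, so each contributes only the tiny smoothed probability; these per-dependency scores are what stand in for acceptability judgments below.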
Subjacency-ish input & intake: A dependency cannot cross a very low probability region of structure (represented as a sequence of container nodes).
Data intake: defined by the initial state = all wh-dependencies in child-directed speech, as characterized by container nodes.
But which wh-dependencies? Just the ones being evaluated in the target state?
No! Any wh-dependency has relevant information about the container node trigrams used to determine the grammaticality of wh-dependencies in general.
Corpora (Brown-Adam, Brown-Eve, Suppes, Valian) from CHILDES: 101,838 utterances containing 20,923 wh-dependencies.
76.7%  What did you see __?
12.8%  What __ happened?
5.6%   What did she want to do __?
2.5%   What did she read from __?
1.1%   What did she think he said __?
…
Learning period: defined by empirical estimates from Hart & Risley (1995) (~3 years of data) = ~200,000 wh-dependency data points.
Target state: behavioral evidence of syntactic islands knowledge.
[Plots: z-score rating for matrix vs. embedded dependencies, island vs. non-island structure]
Non-parallel lines indicate superadditivity, which indicates knowledge of islands.
But how do we get acceptability judgment equivalents?
For each set of island stimuli from Sprouse et al. (2012), we generate grammaticality preferences for the modeled learner based on each dependency’s perceived probability, and use this as a stand-in for acceptability.
Subjacency-ish: Success!
Superadditivity observed for all four islands: the qualitative behavior suggests that this learner has knowledge of these syntactic islands.
[Figure: learner-generated ratings for matrix vs. embedded gaps, island vs. non-island structures, for the four island types: Complex NP, Subject, Adjunct, Whether.]
The Subjacency-ish representation that relies on container node trigram probabilities can solve this learning problem.
Pearl & Sprouse 2013a, 2013b, 2015
Subjacency-ish: Takeaway
Representation validation: If dependencies are represented as container node sequences, acquisition works well for these four syntactic islands.
Wh … [CN1 … [CN2 … [CN3 … [CN4 … [CN5 … __ ]]]]]
Pearl & Sprouse 2013a, 2013b, 2015
Subjacency-ish vs. Subjacency: What's in UG?
UG = innate + domain-specific
Subjacency: Wh … [BN1 … [BN2 … __ ]]
- Attend to bounding nodes (BNs): innate, domain-specific
- Dependencies crossing 2+ BNs are not allowed: innate, domain-specific
Subjacency-ish: Wh … [CN1 … [CN2 … [CN3 … [CN4 … [CN5 … __ ]]]]]
- Attend to container nodes of a particular kind: innate or derived (?), domain-specific
- Low probability items are dispreferred: innate, domain-general
Fewer pieces of knowledge necessarily in UG, plus an empirically-motivated alternative proposal for one component.
Pearl & Sprouse 2013a, 2013b, 2015
Recurring themes: Syntactic islands
Informing theories of representation & acquisition
Recurring themes, as seen in syntactic island acquisition:
(1) Broadening the set of relevant data in the acquisitional intake to include all wh-dependencies
(2) Evaluating output by how useful it is for generating acceptability judgment behavior
(3) Not necessarily needing the prior knowledge we thought we did in UG: container nodes rather than bounding nodes, no domain-specific constraint on length
Lidz & Gagliardi 2015
Pearl & Sprouse 2013a, 2013b, 2015
Open questions
This learning strategy relying on the Subjacency-ish representation for wh-dependencies makes some developmental predictions. Can we verify these experimentally?
"that-trace" effect prediction: Children initially disprefer all dependencies containing that, even ones adults allow, due to the infrequency of container node trigrams with CP-that in child-directed speech.
Subject extraction: *Who do you think that __ read the book? / Who do you think __ read the book?
Object extraction: What do you think that he read __? / What do you think he read __?
Pearl & Sprouse 2013a, 2013b, 2015
How does this learning strategy for wh-dependencies measure up cross-linguistically?
Island effects vary. Ex: Italian does not have a subject island effect when the wh-dependency is part of a relative clause, though it does when the wh-dependency is part of a question (Sprouse et al. in press).
Would the input naturally lead the Subjacency-ish learner to this distinction?
Pearl & Sprouse 2013a, 2013b, 2015
Open questions
Can we extend this learning strategy to create an integrated theory of syntactic acquisition?
Related phenomena: The distribution of gaps
Parasitic gaps: Dependencies that span an island (and so should be ungrammatical) but which are somehow rescued by another dependency in the utterance.
Adjunct island:
*Which book did you laugh [before reading __]?
Which book did you judge __ true [before reading __parasitic]?
Pearl & Sprouse 2013a, 2013b, 2015
Open questions
Can we extend this learning strategy to create an integrated theory of syntactic acquisition?
Related phenomena: The distribution of gaps
Across-the-board (ATB) extraction: Similar situation.
Coordinate structure island:
Which book did you [[read __] and [then review __]]? (dependency for both gaps: IP-VP-VP)
*Which book did you [[read the paper] and [then review __]]? (dependency for gap: IP-VP-VP)
*Which book did you [[read __] and [then review the paper]]? (dependency for gap: IP-VP-VP)
Pearl & Sprouse 2013a, 2013b, 2015
Open questions
Can we extend this learning strategy to create an integrated theory of syntactic acquisition?
Semi-related phenomena: Binding dependencies
There don't appear to be the same restrictions on binding dependencies that there are on wh-dependencies.
The boy thought the joke about himself was really funny.
*Who did the boy think [the joke about __] was really funny? (Subject island)
Pearl & Sprouse 2013a, 2013b, 2015
Today’sPlan
Characterizinglearningproblemspreciselyenoughtoinformativelymodelthem
UGmodelingforays
InvestigatingUniversalGrammar(UG)in
UniversalGrammar
inindomain-specific
domain-general
innatederived
NP
N’det
a adj
red
N’
N0
bottle
one=
English anaphoric one
Pearl & Mis 2011, Pearl & Mis 2016
• Why? A traditional poverty-of-the-stimulus problem used to motivate specific proposals for the contents of UG.
• What? "Look - a red bottle! Do you see another one?"
Process of interpretation: First determine the linguistic antecedent of one (what expression one is referring to) based on its syntactic category. → antecedent of one = "red bottle"
Process of interpretation: Because the antecedent ("red bottle") includes the modifier "red", the property RED is important for the referent of one to have. → referent of one = RED BOTTLE
Two steps: (1) Identify the linguistic antecedent (based on one's syntactic category); (2) Identify the referent (based on the linguistic antecedent).
English anaphoric one: Acquisition target
Lidz & Gagliardi 2015
Pearl & Mis 2011, Pearl & Mis 2016
"Look - a red bottle! Do you see another one?"
Adult knowledge
Standard linguistic theory (Chomsky 1970, Jackendoff 1977) has posited that one in these kinds of utterances is a syntactic category smaller than an entire noun phrase (NP), but larger than just a noun (N0). This category has been called N', and it includes strings like "bottle" and "red bottle".
Because one is thought to be this same category (N'), available adult interpretations for one include both "Do you see another bottle?" and "Do you see another red bottle?" Additional preferences allow adults to choose the appropriate interpretation from these options in context.
Syntactic category of one in this utterance = N'. Referent of one can be the object that contains the property in the modifier (RED BOTTLE).
English anaphoric one: Acquisition target
Child knowledge as measured by looking-time behavior
Lidz & Gagliardi 2015
Pearl & Mis 2011, Pearl & Mis 2016
Child behavior at 18 months (Lidz et al. 2003): "Look - a red bottle! … Now look…"
Control/Noun: "What do you see now?" / "Do you see another bottle?" Baseline novelty preference: children prefer to look at the novel bottle (average probability of looking to the same-color bottle: 0.459).
Anaphoric/Adjective-Noun: "Do you see another one?" (= "Do you see another red bottle?") Adjusted familiarity preference: children prefer to look at the same-color bottle (average probability of looking to the same-color bottle: 0.587).
Developed knowledge according to Lidz et al. 2003: 18-month-olds interpret one's antecedent as "red bottle" (an N') and its referent as the RED BOTTLE.
Target state for acquisition: knowledge and behavior.
English anaphoric one: Representations
Proposed solutions for necessary knowledge & learning biases
Lidz & Gagliardi 2015
Pearl & Mis 2011, Pearl & Mis 2016
Things in common:
Syntactic categories exist (particularly NP, N', and N0), and can be recognized.
Anaphoric elements like one take linguistic antecedents of the same category.
Things that differ:
Which input is considered relevant from the perceptual intake = the acquisitional intake
Baker (1978): One that won't work = DirUnamb
Only utterances directly using one are relevant for learning about anaphoric one.
Only utterances where one's antecedent is unambiguous are relevant.
DirUnamb: a specific combination of utterance and situation. "Look - a red bottle! Hmmm - there doesn't seem to be another one here, though."
Why won't it work? The direct unambiguous data are too sparse. There's nothing to learn from.
Pearl & Mis 2011, 2016 affirmation: 0 examples in the 17,521 utterances in the Brown-Eve corpus (Brown 1973) from CHILDES.
Baker (1978): One that could work = DirUnamb+N'
Only utterances directly using one are relevant for learning about anaphoric one.
Only utterances where one's antecedent is unambiguous are relevant.
Children already know that one can't be N0, so it must be N'. This solves the problem of one's syntactic category. (UG knowledge)
Pearl & Lidz 2009: One that doesn't work = DirEO
Only utterances directly using one are relevant for learning about anaphoric one.
Use probabilistic inference to leverage ambiguous information about one.
All ambiguous data are relevant (Equal Opportunity).
DirRefSynAmb: Ambiguous about whether the antecedent is "bottle" (N0, N') or "red bottle" (N'). "Look - a red bottle! Oh, look - another one!" (0.66% of utterances containing a pronoun in the Brown-Eve corpus)
DirSynAmb: Ambiguous about the antecedent's category (bottle = N0, N'). "Look - a bottle! Oh, look - another one!" (7.52% of utterances containing a pronoun in the Brown-Eve corpus)
The DirSynAmb data turn out to be harmful to learning: they cause the learner to think one's category should be N0.
Pearl & Lidz 2009, Regier & Gahl 2004: One that does work for target knowledge = DirFiltered
Only utterances directly using one are relevant for learning about anaphoric one.
Use probabilistic inference to leverage ambiguous information about one.
Filter out the harmful DirSynAmb data.
Pearl & Mis 2011, 2016: One that could work = IndirPro
Utterances directly using one are relevant for learning about anaphoric one.
Use probabilistic inference to leverage ambiguous information about one.
Utterances using other pronouns anaphorically are also relevant for learning about anaphoric one. This is indirect evidence coming from other pronouns.
IndirUnamb: Relevant because it indicates whether the antecedent includes the mentioned property (here, it always does), which is helpful when choosing between different interpretation options in other contexts. "Look - a red bottle! I want one/it." → a red bottle (8.42% of utterances containing a pronoun in the Brown-Eve corpus)
Learning proposal comparisons: Successful? (evaluated against both the target representations and the target behavior)
English anaphoric one: Data intake
Data intake: the data relevant for learning
Lidz & Gagliardi 2015
Pearl & Mis 2011, Pearl & Mis 2016
Data potentially in the acquisitional intake
English anaphoric one: Learning period
Learning period: how long children have to learn = how much data
Lidz & Gagliardi 2015
Pearl & Mis 2011, Pearl & Mis 2016
Before this learning process can begin, children need to know something about syntactic categories. Experimental data from Booth & Waxman (2003) suggest they recognize linguistic markers of categories like Noun and Adjective around 14 months.
Beginning: 14 months
The experimental data from Lidz et al. (2003) suggest they should reach the knowledge state that generates that observable behavior by 18 months.
Beginning: 14 months. End: 18 months.
Using empirical estimates from Hart & Risley (1995), we can estimate this as approximately 36,500 data points containing an anaphoric pronoun (= 4 months' worth of data).
English anaphoric one: Target state
Target state: knowledge and behavior. 18-month-olds interpret one's antecedent as "red bottle" (an N') and its referent as the RED BOTTLE (Lidz et al. 2003), preferring to look at the same-color bottle (0.587) in the anaphoric condition but at the novel bottle in the control condition (0.459 to same color).
English anaphoric one: Learning process
Pearl & Mis 2011, Pearl & Mis 2016
Lidz & Gagliardi 2015
Model of understanding a referential expression involving an anaphoric pronoun, which includes both syntactic information and referential information when determining the antecedent, which then picks out the referent. Update & iteration of the developing grammar:
pN' = probability that one's category is N' (vs. N0)
pincl = probability that one's antecedent includes the mentioned modifier (e.g., red) vs. not
dx = probability that a data point indicates this
Dx = 1 for every data point encountered
pbeh = probability of producing the target behavior (looking to the same-color bottle)
prep|beh = probability of having the target representation (antecedent = "red bottle") when producing the target behavior (looking to the same-color bottle)
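The update rule implied by these quantities (each data point contributes its indicative strength dx toward a hypothesis, against a running total that grows by Dx = 1 per data point) can be sketched as below. The class name, the uniform prior, and the dx values are illustrative assumptions, not the exact Pearl & Mis implementation:

```python
class OneLearner:
    """Minimal probabilistic learner for anaphoric 'one', tracking
    pN' (one's category is N' vs. N0) and pincl (one's antecedent
    includes the mentioned modifier), with a uniform beta-style prior."""

    def __init__(self, prior=1.0):
        self.n_prime = prior          # accumulated evidence for category N'
        self.incl = prior             # accumulated evidence for property inclusion
        self.total_cat = 2 * prior    # prior mass over both category hypotheses
        self.total_incl = 2 * prior   # prior mass over both inclusion hypotheses

    def update(self, d_n_prime, d_incl):
        # d_x in [0, 1] = how strongly this data point indicates the
        # hypothesis; every data point adds D_x = 1 to the relevant total.
        self.n_prime += d_n_prime
        self.total_cat += 1.0
        self.incl += d_incl
        self.total_incl += 1.0

    @property
    def p_n_prime(self):
        return self.n_prime / self.total_cat

    @property
    def p_incl(self):
        return self.incl / self.total_incl

learner = OneLearner()
# An IndirUnamb-style data point ("I want one" after "a red bottle"):
# unambiguous that the antecedent includes the property, uninformative
# about one's category (illustrative d_x values).
learner.update(d_n_prime=0.5, d_incl=1.0)
print(round(learner.p_incl, 3))  # 0.667
```

Accumulating many such data points pushes pincl toward 1.0, which is the pattern the IndirPro learner shows in the results below.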
English anaphoric one: Learning results
Pearl & Mis 2011, Pearl & Mis 2016
(Note: Target pbeh = 0.587; all other target p = 1.000. Averages over 1000 simulations, standard deviations in parentheses.)
DirUnamb: pN' = 0.500 (<0.01), pincl = 0.500 (<0.01), pbeh = 0.475 (<0.01), prep|beh = 0.158 (<0.01)
A learner who only looks at direct unambiguous data has no data to learn from, so it learns nothing. (Poverty of the stimulus.) It's at chance for having the target syntactic and referential knowledge necessary to choose the correct antecedent. It doesn't generate the observed toddler looking preference, and it's unlikely to have the target representation if it looks at the familiar bottle.
Implication: Something else is needed. (Baker (1978)'s original observation.)
What if the learner also knows that one is category N'? (Baker 1978)
DirUnamb+N': pN' = 1.000, pincl = 0.500 (<0.01), pbeh = 0.492 (<0.01), prep|beh = 0.306 (<0.01)
This learner still has no data to learn from, so it learns nothing about the correct referential knowledge necessary to choose the correct antecedent. This lack of referential knowledge causes it not to generate the observed toddler looking preference in context and, even if it happens to look at the familiar bottle, to be unlikely to have the target representation when doing so.
Implication: Knowing one is category N' isn't sufficient to generate the target behavior if only direct unambiguous data are relevant.
The DirFiltered learner (Regier & Gahl 2004, Pearl & Lidz 2009) believes one is N' when it is smaller than NP and that a mentioned property should be included in the antecedent, as found previously.
DirFiltered: pN' = 0.991 (<0.01), pincl = 0.963 (<0.01), pbeh = 0.574 (<0.01), prep|beh = 0.918 (<0.01)
It's also close to generating the observed toddler looking preference, and it is likely to have the target representation when looking at the familiar bottle.
Implication: This new finding suggests this is a pretty successful learning strategy for matching the available behavioral data.
The DirEO learner (explored by Pearl & Lidz 2009) prefers one to be N0 when it is smaller than NP, and does not believe the mentioned property should be included in the antecedent. Neither of these is the target knowledge.
DirEO: pN' = 0.246 (0.03), pincl = 0.379 (0.05), pbeh = 0.464 (<0.01), prep|beh = 0.050 (0.01)
This causes the learner not to generate the observed toddler looking preference, and not to have the target representation if it looks at the familiar bottle.
Implication: This new finding suggests this isn't a good learning strategy for matching the available behavioral data.
IndirPro: pN' = 0.368 (0.04), pincl = 1.000 (<0.01), pbeh = 0.587 (<0.01), prep|beh = 0.998 (<0.01)
The IndirPro learner robustly decides the antecedent should include the mentioned property. However, this learner has a moderate dispreference for believing one is N' when it is smaller than NP. This isn't the target representation with respect to syntactic category.
However… this learner still generates the observed toddler looking preference perfectly, and it has the target representation when looking at the familiar bottle.
Why? The learner believes very strongly that the mentioned property must be included in the antecedent. Only one antecedent allows this: [N' red [N' [N0 bottle]]]
So, because the antecedent includes the mentioned property, it and the pronoun referring to it (one) must be N' in this context, even if the learner believes one is not N' in general.
Implication: A learner viewing other pronoun data as relevant can generate the target behavior without necessarily reaching the target knowledge state. Instead, this learner has a context-sensitive representation (depending on whether a property was mentioned).
Pearl & Mis 2011, Pearl & Mis 2016
Let's look at the strategies that worked and see what the implications are for Universal Grammar, as compared to the original UG proposal by Baker that didn't work.
Summary (Note: Target pbeh = 0.587; all other target p = 1.000. Averages over 1000 simulations, standard deviations in parentheses.):
DirUnamb: pN' = 0.500 (<0.01), pincl = 0.500 (<0.01), pbeh = 0.475 (<0.01), prep|beh = 0.158 (<0.01)
DirUnamb+N': pN' = 1.000, pincl = 0.500 (<0.01), pbeh = 0.492 (<0.01), prep|beh = 0.306 (<0.01)
DirFiltered: pN' = 0.991 (<0.01), pincl = 0.963 (<0.01), pbeh = 0.574 (<0.01), prep|beh = 0.918 (<0.01)
DirEO: pN' = 0.246 (0.03), pincl = 0.379 (0.05), pbeh = 0.464 (<0.01), prep|beh = 0.050 (0.01)
IndirPro: pN' = 0.368 (0.04), pincl = 1.000 (<0.01), pbeh = 0.587 (<0.01), prep|beh = 0.998 (<0.01)
English anaphoric one: Learning results
Lidz & Gagliardi 2015
Pearl & Mis 2011, Pearl & Mis 2016
English anaphoric one: Strategy components
[Diagram: the components of the DirUnamb+N', DirFiltered, and IndirPro strategies, classified as innate vs. derived and domain-specific vs. domain-general (innate + domain-specific = in Universal Grammar). Components: syntactic categories; antecedent = same category; probabilistic inference; direct positive evidence (DirUnamb+N': only unambiguous data, one ≠ N0; DirFiltered: filter out DirSynAmb; IndirPro: + indirect evidence = pronouns).]
Things in common: It may be possible to derive the domain-specific knowledge of the specific syntactic categories needed using distributional clustering techniques over words… but that remains to be shown. Some innate knowledge may be necessary (UG).
Things in common: It may be possible to derive the domain-specific knowledge that anaphoric antecedents are the same category by observing the category of antecedents that are unambiguous… but that remains to be shown. Some innate knowledge may be necessary (UG).
Things in common: It seems likely that the preference to consider direct positive evidence relevant is innate and domain-general.
Things in common: Similarly, the preference to use probabilistic inference to leverage the information in ambiguous data seems likely to be innate and domain-general. While this is a new strategy component, it’s unlikely to be part of UG.
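The probabilistic-inference component can be illustrated with a minimal sketch (a generic illustration of Bayesian updating, not the actual Pearl & Mis model; the likelihood values are invented): ambiguous data still shift a learner’s belief between two hypotheses whenever the hypotheses assign those data different likelihoods, even though neither hypothesis rules the data out.

```python
# Minimal sketch of probabilistic inference over ambiguous data.
# NOT the Pearl & Mis model: hypotheses and likelihoods are invented
# purely to show how ambiguous evidence can accumulate.

def update(prior_h1, likelihood_h1, likelihood_h2):
    """One step of Bayes' rule over two competing hypotheses."""
    joint_h1 = prior_h1 * likelihood_h1
    joint_h2 = (1.0 - prior_h1) * likelihood_h2
    return joint_h1 / (joint_h1 + joint_h2)

# Hypothetical setup: H1 ("one" = N') makes each ambiguous utterance
# slightly more expected than H2 ("one" = N0) does.
p_h1 = 0.5  # start with no preference between hypotheses
for _ in range(20):  # 20 ambiguous data points
    p_h1 = update(p_h1, likelihood_h1=0.6, likelihood_h2=0.4)

print(round(p_h1, 4))  # belief in H1 ends up close to 1
```

Even though every datum is individually compatible with both hypotheses, the small per-datum likelihood difference compounds across the input, which is the sense in which ambiguous data carry usable information.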
Old unsuccessful proposal: The domain-specific knowledge that one is not category N0 was thought to be innate and so part of UG.
Old unsuccessful proposal: The preference to rely only on unambiguous evidence might be innate, but could well be domain-general and so not part of UG.
Successful DirFiltered proposal: The domain-specific preference to filter out data where only the syntactic category is uncertain (while the referent is clear) may be innate and so part of UG, or it may be derived from an innate, domain-general preference to learn in cases of uncertainty (Pearl & Lidz 2009).
DirSynAmb: Ambiguous about antecedent category (bottle = N0, N’). “Look – a bottle! Oh, look – another one!”

Successful DirFiltered proposal: For the domain of language, uncertainty in communication would be what matters. Utterances where only the syntactic category is uncertain may be “good enough” for communication purposes since the referent is clear. So, children are unconcerned about improving linguistic knowledge about these utterances and ignore them.
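The DirFiltered intake can be sketched in a few lines (a hypothetical illustration with invented field names, not the actual model implementation): data points whose referent is clear but whose syntactic category is ambiguous are dropped before any learning update happens.

```python
# Hypothetical sketch of the DirFiltered intake filter. The dict keys
# ("referent_clear", "category_ambiguous") are invented for illustration.

def filter_intake(data):
    """Drop data where ONLY the syntactic category is uncertain
    (DirSynAmb); keep everything else for learning."""
    return [d for d in data
            if not (d["referent_clear"] and d["category_ambiguous"])]

data = [
    {"id": "unambiguous", "referent_clear": True, "category_ambiguous": False},
    {"id": "DirSynAmb", "referent_clear": True, "category_ambiguous": True},
    {"id": "referent-unclear", "referent_clear": False, "category_ambiguous": True},
]
kept = filter_intake(data)
print([d["id"] for d in kept])  # DirSynAmb is filtered out
```

The key design point is that the filter keys off communicative uncertainty: a DirSynAmb datum is “good enough” for communication, so the learner never treats it as a learning opportunity.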
Successful IndirPro proposal: The domain-specific knowledge to consider other pronouns relevant may be innate and so part of UG, or it may derive from an overhypothesis (Kemp et al. 2007) the learner forms about the similarity of one with other anaphoric pronouns in terms of their distribution.
Both successful proposals: The new components required may not necessarily need to be built into UG. However, if they are, they are less specific knowledge than the previous proposal supposed (which didn’t actually capture children’s behavior anyway).
Some open questions: For each component that may be derivable from the input, can we create a learner that can actually derive that component from the available linguistic information? And if so, what are the learning components required to do so?
Some open questions: How general-purpose are these learning components? Are the components we find useful for making syntactic generalizations about anaphoric one useful for making other syntactic generalizations? What about other linguistic generalizations? Or other non-linguistic generalizations?
Recurring themes: English anaphoric one
Informing theories of representation & acquisition (Lidz & Gagliardi 2015)

Recurring themes:
(1) Broadening the set of relevant data in the acquisitional intake to include all pronouns
(2) Evaluating output by how useful it is for generating toddler looking-time behavior
(3) Not necessarily needing the prior knowledge we thought we did in UG: a “good enough” derived data filter or a derived overhypothesis about pronouns, rather than specific knowledge about syntactic category

Pearl & Mis 2011, Pearl & Mis 2016
Big picture: Understanding how children acquire syntactic knowledge

If we precisely define the components of any acquisition task by drawing on the insights from different methodologies (theoretical, experimental, and computational; Lidz & Gagliardi 2015), we can make progress on how children solve that acquisition task.

In particular, we can understand the nature of children’s language acquisition toolkit: what fundamental building blocks they use, and what is (or is not) part of Universal Grammar.
This technique is a useful tool, so let’s use it to inform our theories of representation and acquisition!

Thank you! Jon Sprouse, Benjamin Mis, Greg Carlson, Lou Ann Gerken, Jeff Lidz

Computational Models of Language Learning seminar, UC Irvine 2010. Audiences at: CogSci 2011, UChicago 2011 workshops on Language, Cognition, and Computation & Language, Variation, and Change, Input & Syntactic Acquisition Workshop 2012, UMaryland Mayfest 2012, New York University Linguistics colloquium 2012, Stanford Cognition & Language Workshop 2013, GALANA 2015.

This work was supported in part by NSF grants BCS-0843896 and BCS-1347028.