Syntactic Language Processing through Hierarchical Heteroclinic Networks
Workshop on Heteroclinic Dynamics in Neuroscience, University of Nice, December 18, 2015


Peter beim Graben
Bernstein Center for Computational Neuroscience, Humboldt-Universität zu Berlin

Outline
- garden path sentences
- limited repair parsing
- sequence generation in neural networks
- limited repair through bifurcations
- conclusions
- outlook

Garden Path Sentences
- "the horse raced past the barn fell." (Bever, 1970)
- Garden Path Theory: Frazier & Rayner (1982), Frazier (1987); late closure
- Diagnosis: Fodor & Inoue (1994)
- misparse: "the horse raced past." / "the barn fell." ?
- lexicon: raced = (1) finite verb, past tense; raced = (2) non-finite verb, past participle
- symptom: the preferred finite-verb reading fails when "fell" arrives

Experimental Findings
- self-paced reading: slowing down
- eye-tracking: regressions, longer fixations
- event-related brain potentials: P600
(beim Graben et al., 2008; Schwappach et al., 2015)

Garden Path Variants
- heavy garden paths: "the horse raced past the barn fell." (Bever, 1970)
- mild garden paths: "Tad knows Shaq is tall." (Lewis, 1998)

Limited Repair Parsing (Lewis, 1998)
- "Tad knows Shaq." (default reading)
- "Tad knows Shaq is tall." → snip
- "Tad knows Shaq is tall." → link

Decision Space (Lewis, 1998; Hale, 2011)
- alternatives: "Tad knows Shaq" vs. "Tad knows Shaq is tall"
- paths through the space: garden path, limited repair, reduced relative
- parser states are vertices in representation space

State Descriptions
- attraction and repulsion between states

Dynamics
- fixed points: attractor, saddle (stable and unstable directions)
- identify parser states with equilibrium points in representation space

Garden Path Dynamics
- the garden path is an unwanted attractor

Repair by Bifurcation
- the unwanted attractor is removed by a bifurcation (beim Graben et al., 2004)
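The "repair by bifurcation" idea can be illustrated with a minimal one-dimensional sketch (the flow, parameter names, and time constants below are illustrative assumptions, not the model from the talk): a control parameter a separates an unwanted attractor from the desired state, and letting a decay, as attention does in the model, annihilates the unwanted attractor and releases the trajectory.

```python
import math

def flow(x, a):
    # Illustrative 1-D flow: for 0 < a < 1 there are stable fixed points at
    # x = 0 (the unwanted attractor) and x = 1 (the desired state),
    # separated by a saddle at x = a.  As a -> 0 the saddle collides with
    # x = 0 and the unwanted attractor loses stability (a bifurcation).
    return x * (x - a) * (1.0 - x)

def simulate(x0=0.05, a0=0.2, tau=None, dt=0.01, steps=20000):
    """Euler integration; if tau is set, a decays exponentially,
    mimicking the decay of attention that triggers the bifurcation."""
    x, a = x0, a0
    for k in range(steps):
        if tau is not None:
            a = a0 * math.exp(-k * dt / tau)
        x += dt * flow(x, a)
    return x

print(round(simulate(), 3))         # constant a: trapped near x = 0
print(round(simulate(tau=5.0), 3))  # decaying a: released toward x = 1
```

With constant attention the state stays trapped in the garden-path attractor; once the control parameter has decayed below the state's position, the origin repels and the trajectory converges to the desired state, which is the bifurcation picture sketched on the slide.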
Neural Network Architecture (Afraimovich et al., 2004; Fukai & Tanaka, 1997; Kiebel et al., 2009; Haken, 1991)
- hierarchy of three levels of generalized Lotka-Volterra systems:
  - 1st level: localist representation of 24 parser states
  - 2nd level: 3 parsing strategies
  - 3rd level: 3 attention parameters
- sequence generation through winnerless competition at levels 1 and 2
- garden path as undesired attractor
- repair through a bifurcation elicited by decay of attention

Results
[six figure slides contrasting the garden path, limited repair, and reduced relative parses, showing stabilization of a strategy, symptom and destabilization, and transition to a new strategy, twice each]

Dynamical Automata
- symbologram representation of automata states
- order parameter expansion
- open problems: determinism; every pair of rationals is a representational state; infinite-dimensionality

Conclusions
- syntactic language parsing is realized through sequential winnerless competition in neural population networks
- processing strategies and their transitions are realized through control parameters provided by higher levels in a network hierarchy
- processing strategies are stabilized and destabilized by attention parameters provided by even higher levels in the network hierarchy
- limited repair of syntactic garden paths is realized through decaying attention and subsequent bifurcations

Outlook
- neurophysiological evidence?
- neural population models?
- continuous-time neural dynamics?
- automata-theoretic description of limited repair parsing: nested stack automata and index grammars?

Thank you for your attention!
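The winnerless competition that generates state sequences at levels 1 and 2 can be sketched with a three-state generalized Lotka-Volterra system (the coupling matrix, rates, and noise level below are illustrative, not the talk's 24-state model): an asymmetric inhibition matrix turns the three single-state saddles into an attracting heteroclinic cycle, and a tiny positive perturbation makes the trajectory hop along it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Asymmetric lateral inhibition (May-Leonard type): each state inhibits its
# predecessor strongly (2.0) and its successor weakly (0.5), turning the
# saddles at the three corners into a heteroclinic cycle 0 -> 1 -> 2 -> 0.
rho = np.array([[1.0, 2.0, 0.5],
                [0.5, 1.0, 2.0],
                [2.0, 0.5, 1.0]])
sigma = np.ones(3)  # growth rates; in the talk these are the control
                    # parameters supplied by the level above

def winnerless_sequence(T=300.0, dt=0.01, noise=1e-6):
    """Euler-integrate dx_i/dt = x_i (sigma_i - sum_j rho_ij x_j) with a
    small positive perturbation, and return the sequence of winning states."""
    x = np.array([0.9, 0.05, 0.05])
    seq = [0]
    for _ in range(int(T / dt)):
        x = x + dt * x * (sigma - rho @ x) + noise * rng.random(3)
        w = int(np.argmax(x))
        if w != seq[-1]:
            seq.append(w)
    return seq

print(winnerless_sequence()[:7])  # [0, 1, 2, 0, 1, 2, 0]
```

At level 2 the same mechanism would switch between the three parsing strategies; modulating sigma from the level above stabilizes or destabilizes individual states, which is how the hierarchy controls the sequence and how decaying attention can remove an unwanted attractor.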
Funding: DFG Heisenberg Fellowship

Acknowledgements

Context-Free Grammars
- "Tad knows Shaq." (default) (beim Graben et al., 2004)
- "Tad knows Shaq is tall." (beim Graben et al., 2004)

Interactive Left-Corner Parsing: garden path (beim Graben et al., 2008; beim Graben & Potthast, 2012)

time | stack | wm | label | operation
1 | Tad | knows | (1) | scan(Shaq)
2 | Tad | knows Shaq | (2) | project
3 | S[VP1] | knows Shaq | (3) | shift(V1)
4 | S[VP1] knows | Shaq | (4) | scan(is)
5 | S[VP1] knows | Shaq is | (9) | project
6 | S[VP1] VP1[NP2] | Shaq is | (19) | shift(NP2)
7 | S[VP1] VP1[NP2] Shaq | is | (20) | scan(tall)
8 | S[VP1] VP1[NP2] Shaq | is tall | (21) | shift(V2)
9 | S[VP1] VP1[NP2] Shaq is | tall | (22) | fail
10 | S[VP1] VP1[NP2] Shaq is | tall | |

Interactive Left-Corner Parsing: limited repair (beim Graben et al., 2008)

time | stack | wm | label | operation
10 | S[VP1] VP1[NP2] Shaq is | tall | (22) | snip
11 | S[VP1] VP1[NP2] | Shaq is tall | (23) | repair [NP2] → [CP]
12 | S[VP1] VP1[CP] | Shaq is tall | (24) | shift(NP2)
13 | S[VP1] VP1[CP] Shaq | is tall | (12) | cont.

Interactive Left-Corner Parsing: reduced relative

time | stack | wm | label | operation
13 | S[VP1] VP1[CP] Shaq | is tall | (12) | project
14 | S[VP1] VP1[CP] CP[VP2] | is tall | (13) | shift(V2)
15 | S[VP1] VP1[CP] CP[VP2] is | tall | (14) | project(4)
16 | S[VP1] VP1[CP] CP[VP2] VP2[A] | tall | (15) | shift(A)
17 | S[VP1] VP1[CP] CP[VP2] VP2[A] A | | (16) | complete
18 | S[VP1] VP1[CP] CP[VP2] VP2 | | (17) | complete
19 | S[VP1] VP1[CP] CP | | (18) | complete
20 | S[VP1] VP1 | | (7) | complete
21 | S | | (8) | accept
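The symbologram representation listed under "Dynamical Automata" maps parser configurations like those in the tables above to points in the unit square by Gödel-encoding the stack and working memory strings as base-b fractions. A minimal sketch of the stack half of such an encoding (the symbol inventory, base, and function names are illustrative assumptions):

```python
from fractions import Fraction

# Hypothetical Goedel numbering of category symbols; any injective map
# into 1..BASE-1 would do.
SYMBOLS = {"S": 1, "VP1": 2, "NP2": 3, "CP": 4, "V1": 5, "V2": 6, "A": 7}
BASE = 8

def encode(stack):
    """Encode a symbol sequence (top of stack first) as a rational in [0, 1):
    x = sum_k g(s_k) * BASE**-(k+1)."""
    x = Fraction(0)
    for k, s in enumerate(stack):
        x += Fraction(SYMBOLS[s], BASE ** (k + 1))
    return x

def push(x, s):
    """Pushing a symbol acts on the encoding as an affine contraction."""
    return Fraction(SYMBOLS[s], BASE) + x / BASE

def pop(x):
    """Popping is the inverse, an expanding (shift-like) map."""
    top = int(x * BASE)       # Goedel number of the topmost symbol
    return top, x * BASE - top

x = encode(["VP1", "S"])      # a stack from the traces above, top first
y = push(x, "CP")
top, rest = pop(y)
assert top == SYMBOLS["CP"] and rest == x
print(y)                      # 273/512
```

Pushes contract the interval and pops expand it, which makes the symbologram a piecewise-affine dynamical system; and since every rational in [0, 1) with a finite base-8 expansion codes some stack, one meets the "every pair of rationals is a representational state" problem noted on the Dynamical Automata slide.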