BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:The acquisition and processing of grammatical structure: insights 
 from deep learning - Roger Levy\, MIT
DTSTART:20220505T140000Z
DTEND:20220505T150000Z
UID:TALK173963@talks.cam.ac.uk
CONTACT:Marinela Parovic
DESCRIPTION:Psycholinguistics and computational linguistics are the two fi
 elds most dedicated to accounting for the computational operations require
 d to understand natural language. Today\, both fields find themselves resp
 onsible for understanding the behaviors and inductive biases of "black-box
 " systems: the human mind and artificial neural-network language models (N
 LMs)\, respectively. Contemporary NLMs can be trained on a human lifetime
 ’s worth of text or more\, and generate text of apparently remarkable gr
 ammaticality and fluency. Here\, we use NLMs to address questions of learn
 ability and processing of natural language syntax. By testing NLMs trained
  on naturalistic corpora as if they were subjects in a psycholinguistics e
 xperiment\, we show that they exhibit a range of subtle behaviors\, includ
 ing embedding-depth tracking and garden-pathing over long stretches of tex
 t\, suggesting representations homologous to incremental syntactic state i
 n human language processing. Strikingly\, these NLMs also learn many gener
 alizations about the long-distance filler-gap dependencies that are a hall
 mark of natural language syntax\, perhaps most surprisingly many "island" 
 constraints. I conclude with comments on the long-standing question of w
 hether the departures of NLMs from the predictions of the "competence" g
 rammars developed in generative linguistics might provide a "performance
 " account of human language processing: by and large\, they don’t.
LOCATION:https://cam-ac-uk.zoom.us/j/97599459216?pwd=QTRsOWZCOXRTREVnbTJBd
 XVpOXFvdz09
END:VEVENT
END:VCALENDAR
