BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Compositionality decomposed: how do neural networks generalise? - 
 Dieuwke Hupkes\, Facebook AI Research
DTSTART:20210211T130000Z
DTEND:20210211T140000Z
UID:TALK157291@talks.cam.ac.uk
CONTACT:Agnieszka Slowik
DESCRIPTION:In the past years\, neural networks have taken over the state 
 of the art in many fields in natural language processing. Despite their su
 ccesses\, such models are frequently argued to not be capable of modelling
  the types of structures required to adequately model natural language. In
  particular\, deep neural networks are claimed to be relying largely on st
 atistical patterns in the data instead of inferring compositional solution
 s. Since it is largely unknown what types of composition functions neural 
 network models learn and how they achieve their successes on NLP tasks\, i
 t is difficult to refute or prove such claims. As a consequence\, even if 
 an experiment convincingly shows that an architecture is not able to perfo
 rm a particular compositional generalisation\, it is often unclear what th
 is means\, and how this result relates to linguistic theories about compos
 itionality.  In this talk\, I present a series of experiments concerning a
 rtificial neural networks that aim to take apart different aspects of comp
 ositionality that are grounded in linguistic and psycholinguistic literatu
 re about this topic. In particular\, I focus on the extent to which differ
 ent architectures are systematic\, productive\, under what conditions they
  support substitution of meanings\, whether their representations are loca
 lly consistent and whether they overgeneralise when faced with exceptions 
 to rules. With this study\, we aim to shine some light on the strengths an
 d weaknesses of three different deep learning architectures that are popul
 ar for sequential tasks: LSTMs\, Convolution-based models and Transformers
 \, as well as aspire to reconnect connectionism and symbolism in the discu
 ssion about compositionality and pave the way for modelling approaches tha
 t better integrate the two.
LOCATION:Remote
END:VEVENT
END:VCALENDAR
