BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Predicting Rich Linguistic Structure with Neural Networks - Jan Bu
 ys\, University of Oxford
DTSTART:20170613T130000Z
DTEND:20170613T140000Z
UID:TALK72634@talks.cam.ac.uk
CONTACT:Amandla Mabona
DESCRIPTION:The recent success of neural networks has raised questions abo
 ut the role of linguistic structure in NLP models\, but also opens up new 
 opportunities. In this talk I’ll show how neural networks enable us to p
 redict richer linguistic representations than previously feasible\, and ex
 plore how to incorporate linguistically informed structural biases in lang
 uage generation models. First\, I’ll present a robust end-to-end parser 
 for Minimal Recursion Semantics (MRS)\, a framework for compositional sema
 ntics implemented in high-precision computational grammars. This task pres
 ents several challenges for structure prediction models\; I’ll show how 
 they can be addressed through generalizing transition-based parsing in the
  framework of encoder-decoder recurrent neural networks\, and using pointe
 r networks. Results show that incorporating structure into the neural arch
 itecture improves performance over attention-based baselines by a large ma
 rgin on both MRS and Abstract Meaning Representation parsing. Second\, I
 ’ll present a generative dependency parser which also serves as a neural
  syntactic language model. A feed-forward architecture and a decoding algo
 rithm based on particle filtering enable efficient and accurate inference\
 , while unsupervised fine-tuning improves language modelling perplexity. F
 inally\, I’ll present a sequence-to-sequence model in which the alignmen
 t between the input and output is a latent variable marginalized through d
 ynamic programming. The model is structured to learn mostly monotone align
 ments\, which makes it applicable to many transduction tasks while enablin
 g online inference.
LOCATION:FW11\, Computer Laboratory
END:VEVENT
END:VCALENDAR
