BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Learning Hierarchical Word and Sentence Representations - Dani Yog
 atama\, DeepMind
DTSTART:20170303T120000Z
DTEND:20170303T130000Z
UID:TALK69218@talks.cam.ac.uk
CONTACT:Kris Cao
DESCRIPTION:Languages encode meaning in terms of hierarchical\, nested
 structures. For example\, we often find a coarse-to-fine organization
 of words’ meanings in the field of lexical semantics (e.g.\, WordNet)\;
 and relationships among words in a sentence are largely organized in
 terms of latent nested structures (Chomsky\, 1957).\n\nIn this talk\, I
 will first discuss how to incorporate hierarchical prior knowledge
 into a word representation model. I will show how to use regularizers
 to encourage hierarchical organization of the latent dimensions of
 vector-space word embeddings.\n\nI will then talk about a reinforcement
 learning method to learn tree-structured neural networks for computing
 representations of natural language sentences. In contrast to
 sequential RNNs\, which ignore tree structure\, our model generates a
 latent tree for each sentence\, using a reward from a semantic
 interpretation task to structure the composition syntactically. I will
 show that learning how words compose to form sentence meanings leads
 to better performance on various downstream tasks.
LOCATION:FW26\, Computer Laboratory
END:VEVENT
END:VCALENDAR
