BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Injecting Logical Background Knowledge into Embeddings for Relatio
 n Extraction - Tim Rocktäschel\, UCL
DTSTART:20150508T110000Z
DTEND:20150508T120000Z
UID:TALK58360@talks.cam.ac.uk
CONTACT:Tamara Polajnar
DESCRIPTION:Matrix factorization approaches to relation extraction provide
  several attractive features: they support distant supervision\, handle
  open schemas\, and leverage unlabeled data. Such models learn dense
  fixed-length vector representations (embeddings) of binary relations and
  entity-pairs. Inference of an unseen factual statement amounts to a
  simple\, efficient dot product between the corresponding relation and
  entity-pair vectors\, making these models highly scalable. However\, it
  is unclear to what extent models based on distributed representations
  support complex reasoning as enabled\, for instance\, by symbolic
  representations such as first-order logic. Moreover\, distributed
  representations are hard to debug\, and it is not clear how symbolic
  background knowledge can be incorporated into such models.\n\nRule-based
  extractors\, on the other hand\, can be easily extended to novel
  relations and improved for existing but inaccurate relations through
  first-order formulae that capture auxiliary domain knowledge. However\,
  a large set of such formulae is usually necessary to achieve
  generalization.\n\nIn this talk\, I will introduce a paradigm for
  learning low-dimensional embeddings of entity-pairs and relations that
  combines the advantages of matrix factorization with first-order logic
  domain knowledge. Specifically\, I will introduce a novel training
  algorithm to jointly optimize over factual and first-order logic
  information\, and talk about our ongoing research in this direction.
LOCATION:FW26\, Computer Laboratory
END:VEVENT
END:VCALENDAR
