Neural text generation from rich semantic representations
- Speaker: Michael Goodman (University of Washington)
- Date & Time: Tuesday 16 July 2019, 12:20-12:35
- Venue: FW11, William Gates Building
Abstract
We propose neural models to generate high-quality text from structured representations based on Minimal Recursion Semantics (MRS). MRS is a rich semantic representation that encodes more precise semantic detail than other representations such as Abstract Meaning Representation (AMR). We show that a sequence-to-sequence model that maps a linearization of Dependency MRS, a graph-based representation of MRS, to English text can achieve a BLEU score of 66.11 when trained on gold data. The performance can be improved further using a high-precision, broad-coverage grammar-based parser to generate a large silver training corpus, achieving a final BLEU score of 77.17 on the full test set, and 83.37 on the subset of test data most closely matching the silver data domain. Our results suggest that MRS-based representations are a good choice for applications that need both structured semantics and the ability to produce natural language text as output.
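To give a flavour of the pipeline described above, the sketch below shows one way a small dependency-style semantic graph could be linearized into a token sequence for a sequence-to-sequence model. The graph encoding, node labels, and bracketed traversal format here are illustrative assumptions for exposition, not the talk's actual DMRS linearization scheme.

```python
# Hypothetical sketch: depth-first linearization of a small
# dependency-style semantic graph into a flat token sequence,
# the kind of input a seq2seq generator would consume.
# The graph format and labels are illustrative, not the talk's own.

def linearize(graph, root):
    """Linearize graph[node] = (predicate_label, [(role, child), ...])
    into a bracketed token string by depth-first traversal."""
    label, edges = graph[root]
    parts = [label]
    for role, child in edges:
        parts.append(f":{role}(")
        parts.append(linearize(graph, child))
        parts.append(")")
    return " ".join(parts)

# Toy graph loosely evoking "the dog barks": a verbal predicate with
# an ARG1 link to a noun, which is bound by a quantifier.
graph = {
    "e1": ("_bark_v_1", [("ARG1", "x1")]),
    "x1": ("_dog_n_1", [("BV", "q1")]),
    "q1": ("_the_q", []),
}

print(linearize(graph, "e1"))
# -> _bark_v_1 :ARG1( _dog_n_1 :BV( _the_q ) )
```

In the talk's setting, sequences like this (over gold or parser-produced silver graphs) would form the source side of the seq2seq training pairs, with the English sentence as the target.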
Series This talk is part of the DELPH-IN Summit Open Session series.