
Neural text generation from rich semantic representations


If you have a question about this talk, please contact Guy Edward Toh Emerson.

We propose neural models to generate high-quality text from structured representations based on Minimal Recursion Semantics (MRS). MRS is a rich semantic representation that encodes more precise semantic detail than other representations such as Abstract Meaning Representation (AMR). We show that a sequence-to-sequence model that maps a linearization of Dependency MRS, a graph-based representation of MRS, to English text can achieve a BLEU score of 66.11 when trained on gold data. The performance can be improved further using a high-precision, broad-coverage grammar-based parser to generate a large silver training corpus, achieving a final BLEU score of 77.17 on the full test set, and 83.37 on the subset of test data most closely matching the silver data domain. Our results suggest that MRS-based representations are a good choice for applications that need both structured semantics and the ability to produce natural language text as output.
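To illustrate the kind of input such a sequence-to-sequence model consumes, the sketch below linearizes a toy Dependency MRS-style graph into a token sequence. The node and link classes, the bracketed depth-first traversal, and the example predicates are illustrative assumptions for this page, not the exact linearization scheme described in the talk.

```python
# Minimal sketch (assumed, not the talk's exact scheme): linearize a toy
# DMRS-style graph into a flat token sequence that a seq2seq model could
# take as its source side.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class DmrsNode:
    predicate: str                                   # e.g. "_bark_v_1"
    links: List[Tuple[str, "DmrsNode"]] = field(default_factory=list)
    # each link is an (argument label, target node) pair, e.g. ("ARG1/NEQ", dog)


def linearise(node: DmrsNode) -> List[str]:
    """Depth-first linearization with bracketed argument structure."""
    tokens = [node.predicate]
    for label, target in node.links:
        tokens += [label, "("] + linearise(target) + [")"]
    return tokens


if __name__ == "__main__":
    # Toy graph for "the dog barks" (quantifier omitted for brevity).
    dog = DmrsNode("_dog_n_1")
    bark = DmrsNode("_bark_v_1", links=[("ARG1/NEQ", dog)])
    print(" ".join(linearise(bark)))
    # -> _bark_v_1 ARG1/NEQ ( _dog_n_1 )
```

In a setup like the one described, token sequences of this form would serve as the source side of a standard encoder-decoder model whose target side is the English sentence.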

This talk is part of the DELPH-IN Summit Open Session series.
