University of Cambridge > Talks.cam > Language Technology Lab Seminars > Imitation learning for language generation
Imitation learning for language generation
If you have a question about this talk, please contact Dimitri Kartsaklis.

Natural language generation (NLG) is the task of generating natural language from a meaning representation. Rule-based approaches require domain-specific, manually constructed linguistic resources, while most corpus-based approaches rely on aligned training data and/or phrase templates; the latter are needed to restrict the search space of the structured prediction task defined by unaligned datasets. In this talk we will discuss the use of imitation learning for structured prediction, which learns an incremental model that handles the large search space without explicitly enumerating it. We will show how we adapted the Locally Optimal Learning to Search framework (Chang et al., 2015), which allows us to train against non-decomposable loss functions such as BLEU or ROUGE while not assuming gold-standard alignments. Furthermore, we will present an analysis of the datasets that examines common issues with NLG evaluation.

This talk is part of the Language Technology Lab Seminars series.
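To make the idea concrete, here is a minimal toy sketch of a LOLS-style imitation-learning loop for incremental generation. This is an illustrative assumption, not the speakers' actual system: the vocabulary, the tabular "policy", the update rule, and the unigram-overlap loss (standing in for BLEU/ROUGE, which are likewise non-decomposable over individual actions) are all invented for the example. At each step it evaluates every one-step deviation by rolling out to a complete sequence and scoring it with the final loss, so the full output space is never enumerated.

```python
# Toy LOLS-style imitation learning for incremental sequence generation.
# All names (VOCAB, REFERENCE, policy, sequence_loss) are illustrative
# assumptions, not the actual method presented in the talk.

VOCAB = ["the", "cat", "sat", "<eos>"]
REFERENCE = ["the", "cat", "sat", "<eos>"]
HORIZON = len(REFERENCE)

def sequence_loss(prediction, reference):
    """Toy non-decomposable loss: 1 - unigram overlap (a BLEU/ROUGE stand-in).
    It scores only complete sequences, not individual actions."""
    if not prediction:
        return 1.0
    overlap = len(set(prediction) & set(reference))
    return 1.0 - overlap / max(len(set(reference)), 1)

# A trivial cost-sensitive "policy": per time step, a table of action costs.
policy = {t: {a: 0.0 for a in VOCAB} for t in range(HORIZON)}

def roll_out(prefix, action):
    """Complete the sequence greedily with the current policy after `action`."""
    seq = prefix + [action]
    for t in range(len(seq), HORIZON):
        if seq[-1] == "<eos>":
            break
        seq.append(min(policy[t], key=policy[t].get))
    return seq

for epoch in range(20):
    prefix = []  # roll-in with the learned policy
    for t in range(HORIZON):
        # One-step deviations: try every action, score the completed sequence.
        costs = {a: sequence_loss(roll_out(prefix, a), REFERENCE) for a in VOCAB}
        best = min(costs.values())
        # Cost-sensitive update: move each action's cost toward its regret.
        for a in VOCAB:
            policy[t][a] += 0.5 * ((costs[a] - best) - policy[t][a])
        prefix.append(min(policy[t], key=policy[t].get))

print(prefix)  # a complete sequence achieving zero toy loss
```

Because the roll-outs are scored with the sequence-level loss itself, no gold-standard alignment between the meaning representation and the output tokens is needed; the per-action costs are derived entirely from completed sequences.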