Sentence Generation using a Dynamic Canvas
If you have a question about this talk, please contact Dimitri Kartsaklis.
I will discuss the Attentive Unsupervised Text Writer (AUTR), a word-level generative model for natural language. It uses a recurrent neural network with a dynamic attention and canvas memory mechanism to iteratively construct sentences. By inspecting the state of the canvas at intermediate stages, along with where the model places its attention, we gain insight into how it constructs sentences. We demonstrate that AUTR learns a useful latent representation for each sentence and achieves competitive log-likelihood lower bounds whilst being computationally efficient. It is effective at generating and reconstructing sentences, as well as imputing missing words.
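To give a feel for the canvas mechanism the abstract describes, here is a minimal, hypothetical sketch of iterative attention-weighted writing to a canvas. This is not the authors' implementation: the slot count, gating rule, and random stand-in for the RNN state are illustrative assumptions only.

```python
import numpy as np

# Toy sketch of canvas writing (hypothetical; not the AUTR authors' code).
# A "canvas" of T slots (one per output word position) is revised over
# several steps: each step attends over the slots and writes a gated
# update, and words are read off the final canvas.

rng = np.random.default_rng(0)
T, D, V = 6, 8, 20          # slots (words), canvas dimension, vocab size

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

canvas = np.zeros((T, D))
W_attn = rng.normal(size=(D,))        # scores each slot from its content
W_out  = rng.normal(size=(D, V))      # reads word logits off the canvas

for step in range(5):
    h = rng.normal(size=(D,))          # stand-in for the RNN hidden state
    attn = softmax(canvas @ W_attn)    # where to write on this step
    gate = 1.0 / (1 + step)            # later steps write more gently
    canvas = canvas + gate * np.outer(attn, h)  # attention-weighted write

logits = canvas @ W_out
words = logits.argmax(axis=-1)         # one word index per canvas slot
print(words.shape)
```

Intermediate values of `canvas` and `attn` are what one would visualise to see how the sentence is built up step by step.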
This talk is part of the Language Technology Lab Seminars series.