Machine Translation
If you have a question about this talk, please contact Matthew Ireland.

Sequence prediction is a popular machine learning task in which a model predicts future symbols from a previously observed sequence. This makes it well suited to NLP tasks such as machine translation, where the ordering of the symbols (words) affects the meaning of the text. In this talk we will look at how sequence-to-sequence tasks can be tackled using the Encoder-Decoder architecture. This approach involves two recurrent neural networks: one to encode the input sequence (the Encoder) and another to decode the encoded representation into the target sequence (the Decoder). We will then consider how this model can be applied to machine translation, as well as some other common use cases.

This talk is part of the Churchill CompSci Talks series.
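As a rough illustration of the two-network setup described in the abstract (not code from the talk itself), the sketch below builds an Encoder and a Decoder as separate recurrent networks in PyTorch. The use of GRUs, the vocabulary sizes, and all layer dimensions are illustrative assumptions.

```python
# Minimal Encoder-Decoder sketch: the Encoder compresses the source sentence
# into a fixed-size hidden state, and the Decoder generates the target
# sentence conditioned on that state. Hyperparameters are made up.
import torch
import torch.nn as nn


class Encoder(nn.Module):
    def __init__(self, src_vocab_size: int, emb_dim: int = 64, hid_dim: int = 128):
        super().__init__()
        self.embedding = nn.Embedding(src_vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                 # src: (batch, src_len) token ids
        embedded = self.embedding(src)      # (batch, src_len, emb_dim)
        _, hidden = self.rnn(embedded)      # hidden: (1, batch, hid_dim)
        return hidden                       # fixed-size summary of the source


class Decoder(nn.Module):
    def __init__(self, tgt_vocab_size: int, emb_dim: int = 64, hid_dim: int = 128):
        super().__init__()
        self.embedding = nn.Embedding(tgt_vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab_size)

    def forward(self, tgt_in, hidden):      # tgt_in: (batch, tgt_len) teacher-forced inputs
        embedded = self.embedding(tgt_in)
        output, hidden = self.rnn(embedded, hidden)
        return self.out(output), hidden     # logits over the target vocabulary


# Usage sketch: encode a source batch, then decode conditioned on its summary.
encoder = Encoder(src_vocab_size=1000)
decoder = Decoder(tgt_vocab_size=1200)
src = torch.randint(0, 1000, (8, 12))       # 8 sentences, 12 source tokens each
tgt_in = torch.randint(0, 1200, (8, 15))    # shifted target tokens (teacher forcing)
logits, _ = decoder(tgt_in, encoder(src))   # logits: (8, 15, 1200)
```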