Attention in sequence-to-sequence models
If you have a question about this talk, please contact Matthew Ireland.

Sequence-to-sequence models map sequential data from one domain to another, and are at the core of machine translation (MT) technologies such as Google Translate. In this talk, we trace the major milestones of neural machine translation and introduce the attention mechanism, which forms the basis of virtually all state-of-the-art machine translation techniques in use today.

This talk is part of the Churchill CompSci Talks series.
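As a taste of the talk's subject, the core attention step can be sketched as scoring a decoder state against each encoder state, normalising the scores with a softmax, and taking the weighted sum of the encoder states. The following is a minimal illustration in plain Python, not code from the talk; the function names and toy vectors are made up for the example.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(query, keys, values):
    # Score each encoder state (key) against the decoder state (query),
    # normalise the scores, and return the weighted sum of the values
    # (the "context vector") together with the attention weights.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(dim)]
    return context, weights

# Toy example: three 2-dimensional encoder states.
keys = values = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [1.0, 0.0]
context, weights = dot_product_attention(query, keys, values)
```

The weights sum to one, and encoder states that align with the query receive proportionally more of the context vector, which is what lets the decoder "attend" to the relevant parts of the source sentence.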