How much linguistics is needed for NLP?
If you have a question about this talk, please contact Kris Cao.

Many problems in Natural Language Processing, from Machine Translation to Parsing, can be viewed as transduction tasks. Recently, sequence-to-sequence mapping approaches using recurrent networks and parallel corpora have shown themselves capable of learning fairly complex transductions without the need for heavy (or any) annotation or alignment data. Traditional linguistically motivated features such as syntactic types and dependencies are entirely latent in such models, reducing the need for expert linguistic knowledge when designing new NLP solutions. In this talk, I will discuss the strengths and weaknesses of such approaches, before presenting some ameliorations based on attention mechanisms and working-memory enhancements to standard recurrent neural networks.

This talk is part of the NLIP Seminar Series.
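As a rough illustration of the attention mechanisms mentioned in the abstract, here is a minimal NumPy sketch of additive ("Bahdanau-style") attention at a single decoder step. It is not material from the talk: the variable names, shapes, and randomly initialised parameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T, d = 6, 8                           # source length and hidden size (assumed)
enc_states = rng.normal(size=(T, d))  # encoder hidden states h_1..h_T
dec_state = rng.normal(size=(d,))     # current decoder state s_t

# Parameters of the attention module (randomly initialised for the sketch;
# in a real model these would be learned jointly with the networks).
W_enc = rng.normal(size=(d, d))
W_dec = rng.normal(size=(d, d))
v = rng.normal(size=(d,))

# Alignment scores: e_i = v . tanh(W_enc h_i + W_dec s_t)
scores = np.tanh(enc_states @ W_enc.T + dec_state @ W_dec.T) @ v

# Softmax over source positions -> attention weights summing to 1.
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# Context vector: attention-weighted sum of encoder states,
# fed into the decoder when predicting the next output symbol.
context = weights @ enc_states

print("weights:", np.round(weights, 3))
print("context shape:", context.shape)
```

The point the sketch captures is that the decoder no longer has to compress the whole source sentence into a single fixed vector: at each step it re-weights the encoder states and reads a fresh context vector from them.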