
How much linguistics is needed for NLP?


If you have a question about this talk, please contact Kris Cao.

Many problems in Natural Language Processing, from Machine Translation to Parsing, can be viewed as transduction tasks. Recently, sequence-to-sequence approaches using recurrent networks trained on parallel corpora have proven capable of learning fairly complex transductions without heavy (or any) annotation or alignment data. Traditional linguistically-motivated features such as syntactic types and dependencies are entirely latent in such models, reducing the need for expert linguistic knowledge when designing new solutions in NLP. In this talk, I will discuss the strengths and weaknesses of such approaches, before presenting some ameliorations based on attention mechanisms and working-memory enhancements to standard recurrent neural networks.
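The attention mechanisms mentioned above let a decoder compute a weighted summary of the encoder's hidden states at each step, rather than relying on a single fixed-length vector. As a rough illustration (not the specific model presented in the talk), a minimal dot-product attention step might look like this; all names and dimensions here are invented for the example:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dot_product_attention(decoder_state, encoder_states):
    """Return a context vector: an attention-weighted sum of encoder states.

    decoder_state:  (d,)   current decoder hidden state
    encoder_states: (T, d) one hidden state per source timestep
    """
    scores = encoder_states @ decoder_state   # (T,) similarity of each source position
    weights = softmax(scores)                 # (T,) attention distribution, sums to 1
    context = weights @ encoder_states        # (d,) weighted summary of the source
    return context, weights

# Toy example: 4 source timesteps, hidden size 3
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))   # pretend encoder outputs
s = rng.normal(size=3)        # pretend decoder state
context, weights = dot_product_attention(s, H)
```

Because the weights are a distribution over source positions, alignments that older systems had to learn from explicit alignment data emerge here as a by-product of end-to-end training.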

This talk is part of the NLIP Seminar Series.





© 2006-2024, University of Cambridge.