
Distributional semantics and beyond: Composition, generation and alignment


If you have a question about this talk, please contact Tamara Polajnar.

Distributional methods for semantics approximate the meaning of linguistic expressions with vectors that summarize the contexts in which those expressions occur in large samples of text. This has been a very successful approach to lexical semantics, where the semantic relatedness of words is assessed by comparing their vectors. In this talk, I will present work on extending distributional semantics to increase the range of semantic phenomena it can account for. In recent years, distributional semantic models have been extended to phrases and sentences through composition operations. I will motivate and explore the mirror problem of generation: given a distributional vector representing some meaning, how can we generate the phrase that best expresses that meaning? In the second part of my talk, I will introduce the problem of structure-preserving alignment of distributional semantic spaces, discussing potential applications to bilingual lexicon acquisition and language-vision interfaces.
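The core idea in the abstract, representing a word by counts of its surrounding context words and comparing words via vector similarity, can be illustrated with a minimal sketch. The toy corpus, the window size, and additive composition are illustrative assumptions, not the speaker's actual models:

```python
from collections import Counter

# Toy corpus; real distributional models are built from large text samples.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

def context_vector(target, sentences, window=2):
    """Count words co-occurring with `target` within `window` positions."""
    counts = Counter()
    for sentence in sentences:
        tokens = sentence.split()
        for i, tok in enumerate(tokens):
            if tok != target:
                continue
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[tokens[j]] += 1
    return counts

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in u if w in v)
    norm = lambda c: sum(x * x for x in c.values()) ** 0.5
    return dot / (norm(u) * norm(v)) if u and v else 0.0

cat, dog, mat = (context_vector(w, corpus) for w in ("cat", "dog", "mat"))

# Words occurring in similar contexts ("cat"/"dog") score higher than
# words in dissimilar contexts ("cat"/"mat").
print(cosine(cat, dog), cosine(cat, mat))

# One simple composition operation: a phrase vector as the elementwise
# sum of its word vectors (Counter supports + for this).
phrase = cat + mat
```

Additive composition is only the simplest of the composition operations mentioned in the abstract; the generation problem the talk discusses runs this mapping in reverse, from a composed vector back to a phrase.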

This talk is part of the NLIP Seminar Series.


© 2006-2019 Talks.cam, University of Cambridge.