
Consensus finding, exponential models and infinite rankings

If you have a question about this talk, please contact Zoubin Ghahramani.

This talk is concerned with summarizing, by means of statistical models, data that expresses preferences. Such data is typically a set of rankings of n items by a panel of experts; the simplest summary is the "consensus ranking", or "centroid", of the set of rankings. Problems of this kind arise in many tasks, from combining voter preferences to boosting of search engines.
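
As a rough illustration of the consensus-finding problem (a brute-force sketch for intuition only, not the algorithm presented in the talk), the following Python snippet searches for the ranking that minimizes the total Kendall tau distance to a small set of expert rankings. The function names and the toy data are hypothetical, and the exhaustive search is only feasible for very small n.

    from itertools import combinations, permutations

    def kendall_tau(a, b):
        """Number of item pairs on which rankings a and b disagree."""
        pos_b = {item: i for i, item in enumerate(b)}
        return sum(
            1
            for x, y in combinations(a, 2)
            if pos_b[x] > pos_b[y]  # a puts x before y, b puts y before x
        )

    def consensus(rankings):
        """Exhaustively search all permutations for the centroid ranking."""
        items = rankings[0]
        return min(
            permutations(items),
            key=lambda pi: sum(kendall_tau(pi, r) for r in rankings),
        )

    # Toy example: three experts rank four items.
    experts = [
        ("a", "b", "c", "d"),
        ("b", "a", "c", "d"),
        ("a", "c", "b", "d"),
    ]
    print(consensus(experts))  # ('a', 'b', 'c', 'd'), total distance 2
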

We study the problem in its more general form of estimating a parametric model known as the Generalized Mallows (GM) model. I will present an exact estimation algorithm that is non-polynomial in theory but highly effective in practice compared with existing algorithms. From a statistical point of view, we show that the GM model is an exponential family, and we introduce the conjugate prior for this model class.
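
For reference, the Generalized Mallows model in its standard form (following Fligner and Verducci's formulation; the exact parametrization used in the talk may differ) assigns to a ranking \pi, given a central ranking \sigma and dispersion parameters \theta_1, \dots, \theta_{n-1}:

    % V_j(\pi \mid \sigma) counts the items that \sigma ranks below its
    % j-th item but \pi ranks above it, so that
    % \sum_j V_j(\pi \mid \sigma) is the Kendall tau distance.
    P(\pi \mid \sigma, \theta)
      = \frac{1}{\psi(\theta)}
        \exp\Big( -\sum_{j=1}^{n-1} \theta_j \, V_j(\pi \mid \sigma) \Big),
    \qquad
    \psi(\theta) = \prod_{j=1}^{n-1} \sum_{r=0}^{n-j} e^{-\theta_j r}.

Written this way, the stage-wise counts V_j act as sufficient statistics with natural parameters -\theta_j, which is the sense in which the GM model forms an exponential family.
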

Then we introduce the infinite GM model, corresponding to “rankings” over an infinite set of items, and show that this model is both elegant and of practical significance. Finally, the talk will touch upon the subject of multimodal distributions and clustering.

Joint work with: Bhushan Mandhani, Le Bao, Kapil Phadnis, Arthur Patterson and Jeff Bilmes

This talk is part of the Machine Learning @ CUED series.

