Determinantal Point Processes (DPPs) are a family of probabilistic models that exhibit repulsive behavior, and lend themselves naturally to many tasks in machine learning (such as recommendation systems) where returning a diverse set of objects is important. While there are fast algorithms for sampling, marginalization and conditioning, much less is known about learning the parameters of a DPP. In this talk, I will present recent results related to this problem, specifically:

- Rates of convergence for the maximum likelihood estimator: by studying the local and global geometry of the expected log-likelihood function, we are able to establish rates of convergence for the MLE and give a complete characterization of the cases where these are parametric. We also give a partial description of the critical points of the expected log-likelihood.
- Optimal rates of convergence for this problem: these are achievable by the method of moments and are governed by a combinatorial parameter, which we call the cycle sparsity.
- A fast combinatorial algorithm to implement the method of moments efficiently.
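As a minimal illustration of the model being learned (not the estimators from the talk), the following sketch computes subset probabilities under an L-ensemble DPP, P(S) = det(L_S) / det(L + I), for a small hypothetical kernel, and checks that the probabilities over all subsets sum to one:

```python
import numpy as np
from itertools import combinations

# Hypothetical 3-item L-ensemble kernel (symmetric PSD); values are illustrative only.
L = np.array([[1.0, 0.4, 0.1],
              [0.4, 1.0, 0.3],
              [0.1, 0.3, 1.0]])

n = L.shape[0]
Z = np.linalg.det(L + np.eye(n))  # normalization constant det(L + I)

def dpp_prob(S):
    """P(S) = det(L_S) / det(L + I); the determinant of the empty submatrix is 1."""
    idx = list(S)
    if not idx:
        return 1.0 / Z
    sub = L[np.ix_(idx, idx)]  # principal submatrix indexed by S
    return np.linalg.det(sub) / Z

# Probabilities over all 2^n subsets sum to 1 (a sanity check of the normalization).
total = sum(dpp_prob(S) for r in range(n + 1) for S in combinations(range(n), r))
print(round(total, 6))  # 1.0
```

Note how off-diagonal similarity (e.g. L[0, 1] = 0.4) lowers the probability of co-selecting similar items relative to independent selection; this is the repulsive behavior that makes DPPs suited to diverse recommendation.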

The necessary background on DPPs will be given in the talk.

Joint work with Victor-Emmanuel Brunel (M.I.T.), Ankur Moitra (M.I.T.) and John Urschel (M.I.T.).

Location: Seminar Room 1, Newton Institute
Contact: info@newton.ac.uk