Over the past decade, a rich interaction has grown up between statistical physics and statistical inference. In particular, physics often predicts phase transitions where, if a data set becomes too sparse or too noisy, it suddenly becomes impossible to find its underlying structure, or even to distinguish it from a “null model” with no structure at all. For instance, in the community detection problem in networks, there is a transition below which the vertices cannot be labeled better than chance, and the network cannot be distinguished from an Erdős–Rényi random graph. Similarly, in the clustering problem in Gaussian mixture models, it becomes impossible to find the clusters, or even to tell whether clusters exist, i.e., to distinguish the data from a single Gaussian.
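As a minimal illustration of the network version of this question, the sketch below generates a two-community stochastic block model and an Erdős–Rényi graph of the same average density, and compares their adjacency spectra. All parameters (n = 200, p_in = 0.30, p_out = 0.05) are illustrative choices placed well above the detection threshold, not values from the talk; it uses only NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # vertices, two hidden communities of size n/2
p_in, p_out = 0.30, 0.05  # illustrative within/between edge probabilities

# Planted partition: +1 / -1 community labels
labels = np.array([1] * (n // 2) + [-1] * (n // 2))
prob = np.where(np.outer(labels, labels) > 0, p_in, p_out)
upper = np.triu(rng.random((n, n)) < prob, k=1)
A_sbm = (upper | upper.T).astype(float)

# Erdős–Rényi null model with the same expected edge density
p_avg = (p_in + p_out) / 2
upper = np.triu(rng.random((n, n)) < p_avg, k=1)
A_er = (upper | upper.T).astype(float)

# Well above the threshold, the SBM adjacency matrix has an outlier
# second eigenvalue near n*(p_in - p_out)/2; the ER spectrum does not.
eig_sbm = np.sort(np.linalg.eigvalsh(A_sbm))[::-1]
eig_er = np.sort(np.linalg.eigvalsh(A_er))[::-1]

# The eigenvector attached to that outlier eigenvalue recovers the
# communities: its sign pattern correlates with the planted labels.
v = np.linalg.eigh(A_sbm)[1][:, -2]
overlap = abs(np.mean(np.sign(v) * labels))
print(eig_sbm[1], eig_er[1], overlap)
```

In the sparse regime the talk concerns, this naive adjacency-spectrum approach degrades, which is what motivates the message-passing and non-backtracking spectral methods mentioned below.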

Many of the physics predictions have been made rigorous, but there is still a very lively interchange between the fields, both about these transitions and about asymptotically optimal algorithms. In particular, while efficient message-passing and spectral algorithms are known that succeed down to the so-called Kesten–Stigum bound, in a number of problems we believe that it is information-theoretically possible (but perhaps not computationally feasible) to go further. I will give a friendly introduction to this field, and present some new results proving this conjecture.
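For concreteness, in the symmetric two-group stochastic block model with average within- and between-group degrees c_in and c_out, the Kesten–Stigum bound takes a simple closed form:

```latex
% Kesten--Stigum condition for the two-group symmetric block model:
% efficient (message-passing / spectral) detection is possible when
(c_{\mathrm{in}} - c_{\mathrm{out}})^2 \;>\; 2\,(c_{\mathrm{in}} + c_{\mathrm{out}})
```

For two groups this coincides with the information-theoretic threshold, while for larger numbers of groups detection is believed, and in part now proven, to remain information-theoretically possible below it, which is the regime the results in this talk address.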

This is joint work with Jess Banks, Joe Neeman, Praneeth Netrapalli, and Jiaming Xu.

Location: Seminar Room 1, Newton Institute