Probabilistic programs and the computability and complexity of Bayesian reasoning
If you have a question about this talk, please contact Microsoft Research Cambridge Talks Admins. This event may be recorded and made available internally or externally via http://research.microsoft.com. Microsoft will own the copyright of any recordings made. If you do not wish to have your image/voice recorded, please consider this before attending.

When is Bayesian reasoning possible, and when is it efficient? There has been a recent surge of interest in probabilistic programming languages and general-purpose inference engines. By providing languages for specifying large, modular, potentially recursively-defined probabilistic models, these systems make it possible to tackle very complex inductive inference problems, opening up new avenues for AI and applications.

We present recent results elucidating aspects of the theoretical underpinnings of probabilistic programming systems, especially those whose underlying probabilistic languages have, in a technical sense, the same expressive power as probabilistic Turing machines. This algorithmic perspective on probability distributions, combined with new general-purpose Monte Carlo inference strategies, presents a host of challenges and opportunities for theoretical computer science. In particular, we investigate the class of computable probability distributions (distributions for which there exists a probabilistic program for generating exact samples) and explore the fundamental limitations on performing Bayesian reasoning with probabilistic program representations. We ask: when does a conditional distribution have such a representation, and when can we compute it? In addition to highlighting some positive results demonstrating that Bayesian reasoning is possible when the prior distribution has additional structure such as exchangeability or noise, we prove the nonexistence of algorithms (even inefficient ones) for Bayesian reasoning in the general case. We close with some complexity-theoretic aspects.
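To make the abstract's notions concrete, here is a minimal sketch (not from the talk itself; all names are illustrative) of what "a probabilistic program for generating exact samples" and "Bayesian reasoning by conditioning" mean. The prior is an ordinary program that makes random choices; conditioning on an observation is sketched here via simple rejection sampling, one of the general-purpose Monte Carlo strategies such systems build on.

```python
import random

def flip(p):
    # Bernoulli draw: a basic primitive of a probabilistic program
    return random.random() < p

def prior():
    # A tiny probabilistic program: sample a coin's bias uniformly
    # from three hypotheses, then sample one flip of that coin.
    bias = random.choice([0.1, 0.5, 0.9])
    heads = flip(bias)
    return bias, heads

def posterior_prob_high_bias(num_samples=100_000):
    # Bayesian reasoning by rejection sampling: run the prior program
    # many times and keep only the runs consistent with the observed
    # data (heads == True). The kept runs are exact samples from the
    # conditional distribution P(bias | heads).
    kept = [bias for bias, heads in (prior() for _ in range(num_samples)) if heads]
    # Estimate the posterior probability that the bias is 0.9
    return kept.count(0.9) / len(kept)
```

By Bayes' rule the exact posterior P(bias = 0.9 | heads) is 0.9 / (0.1 + 0.5 + 0.9) = 0.6, so the estimate should be close to 0.6. The talk's negative results concern priors where no such sampler for the conditional distribution exists at all, which rejection sampling on a finite discrete model like this one does not exhibit.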
This talk is part of the Microsoft Research Machine Learning and Perception Seminars series.