Bayesian Computation Without Tears: Probabilistic Programming and Universal Stochastic Inference
Latent variable modeling and Bayesian inference are appealing in theory: they provide a unified mathematical framework for solving a wide range of machine learning problems. In practice, however, they are often difficult to apply effectively. Accurate inference in even simple models can seem computationally intractable, while more realistic models are difficult to even write down precisely. In this talk, I will introduce new probabilistic programming technology that alleviates many of these difficulties. Unlike graphical modeling, which marries statistics with graph theory, probabilistic programming marries Bayesian inference with universal computation. Probabilistic programming can make it easier to build useful, fast machine learning software that goes significantly beyond graphical models in flexibility and power.

I will illustrate probabilistic programming using page-long probabilistic programs that break simple CAPTCHAs by running randomized CAPTCHA generators backwards, and that interpret noisy time-series data from clinical medicine.

I will also present CrossCat, a black-box, parameter-free, fully Bayesian machine learning method, based on an optimized engine for one probabilistic program that learns simple but flexible probabilistic programs from data. CrossCat estimates the full joint distribution underlying high-dimensional datasets, including the noisy, incomplete tables that come from modern database systems. It can also efficiently simulate from any of its finite-dimensional conditional distributions, and it accurately solves problems of prediction, imputation, feature selection, and classification.

Throughout, I will highlight the ways probabilistic programming points the way to a new model of computation, based on universal inference over distributions rather than universal calculation of functions, and exposes the mathematical and algorithmic structure needed to engineer efficient, distributed machine learning systems. I will include a brief discussion of natively probabilistic hardware that carries these principles down to the physical level. I will also touch on the directions this model opens up for research in computational complexity, including steps towards an explanation of the unreasonable effectiveness of simple, randomized algorithms on apparently intractable problems, as well as in programming languages and artificial intelligence.
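As a concrete rendering of the "run the generator backwards" idea, the following Python sketch defines a toy randomized CAPTCHA-like generator and inverts it with generic likelihood weighting. Everything here (the two-letter alphabet, the noise model, and the inference strategy) is an assumption chosen for illustration, not the system described in the talk.

    import random
    from collections import Counter

    LETTERS = "AB"
    KEEP = 0.9  # assumed probability a character renders faithfully

    def render_char(c):
        # Forward noise model: keep the character, or resample uniformly.
        return c if random.random() < KEEP else random.choice(LETTERS)

    def char_likelihood(latent_c, obs_c):
        # P(observed char | latent char) under the noise model above.
        base = (1 - KEEP) / len(LETTERS)
        return base + (KEEP if latent_c == obs_c else 0.0)

    def generate(length=3):
        # The generative program: sample latent text, then a noisy rendering.
        latent = "".join(random.choice(LETTERS) for _ in range(length))
        observed = "".join(render_char(c) for c in latent)
        return latent, observed

    def infer(observed, n_samples=20000):
        # Generic inference by likelihood weighting: rerun the prior and
        # weight each run by how well it explains the observation.
        weights = Counter()
        for _ in range(n_samples):
            latent = "".join(random.choice(LETTERS)
                             for _ in range(len(observed)))
            w = 1.0
            for a, b in zip(latent, observed):
                w *= char_likelihood(a, b)
            weights[latent] += w
        total = sum(weights.values())
        return {k: v / total for k, v in weights.items()}

    if __name__ == "__main__":
        latent, observed = generate()
        posterior = infer(observed)
        guess = max(posterior, key=posterior.get)
        print(f"true={latent} observed={observed} posterior mode={guess}")

A practical probabilistic programming engine would pair the same forward program with far more efficient inference over program executions (for example, Markov chain Monte Carlo or sequential Monte Carlo) rather than brute-force likelihood weighting; the point of the sketch is only that the solver is generic and the model is an ordinary randomized program.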
This talk is part of the Microsoft Research Cambridge, public talks series.