BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Microsoft Research Cambridge\, public talks
SUMMARY:Bayesian Computation Without Tears: Probabilistic
Programming and Universal Stochastic Inference - V
ikash Mansinghka\, MIT
DTSTART;TZID=Europe/London:20111121T110000
DTEND;TZID=Europe/London:20111121T120000
UID:TALK34618AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/34618
DESCRIPTION:Latent variable modeling and Bayesian inference a
re appealing in theory\, as they provide a unified ma
thematical framework for solving a wide range of mac
hine learning problems\, but they are often difficul
t to apply effectively in practice. Accurate infere
nce in even simple models can seem computationally i
ntractable\, while more realistic models are difficul
t to even write down precisely.\n\nIn this talk\, I wil
l introduce new probabilistic programming technolog
y that alleviates many of these difficulties. Unlik
e graphical modeling\, which marries statistics wit
h graph theory\, probabilistic programming marries B
ayesian inference with universal computation. Proba
bilistic programming can make it easier to build use
ful\, fast machine learning software that goes signi
ficantly beyond graphical models in flexibility an
d power. I will illustrate probabilistic programmin
g using page-long probabilistic programs that brea
k simple CAPTCHAs by running randomized CAPTCHA gen
erators backwards and that interpret noisy time-ser
ies data from clinical medicine.\n\nI will also prese
nt CrossCat\, a black-box\, parameter-free\, fully Bay
esian machine learning method based on an optimize
d engine for one probabilistic program that learns s
imple but flexible probabilistic programs from dat
a. CrossCat estimates the full joint distribution u
nderlying high-dimensional datasets\, including th
e noisy\, incomplete tables that come from modern da
tabase systems. It can also efficiently simulate fr
om any of its finite-dimensional conditional distri
butions and accurately solve problems of prediction
\, imputation\, feature selection and classification.
\n\nThroughout\, I will highlight the ways probabilist
ic programming points the way to a new model of com
putation\, based on universal inference over distrib
utions rather than universal calculation of functio
ns\, and exposes the mathematical and algorithmic st
ructure needed to engineer efficient\, distributed m
achine learning systems. I will include a brief dis
cussion of natively probabilistic hardware that car
ries these principles down to the physical level.\n\n
I will also touch on the directions this model open
s up for research in computational complexity\, incl
uding steps towards an explanation of the unreasona
ble effectiveness of simple\, randomized algorithm
s on apparently intractable problems\, and in progra
mming languages and artificial intelligence.\n
LOCATION:Small lecture theatre\, Microsoft Research Ltd\, 7
J J Thomson Avenue (Off Madingley Road)\, Cambrid
ge
CONTACT:Microsoft Research Cambridge Talks Admins
END:VEVENT
END:VCALENDAR