Many statistical applications involve models for which it is difficult to evaluate the likelihood, but relatively easy to sample from. Approximate Bayesian computation is a likelihood-free method for implementing Bayesian inference in such cases. We present a number of surprisingly strong asymptotic results for the regression-adjusted version of approximate Bayesian computation introduced by Beaumont et al. (2002). We show that for an appropriate choice of the bandwidth in approximate Bayesian computation, using regression adjustment will lead to a posterior that, asymptotically, correctly quantifies uncertainty. Furthermore, for such a choice of bandwidth we can implement an importance sampling algorithm to sample from the posterior whose acceptance probability tends to 1 as we increase the data sample size. This compares favourably with results for standard approximate Bayesian computation, where the only way to obtain a posterior that correctly quantifies uncertainty is to choose a much smaller bandwidth, for which the acceptance probability tends to 0 and hence for which Monte Carlo error will dominate.
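The regression-adjustment step referred to above (Beaumont et al., 2002) can be illustrated with a minimal sketch: run rejection ABC with bandwidth h, then fit a linear regression of the parameter on the summary statistics among accepted draws and subtract the fitted trend. The toy Gaussian model, prior, and all variable names below are illustrative assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (assumed for illustration): data are N(theta, 1);
# the summary statistic is the sample mean.
n_obs = 100
theta_true = 2.0
y_obs = rng.normal(theta_true, 1.0, n_obs)
s_obs = y_obs.mean()

# Step 1: rejection ABC. Draw theta from a N(0, 10) prior, simulate a
# dataset for each draw, and accept draws whose summary lies within
# bandwidth h of the observed summary.
n_sims = 20000
h = 0.5
theta = rng.normal(0.0, 10.0, n_sims)
s = np.array([rng.normal(t, 1.0, n_obs).mean() for t in theta])
accept = np.abs(s - s_obs) < h
theta_acc, s_acc = theta[accept], s[accept]

# Step 2: regression adjustment. Regress theta on (s - s_obs) over the
# accepted draws, then subtract the fitted linear trend so each adjusted
# draw behaves as if its summary had equalled s_obs exactly.
X = np.column_stack([np.ones(s_acc.size), s_acc - s_obs])
beta = np.linalg.lstsq(X, theta_acc, rcond=None)[0]
theta_adj = theta_acc - beta[1] * (s_acc - s_obs)

print("unadjusted sd:", theta_acc.std(), "adjusted sd:", theta_adj.std())
```

With a relatively large bandwidth the unadjusted accepted draws over-disperse the posterior, while the adjusted draws concentrate around the observed summary — the sense in which the adjustment lets a larger h be used without sacrificing calibration.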

Location: Seminar Room 1, Newton Institute. Contact: info@newton.ac.uk