University of Cambridge · Isaac Newton Institute Seminar Series

Probabilistic Numerical Computation: a Role for Statisticians in Numerical Analysis?



This talk has been canceled/deleted

Consider the consequences of an alternative history. What if Euler had read the posthumous publication of Thomas Bayes's paper “An Essay towards solving a Problem in the Doctrine of Chances”? It appeared in 1763 in the Philosophical Transactions of the Royal Society, so we can wonder whether the section on the numerical solution of differential equations in Euler's three-volume Institutionum calculi integralis, published in 1768, might then have been quite different.

Would Euler's awareness of the “Bayesian” proposition of characterising uncertainty in unknown quantities using the probability calculus have steered the development of numerical methods and their analysis in a more inherently statistical direction?

Fast-forward two centuries. In 1972 F. M. Larkin published the paper “Gaussian Measure on Hilbert Space and Applications in Numerical Analysis”. Therein the mathematical tools required to carry out probabilistic numerical analysis in Hilbert spaces were formally defined, and methods such as Bayesian quadrature, or Bayesian Monte Carlo, were developed in full.

Today, viewing numerical analysis as a problem of statistical inference seems natural in many ways, and such a perspective is being demanded by applied mathematicians, engineers and physicists who need to account carefully and fully for all sources of uncertainty in mathematical modelling and numerical simulation.

At present a research frontier has emerged in scientific computation, founded on the principle that the error in numerical methods, for example those that solve differential equations, entails uncertainty that ought to be subjected to formal statistical analysis. This viewpoint raises exciting challenges for contemporary statistical and numerical analysis, including the design of statistical methods that enable the coherent propagation of probability measures through a computational and inferential pipeline.
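As a minimal illustration of this idea (a sketch, not material from the talk itself), Bayesian quadrature treats the unknown integral of a function as a random variable: place a Gaussian process prior on the integrand, condition on a few function evaluations, and read off a posterior mean (the quadrature estimate) and a posterior variance (the quantified numerical uncertainty). The helper `bayesian_quadrature` below is a hypothetical name, and the choice of an RBF kernel on [0, 1] with a fixed lengthscale is an assumption made purely for illustration.

```python
import math
import numpy as np

def bayesian_quadrature(f, nodes, lengthscale):
    """Posterior mean and variance of integral_0^1 f(x) dx under a
    zero-mean GP prior with RBF kernel (illustrative sketch only)."""
    x = np.asarray(nodes, dtype=float)
    l = lengthscale
    y = np.array([f(xi) for xi in x])
    # Gram matrix K[i, j] = exp(-(x_i - x_j)^2 / (2 l^2)), plus jitter
    # for numerical stability of the solve below.
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * l ** 2))
    K += 1e-10 * np.eye(len(x))
    # Kernel mean z_i = integral_0^1 k(x, x_i) dx, in closed form via erf.
    z = np.array([l * math.sqrt(math.pi / 2)
                  * (math.erf((1 - xi) / (math.sqrt(2) * l))
                     + math.erf(xi / (math.sqrt(2) * l))) for xi in x])
    w = np.linalg.solve(K, z)   # quadrature weights
    mean = w @ y                # posterior mean of the integral
    # Prior variance of the integral: double integral of k over [0,1]^2.
    kk = (math.sqrt(2 * math.pi) * l * math.erf(1 / (math.sqrt(2) * l))
          - 2 * l ** 2 * (1 - math.exp(-1 / (2 * l ** 2))))
    var = kk - z @ w            # posterior variance after conditioning
    return mean, var

# Estimate integral_0^1 sin(x) dx = 1 - cos(1) from eight evaluations.
mean, var = bayesian_quadrature(math.sin, np.linspace(0, 1, 8), 0.3)
```

Note that the output is not a single number but a distribution over the value of the integral, which is exactly what allows such uncertainty to be propagated coherently through a larger computational pipeline.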

This talk is part of the Isaac Newton Institute Seminar Series.




© 2006-2018, University of Cambridge.