An Introduction to the Conjugate Gradient Method
If you have a question about this talk, please contact Isaac Reid. Zoom link available upon request (it is sent out on our mailing list, eng-mlg-rcc [at] lists.cam.ac.uk). Sign up to our mailing list for easier reminders via lists.cam.ac.uk.

(Taken from: An Introduction to the Conjugate Gradient Method Without the Agonizing Pain, Jonathan Richard Shewchuk. Andy will walk us through this article.)

The Conjugate Gradient Method is the most prominent iterative method for solving sparse systems of linear equations. Unfortunately, many textbook treatments of the topic are written with neither illustrations nor intuition, and their victims can be found to this day babbling senselessly in the corners of dusty libraries. For this reason, a deep, geometric understanding of the method has been reserved for the elite brilliant few who have painstakingly decoded the mumblings of their forebears. Nevertheless, the Conjugate Gradient Method is a composite of simple, elegant ideas that almost anyone can understand. Of course, a reader as intelligent as yourself will learn them almost effortlessly.

The idea of quadratic forms is introduced and used to derive the methods of Steepest Descent, Conjugate Directions, and Conjugate Gradients. Eigenvectors are explained and used to examine the convergence of the Jacobi Method, Steepest Descent, and Conjugate Gradients. Other topics include preconditioning and the nonlinear Conjugate Gradient Method. I have taken pains to make this article easy to read. Sixty-six illustrations are provided. Dense prose is avoided. Concepts are explained in several different ways. Most equations are coupled with an intuitive interpretation.

Reading recommendations: https://www.cs.cmu.edu/~quake-papers/painless-conjugate-gradient.pdf

This talk is part of the Machine Learning Reading Group @ CUED series.
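For readers who want a concrete preview of what the article derives, here is a minimal NumPy sketch of the Conjugate Gradient iteration for a symmetric positive-definite system Ax = b. This is our own illustrative implementation, not code from the talk or the article; the function name and defaults are assumptions.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b iteratively, assuming A is symmetric positive-definite.

    Illustrative sketch of the method the article derives; names and
    defaults are our own choices, not taken from the talk.
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    max_iter = n if max_iter is None else max_iter
    r = b - A @ x      # residual = negative gradient of the quadratic form
    d = r.copy()       # first search direction is the steepest-descent one
    rs_old = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs_old) < tol:
            break
        Ad = A @ d
        alpha = rs_old / (d @ Ad)      # exact line search along d
        x += alpha * d
        r -= alpha * Ad                # update the residual incrementally
        rs_new = r @ r
        d = r + (rs_new / rs_old) * d  # keep new direction A-conjugate to old ones
        rs_old = rs_new
    return x

# Usage: a small symmetric positive-definite system
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the loop terminates in at most n iterations (here n = 2), which is the key property distinguishing Conjugate Gradients from plain Steepest Descent.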