Deep learning (DL) has had unprecedented success and is now entering scientific computing with full force. However, DL suffers from a universal phenomenon: instability, despite universal approximation properties that often guarantee the existence of stable neural networks (NNs). We show the following paradox. There are basic well-conditioned problems in scientific computing where one can prove the existence of NNs with great approximation qualities; however, there does not exist any algorithm, even a randomised one, that can train (or compute) such a NN. Indeed, for any positive integers K > 2 and L, there are cases where simultaneously: (a) no randomised algorithm can compute a NN correct to K digits with probability greater than 1/2; (b) there exists a deterministic algorithm that computes a NN with K-1 correct digits, but any such algorithm (even a randomised one) needs arbitrarily many training data; (c) there exists a deterministic algorithm that computes a NN with K-2 correct digits using no more than L training samples. These results provide basic foundations for Smale's 18th problem and imply a potentially vast, and crucial, classification theory describing conditions under which (stable) NNs with a given accuracy can be computed by an algorithm. We begin this theory by initiating a unified theory for compressed sensing and DL, leading to sufficient conditions for the existence of algorithms that compute stable NNs in inverse problems. We introduce Fast Iterative REstarted NETworks (FIRENETs), which we prove, and numerically verify, are stable. Moreover, we prove that only O(|log(ε)|) layers are needed for an ε-accurate solution to the inverse problem (exponential convergence), and that the inner dimensions of the layers do not exceed the dimension of the inverse problem. Thus, FIRENETs are computationally very efficient.
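The depth bound can be illustrated with a back-of-the-envelope sketch (the contraction factor rho below is a hypothetical illustration, not a constant from the paper): if each unrolled layer of a restarted iterative scheme contracts the reconstruction error by a fixed factor rho < 1, then reaching accuracy ε requires only O(|log(ε)|) layers.

```python
import math

# Hypothetical sketch: suppose each "layer" contracts the error by a
# fixed factor rho < 1, so the error after n layers is rho**n * e0.
# Solving rho**n * e0 <= eps for n gives n ~ |log(eps)| / |log(rho)|,
# i.e. O(|log(eps)|) layers for eps accuracy (exponential convergence).

def layers_needed(eps, rho=0.5, e0=1.0):
    """Smallest n with rho**n * e0 <= eps."""
    return math.ceil(math.log(eps / e0) / math.log(rho))

# With rho = 0.5 (error halves per layer): doubling the number of
# correct digits only doubles the depth.
print(layers_needed(1e-6))   # layers for 6-digit accuracy
print(layers_needed(1e-12))  # layers for 12-digit accuracy
```

Note the contrast with a scheme converging only at rate O(1/n), which would need on the order of 1/ε iterations for the same accuracy.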
The reference for this talk is: https://arxiv.org/abs/2101.08286
Location: Seminar Room 1, Newton Institute