This is an interesting but fragmented attempt to introduce functional analysis by examining a variety of optimization problems. Its strength is that it is heavily slanted towards applications and particular problems. Its weakness is that it jumps around a lot, and it’s hard to say at the end exactly what general facts and techniques you have acquired. The book is aimed at upper-division undergraduates, although this might be optimistic for many curricula. It is a 2012 reprint of the 1967 Harper & Row edition.

The book is not related to the constructive analysis developed by Errett Bishop and others, for example in *Constructive Analysis*. It is real analysis in the sense that most problems deal with the Banach space *C*[*a*, *b*] and the Hilbert spaces ℓ^{*p*} and *L*^{2}[*a*, *b*] (although it does not use the Lebesgue integral). It is constructive in the sense that the existence of a solution is proved by defining a recursion, whose limit from an arbitrary starting point is the desired solution and whose limit exists because the recursion is a contraction mapping. The emphasis in the book is on developing these recursions, and generally speaking there is no error analysis or consideration of numerical issues. A construction is given rather than an existence proof, but the method is not constructive in the stronger sense of allowing us to calculate something to a pre-specified precision.
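To illustrate the kind of recursion described above (this sketch is mine, not the book's), here is the contraction-mapping iteration in its simplest one-dimensional form: iterate x ↦ f(x) from an arbitrary starting point; if f is a contraction (|f(x) − f(y)| ≤ q|x − y| with q < 1), the Banach fixed-point theorem guarantees the iterates converge to the unique fixed point.

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{n+1} = f(x_n) until successive iterates agree to within tol.

    Convergence from any starting point is guaranteed when f is a
    contraction mapping (Banach fixed-point theorem).
    """
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Example: f(x) = cos(x) is a contraction near its fixed point,
# so iterating from an arbitrary start converges to x with cos(x) = x.
root = fixed_point(math.cos, 1.0)
```

As the review notes, the book stops at this kind of construction: the limit exists, but no error analysis is done along the way.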

Prerequisites are a good course in advanced calculus or real analysis, at the level of Apostol’s *Mathematical Analysis* or Rudin’s *Principles of Mathematical Analysis* (both of which are heavily referenced). The book starts off in the first chapter by looking at several optimization problems for real functions. Root-finding is included, on the grounds that one optimization method is to find the zeroes of the derivative, and that Newton’s method is a simple example of the kind of recursions we will be studying further. It also deals with gradient methods (steepest descent) and starts the generalization process with metric spaces. The second chapter keeps the idea of iteration, but veers off to the very different kinds of problem where the domain is given by a convex region or a non-differentiable function (linear, nonlinear, and convex programming). The third chapter to some extent brings all these strands together in Banach and Hilbert spaces, and this is where the classical functional analysis topics such as the Hahn–Banach theorem and the Banach–Steinhaus theorem appear.
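The first chapter's link between root-finding and optimization can be sketched as follows (my example, not taken from the book): Newton's method applied to the derivative locates a critical point of the original function.

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Newton's method: iterate x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / df(x)
    return x

# To optimize g(x) = x^4 - 3x^2 + x, find a zero of g'(x) = 4x^3 - 6x + 1
# (the "zeroes of the derivative" approach mentioned in the review).
crit = newton(lambda x: 4 * x**3 - 6 * x + 1,   # g'
              lambda x: 12 * x**2 - 6,          # g''
              x0=1.0)
```

Near a simple root this recursion converges quadratically, which is what makes it an attractive first example of the iterations studied later in metric-space generality.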

Overall I think this approach to introducing functional analysis is not very successful, because it goes in too many directions at once and lacks focus. I think students at this level would be better served by a more straightforward book such as Saxe’s *Beginning Functional Analysis*, although that book is relatively weak in applications.

Allen Stenger is a math hobbyist and retired software developer. He is webmaster and newsletter editor for the MAA Southwestern Section and is an editor of the Missouri Journal of Mathematical Sciences. His mathematical interests are number theory and classical analysis. He volunteers in his spare time at MathNerds.org, a math help site that fosters inquiry learning.