Global Optimization Using Interval Analysis

Eldon Hansen
Global Optimization Using Interval Analysis
Marcel Dekker, New York, 1992


The book presents techniques for solving global optimization problems (problems suspected of having several local minima) and nonlinear systems of equations with several solutions.

Since interval methods are hardly known in the optimization community, the first part of the book gives a thorough introduction to interval analysis. The main advantage of interval techniques lies in their capacity to provide global information about functions over large, box-shaped regions, e.g., strict bounds on function values, Lipschitz constants, and higher derivatives, which allows one to estimate the effect of using linear or quadratic approximations to a function. While the bounds are sometimes very wide, they can be proved to be realistic whenever the boxes are narrow enough. Therefore, in applications, interval techniques are combined with branch and bound techniques that split boxes adaptively until the overestimation becomes insignificant.
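
To give a flavour of how such bounds arise and why adaptive splitting helps, here is a minimal Python sketch; it is not taken from the book, and the interval class, the test function f(x) = x*x - x, and the box [0, 2] are chosen purely for illustration:

```python
# A minimal sketch of interval evaluation and adaptive splitting; the class,
# the function f, and the box are illustrative only (not Hansen's code).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def f(x):
    # f(x) = x*x - x; its true range over [0, 2] is [-1/4, 2]
    return x * x - x

print(f(Interval(0.0, 2.0)))          # [-2.0, 4.0] -- clear overestimation

# Splitting the box and taking the union of the two enclosures tightens the bound:
fl, fr = f(Interval(0.0, 1.0)), f(Interval(1.0, 2.0))
print(Interval(min(fl.lo, fr.lo), max(fl.hi, fr.hi)))   # [-1.0, 3.0]
```

Evaluating f over the whole box gives the enclosure [-2, 4], much wider than the true range [-1/4, 2], because the two occurrences of x are treated as if they were independent; bisecting the box already tightens the union of enclosures to [-1, 3], and further splitting makes the overestimation arbitrarily small.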

The techniques developed are then applied, in turn, to the solution of nonlinear systems, global unconstrained optimization, and global constrained optimization, in order of increasing complexity. For the methods to be efficient, Jacobians of the nonlinear systems, and hence Hessians of the optimization problems, must be provided explicitly (though the methods also work, slowly, with first order or even zeroth order information only). Ultimately, everything is reduced to the solution of systems of linear interval equations. To obtain realistic bounds on the solutions of these systems, it is necessary to precondition each system with an explicitly computed approximate inverse matrix, which limits the dimension of the problems that can be solved to a few hundred.
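
A rough sketch of the preconditioning step (again not the book's code; the 2x2 interval data and the helper interval_matmul are invented for the illustration): multiplying the system by the inverse C of the midpoint matrix makes the coefficient matrix C*A close to the identity, after which interval methods such as interval Gauss-Seidel or the Krawczyk operator give realistic enclosures. Ordinary floating point arithmetic is used here; a rigorous code would in addition round outward.

```python
# A rough sketch of preconditioning an interval linear system A x = b by the
# inverse of the midpoint matrix; the data and the helper are made up, and
# floating point rounding is ignored (a rigorous code would round outward).

import numpy as np

def interval_matmul(Alo, Ahi, Blo, Bhi):
    # Interval matrix product: each scalar product [a,b]*[c,d] is bounded by
    # the min/max of the four endpoint products; sums over k are componentwise.
    terms = [np.einsum('ik,kj->ikj', X, Y) for X in (Alo, Ahi) for Y in (Blo, Bhi)]
    return (np.minimum.reduce(terms).sum(axis=1),
            np.maximum.reduce(terms).sum(axis=1))

mid = np.array([[4.0, 1.0], [2.0, 5.0]])      # midpoint of the interval matrix A
rad = 0.05 * np.ones((2, 2))                  # its (small) radii
blo, bhi = np.array([[1.0], [0.9]]), np.array([[1.1], [1.0]])

C = np.linalg.inv(mid)                                   # approximate (point) inverse
Mlo, Mhi = interval_matmul(C, C, mid - rad, mid + rad)   # C*A: close to the identity
rlo, rhi = interval_matmul(C, C, blo, bhi)               # preconditioned right-hand side
print(np.round(Mlo, 3))
print(np.round(Mhi, 3))
```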

Everything is well motivated, and many small examples are provided that can be checked with pencil and paper. For proofs of the more difficult theoretical results, references to the literature are given. The algorithmic aspects are treated in detail, so that it is fairly easy to write programs based on the descriptions given. It is emphasized that, when the interval arithmetic is implemented with outward directed rounding, the results of the algorithms are mathematically correct in spite of the rounding errors made by the computer, a fact which might be relevant in some applications.
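
As a toy illustration of what outward directed rounding means (this is not the book's implementation; it merely widens each computed endpoint by one unit in the last place using math.nextafter, available in Python 3.9+), consider enclosing the sum 0.1 + ... + 0.1 = 1, which plain floating point cannot evaluate exactly:

```python
# A toy illustration of outward directed rounding: every computed interval
# endpoint is nudged outward by one unit in the last place, so the interval
# still contains the exact real result despite rounding errors.

import math

def add_outward(x, y):
    # Interval addition of (lo, hi) pairs with outward rounded endpoints.
    return (math.nextafter(x[0] + y[0], -math.inf),
            math.nextafter(x[1] + y[1], math.inf))

# An interval guaranteed to contain the real number 0.1, which is not a binary float.
tenth = (math.nextafter(0.1, -math.inf), math.nextafter(0.1, math.inf))

s = (0.0, 0.0)
for _ in range(10):
    s = add_outward(s, tenth)

print(s)                       # a narrow interval around 1
print(s[0] <= 1.0 <= s[1])     # True: the enclosure provably contains the exact sum 1
```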

The test problems discussed by the author have dimension at most 10, apart from a quadratic problem and a single example in 27 dimensions; and the constrained problems treated have only toy dimensions (the constrained Example 13.9, with 18 dimensions, is reduced to 5 unconstrained dimensions). This is annoying, since it gives a wrong impression of the range of problems solvable by interval methods. The presentation of the test results is also rather careless; for some of the problems in Section 9.15 the initial box is not specified, and on p. 148 it is admitted that the results were obtained with an old implementation that is less efficient than later versions (which probably incorporate all the tricks mentioned in the chapter).

It is surprising that hardly any use is made of classical optimization theory or algorithms; the only tool from optimization employed is the Fritz John first order optimality conditions. This indicates that much further improvement can be expected from combining the methods treated here with more of the traditional methods, and in particular with dc-methods. The future of the subject, once firmly rooted in the established tradition, looks very bright.
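
For reference, the Fritz John conditions (stated here in their standard inequality-constrained form, for min f(x) subject to g_i(x) <= 0, i = 1,...,m) require that at a local minimizer x* there exist multipliers u_0, u_1, ..., u_m >= 0, not all zero, with

    u_0 \nabla f(x^*) + \sum_{i=1}^{m} u_i \nabla g_i(x^*) = 0, \qquad u_i\, g_i(x^*) = 0 \quad (i = 1, \dots, m).

When u_0 can be taken positive, these reduce to the familiar Karush-Kuhn-Tucker conditions.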

Altogether, I think that this book is an excellent introduction to interval techniques for the solution of global optimization problems and of nonlinear systems of equations with several solutions, though it takes only the first step towards good general-purpose global optimization software.



Arnold Neumaier (Arnold.Neumaier@univie.ac.at)