https://arnold-neumaier.at/DFO.html
This two-year project led to a software suite for efficiently solving optimization problems in which the objective to be optimized is available only through a black box providing a score for any given scenario. Optimization is the basic technique for improving existing products or procedures so that they better match a desired objective under restrictions encoded in the constraints.
A growing number of applications in science and engineering have objectives without a well-defined functional description, e.g., when function values arise from expensive measurements or complex simulations. This has fueled a resurgence of interest in derivative-free optimization algorithms, since these specifically work with functions and constraints for which no derivative information is available. Our group has long-term expertise in derivative-free optimization, contributing both to deterministic and stochastic algorithms with theoretical convergence guarantees and to easily parallelizable stochastic algorithms based on the heuristic adaptation of populations of good points.
In this project, we extended the methods implemented in our earlier unconstrained black box solvers to the case of bound constraints (lower and upper bounds on decision variables), integer constraints (where certain decision variables can take only integral values), and linear inequality and equality constraints (where certain linear expressions must match or must exceed a given bound).
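As an illustration (not code from the project), the three constraint types can be made concrete in a black-box setting as follows; the objective, the bounds, and the choice of integer variable are all invented for this sketch:

```python
import numpy as np

# Hypothetical black-box objective: only function values are available,
# no derivatives (a noisy quadratic stands in for an expensive simulation).
rng = np.random.default_rng(0)
def f(x):
    return float(np.sum((x - 1.3) ** 2) + 1e-3 * rng.standard_normal())

lb = np.array([-5.0, -5.0, 0.0])   # lower bounds on the decision variables
ub = np.array([ 5.0,  5.0, 4.0])   # upper bounds
int_vars = [2]                     # x[2] may take only integral values

def make_feasible(x):
    """Project onto the bound constraints and round the integer variables."""
    x = np.clip(x, lb, ub)
    x[int_vars] = np.round(x[int_vars])
    return x

x = make_feasible(np.array([7.0, -6.2, 2.6]))
score = f(x)   # the score is the only information a black-box solver gets
```

A linear constraint such as `a @ x <= b` would additionally restrict the candidate points; handling all three constraint types simultaneously inside a derivative-free search is exactly what the extended solvers address.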
We aimed at a robust and efficient derivative-free solver suite that copes with expensive, often noisy function values and with high dimensions, the bottleneck of current techniques. In particular, we exploited local effective low-dimensionality by employing appropriate subspace techniques (which concentrate the search on adaptively constructed subproblems with only a few auxiliary variables).
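The subspace idea can be sketched in miniature: restrict each step to a few auxiliary variables z via x + Qz, where Q has a small number of random orthonormal columns. This toy version (pure random sampling in the subspace, an invented objective) is our own illustration, not the algorithm used in the project's solvers:

```python
import numpy as np

def f(x):
    """Stand-in for an expensive black-box objective."""
    return float(np.sum(x**2) + 0.5 * np.sum(x[:-1] * x[1:]))

def random_subspace_step(f, x, rng, k=2, sigma=0.5, samples=10):
    """Try a few candidate steps restricted to a random k-dimensional
    subspace: candidates have the form x + Q @ z, with Q an n-by-k
    matrix with orthonormal columns and z the auxiliary variables."""
    n = x.size
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))  # random orthonormal basis
    best_x, best_f = x, f(x)
    for _ in range(samples):
        z = sigma * rng.standard_normal(k)            # step in the subspace
        cand = x + Q @ z
        fc = f(cand)
        if fc < best_f:
            best_x, best_f = cand, fc
    return best_x, best_f

rng = np.random.default_rng(1)
x = np.full(20, 2.0)        # 20 real variables, but we search 2 at a time
f0 = f(x)
for _ in range(50):
    x, fx = random_subspace_step(f, x, rng)
```

Each iteration only ever evaluates points in a 2-dimensional affine subspace, so the per-step cost is independent of the ambient dimension; the practical methods choose the subspaces far more carefully.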
We integrated into our solvers various known, and some new, techniques for handling the bound constraints, integer constraints, and linear constraints, selecting the best from the many variants tried. To facilitate the testing of our algorithms and their comparison with those of others, we extended our recent optimization test environment to the constrained case. The software released includes the test environment and the new solver suite, and the associated publications demonstrate the theoretical efficiency and the practical state-of-the-art performance compared with other solvers. We also implemented a preliminary (and not yet released) self-tuning environment that applies the new solvers to optimize their parameters and their available choices.
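The self-tuning idea can be illustrated in miniature: the solver's own parameters become the decision variables of an outer black-box problem whose score is aggregate performance on a problem set. The toy solver, its two parameters, and the scoring below are all invented for this sketch and are unrelated to the project's (unreleased) tuning environment:

```python
import numpy as np

def toy_solver(f, x0, step, shrink, budget=60):
    """A trivial pattern-search stand-in with two tunable parameters:
    the initial step size and the shrink factor on failed iterations."""
    x = np.asarray(x0, float)
    fx = f(x)
    for _ in range(budget):
        improved = False
        for d in np.vstack([np.eye(x.size), -np.eye(x.size)]):
            c = x + step * d
            fc = f(c)
            if fc < fx:
                x, fx, improved = c, fc, True
        if not improved:
            step *= shrink
    return fx

# a tiny problem collection standing in for a real benchmark set
problems = [lambda x: float(np.sum(x**2)),
            lambda x: float(np.sum(np.abs(x - 1.0)))]

def score(params):
    """Outer objective: total final value reached over all problems."""
    step, shrink = params
    return sum(toy_solver(f, [2.0, -2.0], step, shrink) for f in problems)

# crude tuning loop over a grid of parameter choices
candidates = [(s, r) for s in (0.1, 0.5, 1.0) for r in (0.3, 0.5, 0.9)]
best_params = min(candidates, key=score)
```

The project's environment replaces the grid by the new derivative-free solvers themselves, so that tuning is itself a (noisy, expensive) black-box optimization problem.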
This project is funded by a grant of the Austrian Science Fund FWF under contract number P 34317-N.
The project started in June 2021 and ended in December 2023.
Details of the achievements, the associated publications, and links to the software created can be found below, or in the Final Report of the project.
Supported by the budget were:
We designed several new algorithms for noisy derivative-free optimization under various conditions:
We also upgraded several of our other optimization algorithms [2,13,15,17,18].
We designed
All our solvers were tested and tuned to ensure that they are competitive with, or even better than, the best noncommercial solvers available. Some of our solvers exist in a reentrant version that allows the solution of optimization problems where function values may be computed by software on different platforms, or measured by performing experiments or external simulations, with communication through file updates.
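A minimal sketch of such file-based reentrant communication (the actual protocol and state format of our solvers are not reproduced here; the JSON layout and the placeholder update rule are invented): the solver saves its state to disk after proposing each trial point, so the external evaluator can run anywhere and resume the solver later with the computed value.

```python
import json, os, tempfile

def solver_step(state_file, x=None, fx=None):
    """Load the saved state, record the last (x, f(x)) pair, and
    propose the next trial point; a real solver would run one
    iteration of its algorithm here instead of the toy update rule."""
    if os.path.exists(state_file):
        with open(state_file) as h:
            state = json.load(h)
    else:
        state = {"history": [], "next": [0.0, 0.0]}
    if x is not None:
        state["history"].append([x, fx])
        # placeholder logic: perturb the best point seen so far
        best = min(state["history"], key=lambda p: p[1])
        state["next"] = [v + 0.1 for v in best[0]]
    with open(state_file, "w") as h:
        json.dump(state, h)          # persist state between calls
    return state["next"]

# Simulated external evaluation loop (normally done by other software,
# an experiment, or a simulation, possibly on another machine):
state_file = os.path.join(tempfile.mkdtemp(), "state.json")
x = solver_step(state_file)                  # first proposal
for _ in range(3):
    fx = sum(v * v for v in x)               # "external" measurement
    x = solver_step(state_file, x=x, fx=fx)  # resume from the file
```

Because all state lives in the file, solver and evaluator need not share a process, a machine, or even a programming language.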
Improving the TTSopt package (which is the union of the above four packages) and the tuning and benchmarking of competitive solvers will continue after the end of the project.
[1]
A. Brilli, M. Kimiaei, G. Liuzzi and S. Lucidi,
Worst case complexity bounds for linesearch-type derivative-free
algorithms,
Journal of Optimization Theory and Applications, under revision (2024).
pdf file
[2]
M. Kimiaei,
An active set trust region method for bound constrained optimization,
Bulletin of the Iranian Mathematical Society 48 (2022), 1721--1745.
pdf file
[3]
M. Kimiaei,
An improved randomized algorithm with noise level tuning for
large-scale noisy unconstrained DFO problems,
Manuscript, submitted (2023).
pdf file
[4]
M. Kimiaei, A.H. Ibrahim and S. Ghaderi,
A subspace inertial method for derivative-free nonlinear monotone
equations,
Optimization, published online (2023).
pdf file
[5] M. Kimiaei, R. Jezek, A. Neumaier and B. Azmi, TEnvOpt -- a test environment for optimization, Manuscript, in preparation (2024).
[6] M. Kimiaei, R. Jezek, A. Neumaier, R. Kostal and M. Stollmayer. TTSopt -- Testing and tuning software for optimization, Manuscript, in preparation (2024).
[7]
M. Kimiaei and A. Neumaier,
A limited memory method for unconstrained nonlinear least squares,
Soft Computing 26 (2022), 465--490.
pdf file
[8]
M. Kimiaei and A. Neumaier,
Efficient unconstrained black box optimization,
Mathematical Programming Computation 14 (2022), 365--414.
pdf file
[9]
M. Kimiaei and A. Neumaier,
Efficient composite heuristics for integer bound constrained noisy
optimization, Unpublished manuscript (2022).
pdf file
[10]
M. Kimiaei and A. Neumaier,
Effective matrix adaptation strategy for noisy derivative-free
optimization,
Mathematical Programming Computation, under revision (2024).
pdf file
[11]
M. Kimiaei and A. Neumaier,
Heuristic methods for noisy derivative-free bound-constrained
mixed-integer optimization,
Mathematical Programming Computation, under revision (2024).
pdf file
[12] M. Kimiaei and A. Neumaier, A feasible point method for derivative-free noisy linearly constrained optimization, Manuscript, in preparation (2024).
[13]
M. Kimiaei, A. Neumaier and B. Azmi,
LMBOPT -- a limited memory method for bound-constrained optimization,
Mathematical Programming Computation 14 (2022), 271--318.
pdf file
[14]
M. Kimiaei, A. Neumaier and P. Faramarzi,
New subspace method for unconstrained derivative-free optimization,
ACM Transactions on Mathematical Software 49 (2023), 1--25.
pdf file
[15]
A. Neumaier, B. Azmi and M. Kimiaei,
An active set method for bound-constrained optimization,
Optimization Methods and Software, published online (2024).
pdf file
[16] A. Neumaier and R. Jezek, Tuning optimization solvers, Manuscript, in preparation (2024).
[17]
A. Neumaier and M. Kimiaei,
An improvement of the Goldstein line search,
Optimization Letters, published online (2024).
pdf file
[18]
A. Neumaier, M. Kimiaei and B. Azmi,
Globally linearly convergent nonlinear conjugate gradients without
Wolfe line search,
Numerical Algorithms, published online (2024).
pdf file
The following derivative-free solvers for various kinds of optimization problems were finished or upgraded during the project, and are publicly available.
VRBBO: Vienna Randomized Black Box Optimization (unconstrained)
VRDFON: Vienna Randomized Noisy Derivative-Free Optimization (unconstrained)
SSDFO: Subspace Derivative-Free Optimization (unconstrained)
IMATRS: Integer Matrix Adaptation Trust Region Derivative-Free Optimization (pure-integer bound-constrained)
MATRS: Matrix Adaptation Trust Region Derivative-Free Optimization (mixed-integer bound-constrained)
MADFO: Matrix Adaptation Derivative-Free Optimization (unconstrained)
In addition, we made the software package
TEnvOpt: Test Environment for Optimization
available, which contains our test environment together with the solver collection TEsolv and the problem collection TEprob. This release, however, does not yet contain the automatic tuning facilities that are still under development.
Software for local optimization
Arnold Neumaier (Arnold.Neumaier@univie.ac.at)