Derivative-free optimization 2021-2023

https://arnold-neumaier.at/DFO.html


Project summary

This two-year project led to a software suite for efficiently solving optimization problems in which the objective to be optimized is available only through a black box that returns a score for any given scenario. Optimization is the basic technique for improving existing products or procedures to better match a desired objective under restrictions encoded in the constraints.

A growing number of applications in science and engineering have objectives without a well-defined functional description, e.g., when function values arise from expensive measurements or complex simulations. This has fueled a resurgence of interest in derivative-free optimization algorithms, since these specifically work with functions and constraints for which no derivative information is available. Our group has long-term expertise in derivative-free optimization, contributing both to deterministic and stochastic algorithms with theoretical convergence guarantees and to easily parallelizable stochastic algorithms based on the heuristic adaptation of populations of good points.

In this project, we extended the methods implemented in our earlier unconstrained black box solvers to the case of bound constraints (lower and upper bounds on decision variables), integer constraints (where certain decision variables can take only integral values), and linear inequality and equality constraints (where certain linear expressions must match or must exceed a given bound).
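The three constraint classes can be pictured with a small sketch (a toy illustration of ours, not code from the project's solvers; all helper names are made up): trial points are clipped into the box of bound constraints, designated variables are rounded to integral values, and linear inequality constraints are checked for feasibility.

```python
import numpy as np

def project_to_bounds(x, lo, hi):
    """Clip a trial point into the box [lo, hi] (bound constraints)."""
    return np.clip(x, lo, hi)

def round_integer_vars(x, int_idx):
    """Round the designated components to the nearest integer (integer constraints)."""
    x = x.copy()
    x[int_idx] = np.round(x[int_idx])
    return x

def is_linearly_feasible(x, A, b, tol=1e-9):
    """Check the linear inequality constraints A @ x <= b."""
    return bool(np.all(A @ x <= b + tol))

# Example: 2 variables, x[0] integer, box [0, 5]^2, constraint x[0] + x[1] <= 6
lo, hi = np.zeros(2), np.full(2, 5.0)
A, b = np.array([[1.0, 1.0]]), np.array([6.0])
x = np.array([2.7, 4.9])
x = round_integer_vars(project_to_bounds(x, lo, hi), [0])
print(x, is_linearly_feasible(x, A, b))  # [3.  4.9] False
```

Real solvers must of course do more than reject infeasible points (e.g., repair them or restrict the search to the feasible set), but this shows the kinds of conditions the extended solvers have to respect.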

We aimed at a robust and efficient derivative-free solver suite that copes with expensive, often noisy function values and with high dimensions, the bottleneck of current techniques. In particular, we exploited local effective low-dimensionality by employing appropriate subspace techniques, which concentrate the search on adaptively constructed subproblems with only a few auxiliary variables.
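The subspace idea can be illustrated by a toy random-subspace search (our own minimal sketch, not one of the project's algorithms): each step restricts the sampling to a random low-dimensional subspace of the full space, so only a few auxiliary variables are varied at a time.

```python
import numpy as np

def random_subspace_step(f, x, sigma, k=2, trials=20, rng=None):
    """One step of a toy random-subspace search: sample trial points in a
    random k-dimensional subspace of R^n and keep the best improvement."""
    rng = rng or np.random.default_rng()
    n = len(x)
    # Orthonormal basis of a random k-dimensional subspace
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    best_x, best_f = x, f(x)
    for _ in range(trials):
        z = rng.standard_normal(k)       # a few auxiliary variables
        y = x + sigma * (Q @ z)          # lift back to the full space
        fy = f(y)
        if fy < best_f:
            best_x, best_f = y, fy
    return best_x, best_f

# Objective in R^50 with low effective dimensionality (depends on 2 coordinates)
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
x, rng = np.zeros(50), np.random.default_rng(0)
for _ in range(200):
    x, fx = random_subspace_step(f, x, sigma=0.5, rng=rng)
```

Each step needs only k-dimensional sampling, yet the iterates still make progress on a 50-dimensional problem whose objective effectively lives in a low-dimensional subspace; the project's solvers use far more refined versions of this principle.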

We integrated into our solvers various known, and some new, techniques for handling bound constraints, integer constraints, and linear constraints, selecting the best from the many variants tried. To facilitate the testing of our algorithms and their comparison with those of others, we extended our recent optimization test environment to the constrained case. The released software includes the test environment and the new solver suite, and the associated publications demonstrate both the theoretical efficiency and the practical state-of-the-art performance compared with other solvers. We also implemented a preliminary (not yet released) self-tuning environment that applies the new solvers to optimize their own parameters and available choices.


Project funding

This project was funded by a grant of the Austrian Science Fund FWF under contract number P 34317-N.

The project started in June 2021 and ended in December 2023.

Details of the achievements, the associated publications, and links to the software created can be found below, or in the Final Report of the project.

Supported by the budget were:

  • Dr. Morteza Kimiaei from June 2021 until June 2023 (full time),
  • the master students Roman Kostal and Maximilian Stollmayer from August 2021 until February 2022 (both 15 hours/week),
  • Mag. Romana Jezek from June 2022 until June 2023 (20 hours/week).


    Project results


    Algorithms designed

    We designed several new algorithms for noisy derivative-free optimization under various conditions:

  • noiseless unconstrained [14],
  • noisy unconstrained [1,3,8],
  • unconstrained matrix evolution strategy for problems with very large noise [10],
  • linearly constrained [12],
  • integer bound-constrained [9],
  • mixed integer bound-constrained [11],
  • subspace methods for nonlinear systems [4] and nonlinear least squares [7].

    We also upgraded several of our other optimization algorithms [2,13,15,17,18].


    Software designed

    We designed

  • a software package TTSopt [6] for testing and tuning optimization software. The package has several consistently interacting parts whose details and interplay are still under development:
      • TEsolv, a package with wrappers for 53 derivative-free solvers for unconstrained, bound-constrained, and generally constrained optimization problems, with or without discrete constraints. This includes our own solvers VRBBO, VRDFON, SSDFO, IMATRS, MATRS, and MADFO, developed or upgraded during the project and described below.
      • TEprob, a package with interfaces for 18 well-known collections of optimization problems with continuous, integer, and mixed-integer variables and various kinds of constraints.
      • TEnvOpt [5], a testing environment for comparing a configurable set of solvers on a configurable set of test problems, automatically producing a graphical and tabular analysis of the results.
      • TUopt, a tuning environment for the optimal selection of the tuning parameters of a solver, using a new machine-learning-based tuning strategy [16]. In its final stage, the tuning environment will allow for a fairly complex structure of the tuning parameters, including categorical parameters and case-dependent dimensionality.

    All our solvers were tested and tuned to ensure that they are competitive with, or even better than, the best noncommercial solvers available. Some of our solvers exist in a reentrant version that allows the solution of optimization problems whose function values may be computed by software on different platforms, or measured by performing experiments or external simulations, with communication through file updates.

    Improving the TTSopt package (which is the union of the above four packages) and the tuning and benchmarking of competitive solvers will continue after the end of the project.

  • We also designed a much improved Goldstein line search [17] and applied it first to a new conjugate gradient method [18] and then to a bound-constrained solver [15] for noiseless optimization.
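    The file-based communication used by the reentrant solver versions can be pictured with a minimal sketch (our own illustration with made-up file names request.json and response.json and a JSON format; the actual solvers use their own conventions): the solver writes a trial point to a file and polls for the answer, while the evaluator, possibly on another platform, reads the point, computes the score by software, experiment, or simulation, and writes it back.

```python
import json, os, time

REQ, RES = "request.json", "response.json"   # hypothetical file names

def solver_request(x):
    """Solver side: write the trial point, then wait for the function value."""
    with open(REQ, "w") as fh:
        json.dump({"x": list(x)}, fh)
    while not os.path.exists(RES):           # poll until the evaluator answers
        time.sleep(0.01)
    with open(RES) as fh:
        fx = json.load(fh)["f"]
    os.remove(RES)                           # consume the answer
    return fx

def evaluator_step(f):
    """Evaluator side (possibly another machine): answer one pending request."""
    with open(REQ) as fh:
        x = json.load(fh)["x"]
    with open(RES, "w") as fh:
        json.dump({"f": f(x)}, fh)
    os.remove(REQ)                           # mark the request as handled
```

In a real deployment the two sides run as separate processes sharing a directory; the decoupling is what lets function values come from arbitrary external sources.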


    References

    [1] A. Brilli, M. Kimiaei, G. Liuzzi and S. Lucidi, Worst case complexity bounds for linesearch-type derivative-free algorithms, Journal of Optimization Theory and Applications, under revision (2024).

    [2] M. Kimiaei, An active set trust region method for bound constrained optimization, Bulletin of the Iranian Mathematical Society 48 (2022), 1721--1745.

    [3] M. Kimiaei, An improved randomized algorithm with noise level tuning for large-scale noisy unconstrained DFO problems, Manuscript, submitted (2023).

    [4] M. Kimiaei, A.H. Ibrahim and S. Ghaderi, A subspace inertial method for derivative-free nonlinear monotone equations, Optimization, published online (2023).

    [5] M. Kimiaei, R. Jezek, A. Neumaier and B. Azmi, TEnvOpt -- a test environment for optimization, Manuscript, in preparation (2024).

    [6] M. Kimiaei, R. Jezek, A. Neumaier, R. Kostal and M. Stollmayer. TTSopt -- Testing and tuning software for optimization, Manuscript, in preparation (2024).

    [7] M. Kimiaei and A. Neumaier, A limited memory method for unconstrained nonlinear least squares, Soft Computing 26 (2022), 465--490.

    [8] M. Kimiaei and A. Neumaier, Efficient unconstrained black box optimization, Mathematical Programming Computation, 14 (2022), 271--318.

    [9] M. Kimiaei and A. Neumaier, Efficient composite heuristics for integer bound constrained noisy optimization, Unpublished manuscript (2022).

    [10] M. Kimiaei and A. Neumaier, Effective matrix adaptation strategy for noisy derivative-free optimization, Mathematical Programming Computation, under revision (2024).

    [11] M. Kimiaei and A. Neumaier, Heuristic methods for noisy derivative-free bound-constrained mixed-integer optimization, Mathematical Programming Computation, under revision (2024).

    [12] M. Kimiaei and A. Neumaier, A feasible point method for derivative-free noisy linearly constrained optimization, Manuscript, in preparation (2024).

    [13] M. Kimiaei, A. Neumaier and B. Azmi, LMBOPT - a limited memory method for bound-constrained optimization, Mathematical Programming Computation 14 (2022), 271--318.

    [14] M. Kimiaei, A. Neumaier and P. Faramarzi, New subspace method for unconstrained derivative-free optimization, ACM Transactions on Mathematical Software 49 (2023), 1--25.

    [15] A. Neumaier, B. Azmi and M. Kimiaei, An active set method for bound-constrained optimization, Optimization Methods and Software, published online (2024).

    [16] A. Neumaier and R. Jezek, Tuning optimization solvers, Manuscript, in preparation (2024).

    [17] A. Neumaier and M. Kimiaei, An improvement of the Goldstein line search, Optimization Letters, published online (2024).

    [18] A. Neumaier, M. Kimiaei and B. Azmi, Globally linearly convergent nonlinear conjugate gradients without Wolfe line search, Numerical Algorithms, published online (2024).


    Software available

    The following derivative-free solvers for various kinds of optimization problems were finished or upgraded during the project, and are publicly available.

    VRBBO: Vienna Randomized Black Box Optimization (unconstrained)

    VRDFON: Vienna Randomized Noisy Derivative-Free Optimization (unconstrained)

    SSDFO: Subspace Derivative-Free Optimization (unconstrained)

    IMATRS: Integer Matrix Adaptation Trust Region Derivative-Free Optimization (pure-integer bound-constrained)

    MATRS: Matrix Adaptation Trust Region Derivative-Free Optimization (mixed-integer bound-constrained)

    MADFO: Matrix Adaptation Derivative-Free Optimization (unconstrained)


    In addition, we made the software package

    TEnvOpt: Test Environment for Optimization

    available, which contains our test environment together with the solver collection TEsolv and the problem collection TEprob. This release, however, does not yet contain the automatic tuning facilities that are still under development.


    Other relevant information

    Software by my research group

    Software for local optimization


    my home page (http://arnold-neumaier.at/)

    Arnold Neumaier (Arnold.Neumaier@univie.ac.at)