Boris S. Mordukhovich
Derivative-free methods in nonconvex optimization


This paper discusses new directions and results, obtained in joint work with Pham Duy Khanh (Ho Chi Minh City University of Education, Vietnam) and Dat Ba Tran (Rowan University, USA), for models of derivative-free optimization (DFO) with nonconvex data. We overview several approaches to DFO problems and focus on finite-difference approximation schemes. Our algorithms address two major classes of objective functions: those with globally Lipschitzian gradients and those with locally Lipschitzian gradients. Global convergence results with constructive convergence rates are established for both classes in noiseless and noisy environments. In the noiseless case, the developed algorithms are based on backtracking line search and achieve fundamental convergence properties. The noisy case is substantially more involved, being based on a novel dynamic-step line search. Numerical experiments demonstrate higher robustness of the proposed algorithms compared with other finite-difference-based schemes.
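The combination mentioned above, finite-difference gradient approximation with a backtracking line search, can be illustrated by the following minimal sketch. This is not the authors' algorithm from the paper; it is a generic forward-difference descent scheme with the standard Armijo sufficient-decrease condition, and all parameter values (step `h`, backtracking factor `beta`, Armijo constant `c`) are illustrative assumptions.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    # Forward-difference approximation of the gradient of f at x:
    # g_i ≈ (f(x + h e_i) - f(x)) / h for each coordinate direction e_i.
    g = np.zeros_like(x, dtype=float)
    fx = f(x)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def dfo_backtracking(f, x0, max_iter=200, h=1e-6,
                     t0=1.0, beta=0.5, c=1e-4, tol=1e-5):
    # Derivative-free descent: finite-difference gradients combined
    # with a backtracking (Armijo) line search on the stepsize t.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = fd_gradient(f, x, h)
        if np.linalg.norm(g) < tol:
            break
        t = t0
        # Shrink t until the Armijo sufficient-decrease condition holds.
        while f(x - t * g) > f(x) - c * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Usage on a smooth test function (illustrative only).
f = lambda z: (z[0] - 1.0)**2 + (z[1] + 2.0)**2
x_star = dfo_backtracking(f, [0.0, 0.0])
```

Note that the finite-difference step `h` caps the attainable gradient accuracy (the forward-difference error scales with `h` for functions with Lipschitzian gradients), which is precisely why the noisy setting requires the more delicate stepsize control discussed in the paper.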

Keywords: Derivative-free optimization, Nonconvex smooth objective functions, Finite differences, Convergent algorithms, Convergence rates

DOI: https://doi.org/10.54381/icp.2025.2.01
Institute of Control Systems of the Ministry of Science and Education of the Republic of Azerbaijan