TY - GEN
T1 - Convergence of Anisotropic Consensus-Based Optimization in Mean-Field Law
AU - Fornasier, Massimo
AU - Klock, Timo
AU - Riedl, Konstantin
N1 - Publisher Copyright:
© 2022, Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
AB - In this paper we study anisotropic consensus-based optimization (CBO), a population-based metaheuristic derivative-free optimization method capable of globally minimizing nonconvex and nonsmooth functions in high dimensions. CBO is based on stochastic swarm intelligence and inspired by consensus dynamics and opinion formation. Compared to other metaheuristic algorithms such as Particle Swarm Optimization, CBO is simpler in nature and therefore more amenable to theoretical analysis. By adapting a recently established proof technique, we show that anisotropic CBO converges globally with a dimension-independent rate for a rich class of objective functions under minimal assumptions on the initialization of the method. Moreover, the proof technique reveals that CBO performs a convexification of the optimization problem as the number of particles goes to infinity, thus providing insight into the internal CBO mechanisms responsible for the success of the method. To motivate anisotropic CBO from a practical perspective, we further test the method on a complicated high-dimensional benchmark problem, which is well understood in the machine learning literature.
KW - Anisotropy
KW - Consensus-based optimization
KW - High-dimensional global optimization
KW - Mean-field limit
KW - Metaheuristics
UR - https://www.scopus.com/pages/publications/85129265035
U2 - 10.1007/978-3-031-02462-7_46
DO - 10.1007/978-3-031-02462-7_46
M3 - Conference contribution
AN - SCOPUS:85129265035
SN - 9783031024610
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 738
EP - 754
BT - Applications of Evolutionary Computation - 25th European Conference, EvoApplications 2022, Held as Part of EvoStar 2022, Proceedings
A2 - Jiménez Laredo, Juan Luis
A2 - Hidalgo, J. Ignacio
A2 - Babaagba, Kehinde Oluwatoyin
PB - Springer Science and Business Media Deutschland GmbH
T2 - 25th European Conference on the Applications of Evolutionary Computation, EvoApplications 2022
Y2 - 20 April 2022 through 22 April 2022
ER -