CGRS — An advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method

Christian Gnandt, Rainer Callies

Research output: Contribution to journal › Article › peer-review


Abstract

A new hybrid method for global optimization of continuous functions is proposed. It combines an extended random search method with a descent method. Random search is used as the global search strategy. A newly developed distribution-based region control exploits already detected local minima to refine this search strategy; the approach resembles classical step size control in deterministic optimization. The descent method is embedded as a local search strategy for the detection of local minima. A special realization of this approach, called CGRS, is presented in this paper. In CGRS the conjugate gradient method serves as the descent method. The proof of global convergence in probability for CGRS is given and extended to other descent methods used in the hybrid optimization approach. In order to demonstrate the numerical properties of the approach, test sets of multidimensional non-convex optimization problems are solved and the results are compared to well-established hybrid methods for global optimization. The new algorithm shows a high success rate with good and adjustable solution precision. Parameter tuning is not necessary, but of course possible. The new method proves to be efficient in terms of computational costs.
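The hybrid idea described in the abstract can be illustrated with a minimal sketch: sample random start points over a box, refine each with a nonlinear conjugate gradient descent, and keep the best local minimum found. This is only an assumed illustration of the general random-search-plus-CG scheme, not the authors' CGRS algorithm; in particular, the distribution-based region control and all function/parameter names below (`hybrid_global_min`, `cg_descent`, `samples`) are hypothetical, and the gradient is approximated numerically for self-containment.

```python
import random
import math

def num_grad(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def cg_descent(f, x0, iters=100, tol=1e-10):
    """Nonlinear conjugate gradient (Fletcher-Reeves) with backtracking."""
    x = list(x0)
    g = num_grad(f, x)
    d = [-gi for gi in g]  # initial direction: steepest descent
    for _ in range(iters):
        gg = sum(gi * gi for gi in g)
        if gg < tol:
            break
        # Backtracking line search: halve the step until f decreases.
        t, fx = 1.0, f(x)
        while f([xi + t * di for xi, di in zip(x, d)]) >= fx and t > 1e-12:
            t *= 0.5
        if t <= 1e-12:
            d = [-gi for gi in g]  # restart with steepest descent
            continue
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = num_grad(f, x)
        beta = sum(gi * gi for gi in g_new) / gg  # Fletcher-Reeves update
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x, f(x)

def hybrid_global_min(f, bounds, samples=50, seed=0):
    """Random search over the box `bounds`, refining each sample with CG."""
    rng = random.Random(seed)
    best_x, best_f = None, math.inf
    for _ in range(samples):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x, fx = cg_descent(f, x0)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: a non-convex 2-D test function (Himmelblau) with four global minima.
f = lambda x: (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2
x_best, f_best = hybrid_global_min(f, [(-5, 5), (-5, 5)])
```

The restart pattern (falling back to steepest descent when the line search stalls) is a common safeguard for Fletcher-Reeves; the actual paper additionally steers where new random samples are drawn via its distribution-based region control, which this sketch omits.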

Original language: English
Pages (from-to): 99-115
Number of pages: 17
Journal: Journal of Computational and Applied Mathematics
Volume: 333
DOIs
State: Published - 1 May 2018

Keywords

  • Conjugate gradient method
  • Convergence in probability
  • Distribution-based region control
  • Global optimization
  • Hybrid approach
  • Random search

