TY - JOUR
T1 - Newton-based stochastic optimization using q-Gaussian smoothed functional algorithms
AU - Ghoshdastidar, Debarghya
AU - Dukkipati, Ambedkar
AU - Bhatnagar, Shalabh
N1 - Publisher Copyright:
© 2014 Elsevier Ltd. All rights reserved.
PY - 2014/10/1
Y1 - 2014/10/1
N2 - We present the first q-Gaussian smoothed functional (SF) estimator of the Hessian and the first Newton-based stochastic optimization algorithm that estimates both the Hessian and the gradient of the objective function using q-Gaussian perturbations. Our algorithm requires only two system simulations (regardless of the parameter dimension) and estimates both the gradient and the Hessian at each update epoch from these two simulations alone. We also present a proof of convergence of the proposed algorithm. In a related recent work (Ghoshdastidar, Dukkipati, Bhatnagar, 2014), we presented gradient SF algorithms based on the q-Gaussian perturbations. Our work extends prior work on SF algorithms by generalizing the class of perturbation distributions, as most distributions reported in the literature for which SF algorithms are known to work turn out to be special cases of the q-Gaussian distribution. Besides studying the convergence properties of our algorithm analytically, we also show the results of numerical simulations on a model of a queuing network that illustrate the significance of the proposed method. In particular, we observe that our algorithm performs better in most cases, over a wide range of q-values, in comparison to Newton SF algorithms with the Gaussian and Cauchy perturbations, as well as the gradient q-Gaussian SF algorithms.
KW - Hessian estimate
KW - Stochastic optimization
KW - Two-timescale algorithms
KW - q-Gaussian perturbations
UR - http://www.scopus.com/inward/record.url?scp=84908135892&partnerID=8YFLogxK
U2 - 10.1016/j.automatica.2014.08.021
DO - 10.1016/j.automatica.2014.08.021
M3 - Article
AN - SCOPUS:84908135892
SN - 0005-1098
VL - 50
SP - 2606
EP - 2614
JO - Automatica
JF - Automatica
IS - 10
ER -