TY - GEN
T1 - A semismooth Newton method for adaptive distributed sparse linear regression
AU - Shutin, Dmitriy
AU - Vexler, Boris
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015
Y1 - 2015
N2 - The presented work studies the application of a technique known as the semismooth Newton (SSN) method to accelerate the convergence of distributed quadratic programming LASSO (DQP-LASSO), a consensus-based distributed sparse linear regression algorithm. The DQP-LASSO algorithm exploits the alternating direction method of multipliers (ADMM) to reduce a global LASSO problem to a series of local (per-agent) LASSO optimizations, whose outcomes are then appropriately combined. The SSN algorithm enjoys superlinear convergence and thus permits implementing these local optimizations more efficiently. Yet in some cases SSN might experience convergence issues. Here it is shown that the ADMM-inherent regularization also provides sufficient regularization to stabilize the SSN algorithm, thus ensuring stable convergence of the whole scheme. Additionally, the structure of the SSN algorithm permits an adaptive implementation of distributed sparse regression. This allows for the estimation of time-varying sparse vectors and reduces storage requirements when processing streams of data.
AB - The presented work studies the application of a technique known as the semismooth Newton (SSN) method to accelerate the convergence of distributed quadratic programming LASSO (DQP-LASSO), a consensus-based distributed sparse linear regression algorithm. The DQP-LASSO algorithm exploits the alternating direction method of multipliers (ADMM) to reduce a global LASSO problem to a series of local (per-agent) LASSO optimizations, whose outcomes are then appropriately combined. The SSN algorithm enjoys superlinear convergence and thus permits implementing these local optimizations more efficiently. Yet in some cases SSN might experience convergence issues. Here it is shown that the ADMM-inherent regularization also provides sufficient regularization to stabilize the SSN algorithm, thus ensuring stable convergence of the whole scheme. Additionally, the structure of the SSN algorithm permits an adaptive implementation of distributed sparse regression. This allows for the estimation of time-varying sparse vectors and reduces storage requirements when processing streams of data.
UR - http://www.scopus.com/inward/record.url?scp=84963850968&partnerID=8YFLogxK
U2 - 10.1109/CAMSAP.2015.7383829
DO - 10.1109/CAMSAP.2015.7383829
M3 - Conference contribution
AN - SCOPUS:84963850968
T3 - 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2015
SP - 433
EP - 436
BT - 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2015
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 6th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2015
Y2 - 13 December 2015 through 16 December 2015
ER -