TY - GEN
T1 - Conditional gradient methods via stochastic path-integrated differential estimator
AU - Yurtsever, Alp
AU - Sra, Suvrit
AU - Cevher, Volkan
N1 - Publisher Copyright:
Copyright © 2019 International Machine Learning Society (IMLS)
PY - 2019
Y1 - 2019
AB - We propose a class of novel variance-reduced stochastic conditional gradient methods. By adopting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) for the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS.
UR - http://www.scopus.com/inward/record.url?scp=85078201001&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85078201001
T3 - 36th International Conference on Machine Learning, ICML 2019
SP - 12640
EP - 12662
BT - 36th International Conference on Machine Learning, ICML 2019
PB - International Machine Learning Society (IMLS)
T2 - 36th International Conference on Machine Learning, ICML 2019
Y2 - 9 June 2019 through 15 June 2019
ER -