Stochastic Methods for Alternating Direction Method of Multipliers
Speaker: Taiji Suzuki, Tokyo Institute of Technology (http://www.is.titech.ac.jp/~s-taiji/)
Time: 2:00 PM
Date: Friday, February 28th, 2014
Location: Seminar Room L503, 5th floor, Library Building
Abstract:
In this talk, we present new stochastic optimization methods applicable to a wide range of structured regularizations. Structured regularization is a useful statistical tool for dealing with complicated data structures such as group sparsity, graphical sparsity, and low-rank tensor structure. The proposed methods are based on stochastic optimization techniques and the Alternating Direction Method of Multipliers (ADMM). ADMM is a general framework for optimizing a composite function and has a wide range of applications. We propose two stochastic variants of ADMM, corresponding to (a) online stochastic optimization and (b) stochastic dual coordinate descent. Both methods require only one or a few sample observations at each iteration and are therefore suitable for large-scale data analysis. We show that the online-type method (a) achieves the minimax optimal convergence rate in the online setting, and that the stochastic dual coordinate descent type method (b) achieves an exponential convergence rate in the batch setting.
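To make the ADMM framework mentioned in the abstract concrete, here is a minimal batch ADMM sketch for a standard composite problem, the lasso: minimize 0.5*||Ax - b||^2 + lam*||x||_1. This is a generic textbook-style illustration of how ADMM splits the smooth loss and the nonsmooth regularizer, not the speaker's stochastic variants (which would replace the full-data x-update with updates using one or a few samples); the function and parameter names are the author's own for illustration.

```python
import numpy as np

def soft_threshold(v, kappa):
    # Proximal operator of kappa * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Batch ADMM for the lasso: min 0.5*||Ax - b||^2 + lam*||x||_1.

    Splits the objective as f(x) + g(z) with the constraint x = z,
    where f is the smooth least-squares loss and g the L1 penalty.
    """
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor (A^T A + rho*I) once; reused at every x-update.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(n_iter):
        # x-update: minimize the smooth part plus the quadratic penalty.
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal step on the L1 regularizer.
        z = soft_threshold(x + u, lam / rho)
        # Dual ascent step on the consensus constraint x = z.
        u += x - z
    return z
```

With A set to the identity, the lasso solution reduces to elementwise soft-thresholding of b, which gives a quick sanity check that the iterates converge to the right point.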
Series: Statistics Seminar Series