Convergence of the Fast Subspace Descent (FASD) Method for Convex Optimization Problems

Tuesday, October 1, 2019 - 10:00 to 10:45

Thackeray Hall 427

Speaker Information
Steven Wise
University of Tennessee, Knoxville

Abstract or Additional Information

Nonlinear multi-level methods, such as the full approximation storage (FAS) multigrid scheme, are widely used solvers for nonlinear problems. In this talk, a new framework to devise and analyze FAS-type methods for non-quadratic convex optimization problems is developed. Our Fast Subspace Descent (FASD) method, which generalizes the classical FAS method, can be viewed as an inexact version of a nonlinear multigrid method based on space decomposition and subspace correction, namely the Successive Subspace Optimization (SSO) method of Jinchao XU and coauthors. FASD is quite general, allowing considerable flexibility in how the subspace corrections are computed; it is an abstraction of SSO, the preconditioned steepest descent (PSD) method, and standard FAS. We show that the local problem in each subspace can be simplified to a linear one, and that a single gradient descent iteration per subspace is enough to ensure global and linear (geometric) convergence of the FASD scheme. We will start with a slightly improved convergence result for SSO in the Hilbert space setting and then prove the global linear convergence of FASD by appropriately adapting the proof for SSO. We will describe how to apply FASD to solve nonlinear second- and fourth-order elliptic PDEs.
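To give a feel for the subspace-correction idea underlying SSO and FASD, the following is a minimal sketch, not the actual FASD algorithm from the talk: it minimizes a convex quadratic by sweeping over overlapping coordinate-block subspaces and taking a single (exact line-search) gradient descent step in each. The objective, block decomposition, and step-size rule are all illustrative assumptions.

```python
import numpy as np

def subspace_descent(A, b, blocks, num_sweeps=200):
    """Successive subspace descent for f(x) = 0.5 x^T A x - b^T x, A SPD.

    `blocks` is a list of index arrays whose union covers all coordinates;
    one steepest-descent step is taken within each subspace per sweep.
    (Illustrative sketch; not the FASD construction from the talk.)
    """
    x = np.zeros(len(b))
    for _ in range(num_sweeps):
        for idx in blocks:
            # Gradient of f restricted to the current subspace.
            g = (A @ x - b)[idx]
            # Exact line search along -g within the subspace.
            Ag = A[np.ix_(idx, idx)] @ g
            denom = g @ Ag
            if denom > 0.0:
                x[idx] -= (g @ g) / denom * g
    return x

# Example: 1D discrete Laplacian split into two overlapping blocks.
n = 8
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
blocks = [np.arange(0, 5), np.arange(3, 8)]
x = subspace_descent(A, b, blocks)
residual = np.linalg.norm(A @ x - b)  # should be near zero
```

Even though each local step is only a single gradient descent iteration rather than an exact subspace solve, cycling over subspaces that together span the whole space yields geometric convergence, which is the phenomenon the FASD analysis makes precise for general non-quadratic convex objectives.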

This is joint work with Long CHEN and Xiaozhe HU. The (work-in-progress) application to the 4th-order problem is also joint work with Long CHEN, Abner SALGADO and our graduate student, Jeahyun PARK.

Research Area