Abstract
In this paper, we propose an adaptively extrapolated proximal gradient method, which is based on the accelerated proximal gradient method (also known as FISTA); however, we locally optimize the extrapolation parameter by carrying out an exact (or inexact) line search. It turns out that in some situations, the proposed algorithm is equivalent to a class of SR1 (identity minus rank 1) proximal quasi-Newton methods. Convergence is proved in a general nonconvex setting, and hence, as a byproduct, we also obtain new convergence guarantees for proximal quasi-Newton methods. The efficiency of the new method is shown in numerical experiments on a sparsity-regularized nonlinear inverse problem.
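To make the idea in the abstract concrete, here is a minimal, illustrative Python sketch of forward-backward splitting in which the extrapolation parameter is picked by a simple inexact line search over a grid of candidate values. The function names, the grid-search strategy, and the toy problem are assumptions for illustration only; they do not reproduce the authors' exact scheme or its SR1 proximal quasi-Newton interpretation.

```python
import numpy as np

def adaptive_extrapolated_prox_grad(x0, f, grad_f, prox_g, g, L,
                                    betas=np.linspace(0.0, 1.0, 21),
                                    max_iter=200):
    """Proximal gradient method with the extrapolation parameter chosen by a
    crude grid search (a stand-in for an exact or inexact line search)."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(max_iter):
        best_val, best_x = np.inf, None
        for beta in betas:
            y = x + beta * (x - x_prev)              # extrapolated point
            z = prox_g(y - grad_f(y) / L, 1.0 / L)   # forward-backward step at y
            val = f(z) + g(z)
            if val < best_val:
                best_val, best_x = val, z
        x_prev, x = x, best_x
    return x

# Toy usage (hypothetical problem): sparse least squares with
# f(x) = 0.5*||A x - b||^2 and g(x) = lam*||x||_1.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
L = np.linalg.norm(A, 2) ** 2                        # Lipschitz constant of grad f
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad_f = lambda x: A.T @ (A @ x - b)
g = lambda x: lam * np.linalg.norm(x, 1)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)  # soft thresholding
x_hat = adaptive_extrapolated_prox_grad(np.zeros(100), f, grad_f, prox_g, g, L)
```

Since the grid contains beta = 0, each iteration is at least as good (in objective value) as a plain proximal gradient step from the current iterate; a finer or genuinely exact line search over the extrapolation parameter would refine this further.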
| Original language | English |
|---|---|
| Pages (from-to) | 2482-2503 |
| Number of pages | 22 |
| Journal | SIAM Journal on Optimization |
| Volume | 29 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 1 Jan 2019 |
Keywords
- FISTA
- Proximal algorithm
- Proximal quasi-Newton
- SR1 quasi-Newton
ASJC Scopus subject areas
- Software
- Theoretical Computer Science