Abstract
We analyze a "piggyback"-style method for computing the derivative of a loss that depends on the solution of a convex-concave saddle-point problem, with respect to the bilinear term. We attempt to derive guarantees for the algorithm under minimal regularity assumptions on the functions. Our final convergence results include possibly nonsmooth objectives. We illustrate the versatility of the proposed piggyback algorithm by learning optimized shearlet transforms, a popular class of sparsifying transforms in imaging.
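To make the idea concrete, the following is a minimal sketch of piggyback differentiation in forward mode: alongside the primal-dual iterates of a solver, one propagates their derivatives along a perturbation direction of the bilinear term. The toy problem (min_x max_y 0.5||x - a||² + ⟨y, Kx⟩ - 0.5||y - b||²), the plain gradient descent-ascent solver, the step size, and the loss L(K) = 0.5||x*(K)||² are illustrative assumptions, not the exact setting or algorithm of the paper.

```python
import numpy as np

def piggyback_directional_derivative(K, dK, a, b, tau=0.05, iters=5000):
    """Solve the toy saddle-point problem by gradient descent-ascent and
    simultaneously propagate the derivative of the iterates along dK
    (forward-mode piggyback). Returns the approximate solution x* and
    the directional derivative of L(K) = 0.5 ||x*(K)||^2 along dK."""
    m, n = K.shape
    x, y = np.zeros(n), np.zeros(m)
    dx, dy = np.zeros(n), np.zeros(m)  # derivatives of the iterates w.r.t. K along dK
    for _ in range(iters):
        # gradient descent-ascent step on the saddle-point iterates
        gx = x - a + K.T @ y
        gy = K @ x - y + b
        # piggyback step: the update map differentiated w.r.t. K along dK
        dgx = dx + dK.T @ y + K.T @ dy
        dgy = dK @ x + K @ dx - dy
        x, y = x - tau * gx, y + tau * gy
        dx, dy = dx - tau * dgx, dy + tau * dgy
    # chain rule: dL = <x*, dx*> for L = 0.5 ||x*||^2
    return x, float(x @ dx)
```

Since this toy problem is strongly convex-concave with a closed-form solution x*(K) = (I + KᵀK)⁻¹(a - Kᵀb), the piggyback directional derivative can be checked against a finite difference of the exact loss.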
Original language | English
---|---
Pages (from - to) | 1003-1030
Journal | SIAM Journal on Mathematics of Data Science
Volume | 4
Issue number | 3
DOIs |
Publication status | Published - 2022