Abstract

We propose and analyze an accelerated iterative dual diagonal descent algorithm for solving linear inverse problems with strongly convex regularization and general data-fit functions. We develop an inertial approach and analyze both its convergence and stability properties. Using tools from inexact proximal calculus, we prove early stopping results with optimal convergence rates for additive data terms, and we further consider more general cases, such as the Kullback--Leibler divergence, for which different types of proximal point approximations hold.
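The abstract's central mechanism, using the iteration count as the regularization parameter and stopping early, can be illustrated on a toy problem. The sketch below is not the paper's dual diagonal descent scheme; it is a generic Nesterov-accelerated gradient (Landweber-type) iteration on a least-squares data fit for an ill-conditioned operator, with problem size, spectrum, and noise level chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ill-conditioned linear inverse problem: recover x_true from y = A @ x_true + noise.
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(np.logspace(0, -6, n)) @ U.T   # rapidly decaying spectrum
x_true = rng.standard_normal(n)
y = A @ x_true + 1e-3 * rng.standard_normal(n)

# Nesterov/FISTA-type accelerated gradient descent on the least-squares data fit;
# no explicit penalty is added, so the iteration counter plays the role of the
# regularization parameter.
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x = z = np.zeros(n)
t = 1.0
errors = []
for k in range(200):
    grad = A.T @ (A @ z - y)
    x_new = z - grad / L
    t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
    z = x_new + ((t - 1) / t_new) * (x_new - x)   # inertial extrapolation step
    x, t = x_new, t_new
    errors.append(np.linalg.norm(x - x_true))

# Semiconvergence: the reconstruction error typically dips and later grows as the
# iterates start fitting the noise, so the best stopping index lies in between.
k_best = int(np.argmin(errors))
```

Early stopping then means returning the iterate at (an estimate of) `k_best` rather than running to convergence; acceleration matters because it reaches the low-error regime in fewer iterations.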

Keywords

  1. iterative regularization
  2. duality
  3. acceleration
  4. forward-backward splitting
  5. diagonal methods
  6. stability and convergence analysis

MSC codes

  1. 90C25
  2. 49N45
  3. 49N15
  4. 68U10
  5. 90C06

References

1.
F. Alvarez, On the minimizing property of a second order dissipative system in Hilbert spaces, SIAM J. Control Optim., 38 (2000), pp. 1102--1119.
2.
V. Apidopoulos, J.-F. Aujol, and C. Dossal, Convergence rate of inertial forward--backward algorithm beyond Nesterov's rule, Math. Program., 180 (2020), pp. 137--156.
3.
V. Apidopoulos, J.-F. Aujol, and C. Dossal, The differential inclusion modeling FISTA algorithm and optimality of convergence rate in the case $b \leq 3$, SIAM J. Optim., 28 (2018), pp. 551--574.
4.
A. Aravkin, J. Burke, D. Drusvyatskiy, M. Friedlander, and S. Roy, Level-set methods for convex optimization, Math. Program., 174 (2019), pp. 359--390.
5.
H. Attouch, Viscosity solutions of minimization problems, SIAM J. Optim., 6 (1996), pp. 769--806.
6.
H. Attouch, A. Cabot, Z. Chbani, and H. Riahi, Inertial forward--backward algorithms with perturbations: Application to Tikhonov regularization, J. Optim. Theory Appl., 179 (2018), pp. 1--36.
7.
H. Attouch, A. Cabot, and M.-O. Czarnecki, Asymptotic behavior of nonautonomous monotone and subgradient evolution equations, Trans. Amer. Math. Soc., 370 (2018), pp. 755--790.
8.
H. Attouch, Z. Chbani, J. Peypouquet, and P. Redont, Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity, Math. Program., 168 (2018), pp. 123--175.
9.
H. Attouch, Z. Chbani, and H. Riahi, Combining fast inertial dynamics for convex optimization with Tikhonov regularization, J. Math. Anal. Appl., 457 (2018), pp. 1065--1094.
10.
H. Attouch, Z. Chbani, and H. Riahi, Rate of convergence of the Nesterov accelerated gradient method in the subcritical case $\alpha\leq 3$, ESAIM Control Optim. Calc. Var., 25 (2019).
11.
H. Attouch and R. Cominetti, A dynamical approach to convex minimization coupling approximation with the steepest descent method, J. Differential Equations, 128 (1996), pp. 519--540.
12.
H. Attouch, M. Czarnecki, and J. Peypouquet, Coupling forward-backward with penalty schemes and parallel splitting for constrained variational inequalities, SIAM J. Optim., 21 (2011), pp. 1251--1274.
13.
H. Attouch and M.-O. Czarnecki, Asymptotic behavior of coupled dynamical systems with multiscale aspects, J. Differential Equations, 248 (2010), pp. 1315--1344.
14.
H. Attouch and M.-O. Czarnecki, Asymptotic behavior of gradient-like dynamical systems involving inertia and multiscale aspects, J. Differential Equations, 262 (2017), pp. 2745--2770.
15.
H. Attouch and J. Peypouquet, The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than $1/k^2$, SIAM J. Optim., 26 (2016), pp. 1824--1834.
16.
H. Attouch, J. Peypouquet, and P. Redont, A dynamical approach to an inertial forward-backward algorithm for convex minimization, SIAM J. Optim., 24 (2014), pp. 232--256.
17.
J. Aujol and C. Dossal, Stability of over-relaxations for the forward-backward algorithm, application to FISTA, SIAM J. Optim., 25 (2015), pp. 2408--2433.
18.
F. R. Bach, Exploring large feature spaces with hierarchical multiple kernel learning, in Advances in Neural Information Processing Systems, 2009, pp. 105--112.
19.
M. Bachmayr and M. Burger, Iterative total variation schemes for nonlinear inverse problems, Inverse Problems, 25 (2009).
20.
M. A. Bahraoui and B. Lemaire, Convergence of diagonally stationary sequences in convex optimization, Set-Valued Anal., 2 (1994), pp. 49--61.
21.
A. B. Bakushinsky and M. Y. Kokurin, Iterative Methods for Approximate Solution of Inverse Problems, Math. Appl. 577, Springer, New York, 2005.
22.
H. Bauschke and P. Combettes, Convex Analysis and Monotone Operator Theory in Hilbert Spaces, CMS Books in Math., Springer, New York, 2017.
23.
A. Beck and M. Teboulle, Mirror descent and nonlinear projected subgradient methods for convex optimization, Oper. Res. Lett., 31 (2003), pp. 167--175.
24.
A. Beck and M. Teboulle, A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., 2 (2009), pp. 183--202.
25.
A. Beck and M. Teboulle, A fast dual proximal gradient algorithm for convex minimization and applications, Oper. Res. Lett., 42 (2014), pp. 1--6.
26.
M. Benning and M. Burger, Error estimates for general fidelities, Electron. Trans. Numer. Anal., 38 (2011), pp. 44--68.
27.
M. Benning and M. Burger, Modern regularization methods for inverse problems, Acta Numer., 27 (2018), pp. 1--111.
28.
E. V. D. Berg and M. P. Friedlander, Probing the Pareto frontier for basis pursuit solutions, SIAM J. Sci. Comput., 31 (2009), pp. 890--912.
29.
R. I. Boţ and T. Hein, Iterative regularization with a general penalty term---theory and application to L1 and TV regularization, Inverse Problems, 28 (2012).
30.
K. Bredies, K. Kunisch, and T. Pock, Total generalized variation, SIAM J. Imaging Sci., 3 (2010), pp. 492--526.
31.
P. Brianzi, F. Di Benedetto, and C. Estatico, Preconditioned iterative regularization in Banach spaces, Comput. Optim. Appl., 54 (2013), pp. 263--282.
32.
M. Burger and S. Osher, A guide to the TV zoo, in Level Set and PDE Based Reconstruction Methods in Imaging, Lecture Notes in Math. 2090, Springer, New York, 2013, pp. 1--70.
33.
M. Burger, E. Resmerita, and L. He, Error estimation for Bregman iterations and inverse scale space methods in image restoration, Computing, 81 (2007), pp. 109--135.
34.
A. Cabot and L. Paoli, Asymptotics for some vibro-impact problems with a linear dissipation term, J. Math. Pures Appl., 87 (2007), pp. 291--323.
35.
L. Calatroni, J. C. De Los Reyes, and C.-B. Schönlieb, Infimal convolution of data discrepancies for mixed noise removal, SIAM J. Imaging Sci., 10 (2017), pp. 1196--1233.
36.
A. Chambolle and C. Dossal, On the convergence of the iterates of the “fast iterative shrinkage/thresholding algorithm,'' J. Optim. Theory Appl., 166 (2015), pp. 968--982.
37.
A. Chambolle and P.-L. Lions, Image recovery via total variation minimization and related problems, Numer. Math., 76 (1997), pp. 167--188.
38.
P. L. Combettes and V. R. Wajs, Signal recovery by proximal forward-backward splitting, Multiscale Model. Simul., 4 (2005), pp. 1168--1200.
39.
M.-O. Czarnecki, N. Noun, and J. Peypouquet, Splitting forward-backward penalty scheme for constrained variational problems, J. Convex Anal., 23 (2016), pp. 531--565.
40.
C.-A. Deledalle, S. Vaiter, J. M. Fadili, and G. Peyré, Stein unbiased gradient estimator of the risk (SUGAR) for multiple parameter selection, SIAM J. Imaging Sci., 7 (2014), pp. 2448--2487.
41.
O. Devolder, F. Glineur, and Y. Nesterov, First-order methods of smooth convex optimization with inexact oracle, Math. Program., 146 (2014), pp. 37--75.
42.
H. W. Engl, M. Hanke, and A. Neubauer, Regularization of Inverse Problems, Math. Appl. 375, Springer, New York, 1996.
43.
G. Garrigos, L. Rosasco, and S. Villa, Iterative regularization via dual diagonal descent, J. Math. Imaging Vis., 60 (2018), pp. 189--215.
44.
M. Hintermüller and A. Langer, Subspace correction methods for a class of nonsmooth and nonadditive convex variational problems with mixed $l^1$/$l^2$ data-fidelity in image processing, SIAM J. Imaging Sci., 6 (2013), pp. 2134--2173.
45.
B. Kaltenbacher, A. Neubauer, and O. Scherzer, Iterative Regularization Methods for Nonlinear Ill-Posed Problems, De Gruyter, Berlin, 2008.
46.
B. Kaltenbacher, F. Schöpfer, and T. Schuster, Iterative methods for nonlinear ill-posed problems in Banach spaces: Convergence and applications to parameter identification problems, Inverse Problems, 25 (2009).
47.
B. Kaltenbacher and I. Tomba, Convergence rates for an iteratively regularized Newton--Landweber iteration in Banach space, Inverse Problems, 29 (2013).
48.
W. Krichene, A. Bayen, and P. L. Bartlett, Accelerated mirror descent in continuous and discrete time, in Advances in Neural Information Processing Systems, 2015, pp. 2827--2835.
49.
S. Matet, L. Rosasco, S. Villa, and B. L. Vu, Don't Relax: Early Stopping for Convex Regularization, preprint, https://arxiv.org/abs/1707.05422, 2017.
50.
Y. Nesterov, A method for solving the convex programming problem with convergence rate ${O}(1/k^2)$, Sov. Math. Dokl., 269 (1983), pp. 543--547.
51.
Y. Nesterov, Introductory Lectures on Convex Optimization, Appl. Optim. 87, Springer, New York, 2004.
52.
A. Neubauer, On Nesterov acceleration for Landweber iteration of linear ill-posed problems, J. Inverse Ill-Posed Problems, 25 (2016).
53.
L. I. Rudin, S. Osher, and E. Fatemi, Nonlinear total variation based noise removal algorithms, Phys. D, 60 (1992), pp. 259--268.
54.
S. Salzo and S. Villa, Inexact and accelerated proximal point algorithms, J. Convex Anal., 19 (2012), pp. 1167--1192.
55.
M. Schmidt, N. L. Roux, and F. R. Bach, Convergence rates of inexact proximal-gradient methods for convex optimization, in Advances in Neural Information Processing Systems 24, J. Shawe-Taylor, R. S. Zemel, P. L. Bartlett, F. Pereira, and K. Q. Weinberger, eds., Curran Associates, 2011, pp. 1458--1466.
56.
C. M. Stein, Estimation of the mean of a multivariate normal distribution, Ann. Statist., 9 (1981), pp. 1135--1151.
57.
I. Steinwart and A. Christmann, Support Vector Machines, Springer, New York, 2008.
58.
W. Su, S. Boyd, and E. J. Candès, A differential equation for modeling Nesterov's accelerated gradient method: Theory and insights, J. Mach. Learn. Res., 17 (2016), pp. 1--43.
59.
S. Villa, S. Salzo, L. Baldassarre, and A. Verri, Accelerated and inexact forward-backward algorithms, SIAM J. Optim., 23 (2013), pp. 1607--1633.
60.
M. Yuan and Y. Lin, Model selection and estimation in regression with grouped variables, J. R. Stat. Soc. Ser. B Stat. Methodol., 68 (2006), pp. 49--67.
61.
C. Zalinescu, Convex Analysis in General Vector Spaces, World Scientific, River Edge, NJ, 2002.
62.
H. Zou and T. Hastie, Regularization and variable selection via the elastic net, J. R. Stat. Soc. Ser. B Stat. Methodol., 67 (2005), pp. 301--320.


Published In

SIAM Journal on Optimization
Pages: 754 - 784
ISSN (online): 1095-7189

History

Submitted: 27 December 2019
Accepted: 29 October 2020
Published online: 1 March 2021

Funding Information

Centre National de la Recherche Scientifique https://doi.org/10.13039/501100004794
