A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
Open Journal of Mathematical Optimization, Tome 3 (2022), article no. 1, 15 p.

In this short note, we provide a simple version of an accelerated forward-backward method (a.k.a. Nesterov’s accelerated proximal gradient method) that may rely on approximate proximal operators and that can exploit strong convexity of the objective function. The method supports both relative and absolute error criteria, and its behavior is illustrated on a set of standard numerical experiments.
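For orientation, here is a minimal sketch of a classical *exact*-prox accelerated forward-backward method for minimizing F(x) = f(x) + g(x), with FISTA-style momentum when μ = 0 and the standard constant momentum (√L − √μ)/(√L + √μ) when μ > 0. This is not the paper's inexact variant (it assumes the proximal operator is computed exactly), and all names and parameter choices below are illustrative.

```python
import numpy as np

def accelerated_forward_backward(grad_f, prox_g, x0, L, mu=0.0, n_iter=200):
    """Sketch of an accelerated proximal gradient method for f + g.

    grad_f : gradient of the L-smooth (mu-strongly convex) part f
    prox_g : exact proximal operator of g, called as prox_g(point, step)
    L      : smoothness constant of f; mu : strong convexity parameter
    """
    x_prev = x = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        if mu > 0:
            # constant momentum for the strongly convex regime
            beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
        else:
            # classical FISTA momentum sequence
            t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
            beta = (t - 1) / t_next
            t = t_next
        y = x + beta * (x - x_prev)              # extrapolation step
        x_prev = x
        x = prox_g(y - grad_f(y) / L, 1.0 / L)   # forward-backward step
    return x

# Toy example: f(x) = 0.5 * ||x - b||^2 (L = mu = 1), g = lam * ||x||_1,
# whose exact minimizer is the soft-thresholding of b at level lam.
b = np.array([3.0, -0.5, 0.2])
lam = 1.0
grad_f = lambda x: x - b
prox_g = lambda v, step: np.sign(v) * np.maximum(np.abs(v) - lam * step, 0.0)
x = accelerated_forward_backward(grad_f, prox_g, np.zeros(3), L=1.0, mu=1.0)
```

The paper's contribution is precisely to replace the exact `prox_g` call above by an approximate one, controlled by absolute or relative error criteria, while keeping the accelerated rates.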

Using the same developments, we further provide a version of the accelerated hybrid proximal extragradient method of [21] that can also exploit strong convexity of the objective function.

DOI: 10.5802/ojmo.12
Barré, Mathieu 1 ; Taylor, Adrien 1 ; Bach, Francis 1

1 INRIA (Sierra project-team) – Dépt. d’informatique, Ecole normale supérieure, CNRS, PSL Research University, Paris, France
@article{OJMO_2022__3__A1_0,
     author = {Barr\'e, Mathieu and Taylor, Adrien and Bach, Francis},
     title = {A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives},
     journal = {Open Journal of Mathematical Optimization},
     eid = {1},
     pages = {1--15},
     publisher = {Universit\'e de Montpellier},
     volume = {3},
     year = {2022},
     doi = {10.5802/ojmo.12},
     language = {en},
     url = {http://www.numdam.org/articles/10.5802/ojmo.12/}
}

[1] Alves, Maicon M. Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods (2021) (https://arxiv.org/abs/2102.02045v1)

[2] Bansal, Nikhil; Gupta, Anupam Potential-function proofs for gradient methods, Theory Comput., Volume 15 (2019) no. 1, article no. 4, 32 pages

[3] Barré, Mathieu; Taylor, Adrien; Bach, Francis Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators (2020) (https://arxiv.org/abs/2006.06041v2)

[4] Barré, Mathieu; Taylor, Adrien; Bach, Francis Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators (2021) (https://arxiv.org/abs/2006.06041)

[5] Beck, Amir; Teboulle, Marc A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., Volume 2 (2009) no. 1, pp. 183-202

[6] Bello-Cruz, Yunier; Gonçalves, Max L. N.; Krislock, Nathan On Inexact Accelerated Proximal Gradient Methods with Relative Error Rules (2020) (https://arxiv.org/abs/2005.03766)

[7] Brøndsted, Arne; Rockafellar, Ralph T. On the subdifferentiability of convex functions, Proc. Am. Math. Soc., Volume 16 (1965) no. 4, pp. 605-611

[8] Burachik, Regina S.; Iusem, Alfredo N.; Svaiter, Benar F. Enlargement of monotone operators with applications to variational inequalities, Set-Valued Anal., Volume 5 (1997) no. 2, pp. 159-180

[9] Burachik, Regina S.; Sagastizábal, Claudia A.; Svaiter, Benar F. ε-Enlargements of maximal monotone operators: Theory and applications, Reformulation: nonsmooth, piecewise smooth, semismooth and smoothing methods, Springer, 1998, pp. 25-43

[10] Burachik, Regina S.; Scheimberg, Susana; Svaiter, Benar F. Robustness of the hybrid extragradient proximal-point algorithm, J. Optim. Theory Appl., Volume 111 (2001) no. 1, pp. 117-136

[11] Chambolle, Antonin; Pock, Thomas An introduction to continuous optimization for imaging, Acta Numer., Volume 25 (2016), pp. 161-319

[12] Chang, Chih-Chung; Lin, Chih-Jen LIBSVM: A library for support vector machines, ACM Trans. Intell. Syst. Technol., Volume 2 (2011), article no. 27, 37 pages (software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm)

[13] Chierchia, Giovanni; Chouzenoux, Emilie; Combettes, Patrick L.; Pesquet, Jean-Christophe The Proximity Operator Repository. User’s guide (2020) (http://proximity-operator.net/download/guide.pdf)

[14] d’Aspremont, Alexandre; Scieur, Damien; Taylor, Adrien Acceleration methods, Foundations and Trends® in Optimization, Volume 5 (2021) no. 1-2, pp. 1-245

[15] Güler, Osman New proximal point algorithms for convex minimization, SIAM J. Optim., Volume 2 (1992) no. 4, pp. 649-664

[16] Hiriart-Urruty, Jean-Baptiste; Lemaréchal, Claude Convex analysis and minimization algorithms I: Fundamentals, Grundlehren der Mathematischen Wissenschaften, 305, Springer, 2013

[17] Jenatton, Rodolphe; Mairal, Julien; Obozinski, Guillaume; Bach, Francis Proximal methods for sparse hierarchical dictionary learning, Proceedings of the 27th International Conference on Machine Learning (ICML), Association for Computing Machinery (2010), pp. 487-494

[18] Jiang, Kaifeng; Sun, Defeng; Toh, Kim-Chuan An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP, SIAM J. Optim., Volume 22 (2012) no. 3, pp. 1042-1064

[19] Mairal, Julien; Jenatton, Rodolphe; Obozinski, Guillaume; Bach, Francis Convex and network flow optimization for structured sparsity, J. Mach. Learn. Res., Volume 12 (2011), pp. 2681-2720

[20] Millán, Reinier D.; Machado, Majela P. Inexact proximal epsilon-subgradient methods for composite convex optimization problems, J. Glob. Optim., Volume 75 (2019) no. 4, pp. 1029-1060

[21] Monteiro, Renato D. C.; Svaiter, Benar F. An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods, SIAM J. Optim., Volume 23 (2013) no. 2, pp. 1092-1125

[22] Nesterov, Yurii A method of solving a convex programming problem with convergence rate O(1/k^2), Sov. Math., Dokl., Volume 27 (1983), pp. 372-376

[23] Nesterov, Yurii Introductory Lectures on Convex Optimization: A Basic Course, Applied Optimization, Kluwer Academic Publishers, 2004

[24] Nesterov, Yurii Gradient methods for minimizing composite functions, Math. Program., Volume 140 (2013) no. 1, pp. 125-161

[25] Parikh, Neal; Boyd, Stephen Proximal algorithms, Foundations and Trends® in Optimization, Volume 1 (2014) no. 3, pp. 127-239

[26] Rockafellar, Ralph T. Convex Analysis, Princeton Landmarks in Mathematics, Princeton University Press, 1997

[27] Rudin, Leonid; Osher, Stanley Total variation based image restoration with free local constraints, Proceedings of the 1st International Conference on Image Processing, IEEE (1994), pp. 31-35

[28] Rudin, Leonid; Osher, Stanley; Fatemi, Emad Nonlinear total variation based noise removal algorithms, Physica D, Volume 60 (1992) no. 1-4, pp. 259-268

[29] Ryu, Ernest K.; Boyd, Stephen Primer on monotone operator methods, Appl. Comput. Math., Volume 15 (2016) no. 1, pp. 3-43

[30] Salzo, Saverio; Villa, Silvia Inexact and Accelerated Proximal Point Algorithms, J. Convex Anal., Volume 19 (2012) no. 4, pp. 1167-1192

[31] Schmidt, Mark; Le Roux, Nicolas; Bach, Francis Convergence rates of inexact proximal-gradient methods for convex optimization, Advances in Neural Information Processing Systems (NIPS), MIT Press (2011), pp. 1458-1466

[32] Solodov, Mikhail V.; Svaiter, Benar F. A hybrid approximate extragradient–proximal point algorithm using the enlargement of a maximal monotone operator, Set-Valued Anal., Volume 7 (1999) no. 4, pp. 323-345

[33] Solodov, Mikhail V.; Svaiter, Benar F. A comparison of rates of convergence of two inexact proximal point algorithms, Nonlinear optimization and related topics, Springer, 2000, pp. 415-427

[34] Taylor, Adrien; Hendrickx, Julien M.; Glineur, François Smooth strongly convex interpolation and exact worst-case performance of first-order methods, Math. Program., Volume 161 (2017) no. 1-2, pp. 307-345

[35] Tseng, Paul On accelerated proximal gradient methods for convex-concave optimization (2008) (technical report)

[36] Villa, Silvia; Salzo, Saverio; Baldassarre, Luca; Verri, Alessandro Accelerated and inexact forward-backward algorithms, SIAM J. Optim., Volume 23 (2013) no. 3, pp. 1607-1633

[37] Wang, Yilun; Yang, Junfeng; Yin, Wotao; Zhang, Yin A new alternating minimization algorithm for total variation image reconstruction, SIAM J. Imaging Sci., Volume 1 (2008) no. 3, pp. 248-272

[38] Wilson, Ashia C.; Recht, Ben; Jordan, Michael I. A Lyapunov Analysis of Accelerated Methods in Optimization, J. Mach. Learn. Res., Volume 22 (2021) no. 113, pp. 1-34
