A new family of Dai-Liao conjugate gradient methods with modified secant equation for unconstrained optimization
RAIRO. Operations Research, Volume 55 (2021) no. 6, pp. 3281-3291

In this paper, a new family of Dai-Liao–type conjugate gradient methods is proposed for unconstrained optimization problems. In the new methods, the modified secant equation used in [H. Yabe and M. Takano, Comput. Optim. Appl. 28 (2004) 203–225] is incorporated into Dai and Liao’s conjugacy condition. Under certain assumptions, we show that our methods are globally convergent for general functions with the strong Wolfe line search. Numerical results illustrate that the proposed methods can outperform some existing ones.
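For orientation, here is a minimal sketch of the two ingredients named in the abstract, recalled from the cited works of Dai and Liao [6] and Yabe and Takano [20] rather than from this page; the notation below, and the exact form of the family proposed in the paper, are assumptions. With search direction update and standard differences
\[
d_k = -g_k + \beta_k d_{k-1}, \qquad s_{k-1} = x_k - x_{k-1}, \qquad y_{k-1} = g_k - g_{k-1},
\]
Dai and Liao's conjugacy condition \(d_k^T y_{k-1} = -t\, g_k^T s_{k-1}\) (with parameter \(t \ge 0\)) leads to
\[
\beta_k^{\mathrm{DL}} = \frac{g_k^T (y_{k-1} - t\, s_{k-1})}{d_{k-1}^T y_{k-1}}.
\]
The Yabe–Takano modified secant condition replaces \(y_{k-1}\) by
\[
z_{k-1} = y_{k-1} + \rho\, \frac{\theta_{k-1}}{s_{k-1}^T u_{k-1}}\, u_{k-1},
\qquad \theta_{k-1} = 6\,(f_{k-1} - f_k) + 3\,(g_{k-1} + g_k)^T s_{k-1},
\]
with \(\rho \ge 0\) and any vector \(u_{k-1}\) satisfying \(s_{k-1}^T u_{k-1} \ne 0\), so that the Dai-Liao–type update takes the form
\[
\beta_k = \frac{g_k^T (z_{k-1} - t\, s_{k-1})}{d_{k-1}^T z_{k-1}}.
\]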

DOI : 10.1051/ro/2021159
Classification : 65K05, 90C26, 90C30
Keywords: Conjugate gradient method, Dai-Liao–type method, modified secant equation
@article{RO_2021__55_6_3281_0,
     author = {Zheng, Yutao},
     title = {A new family of {Dai-Liao} conjugate gradient methods with modified secant equation for unconstrained optimization},
     journal = {RAIRO. Operations Research},
     pages = {3281--3291},
     year = {2021},
     publisher = {EDP-Sciences},
     volume = {55},
     number = {6},
     doi = {10.1051/ro/2021159},
     mrnumber = {4338798},
     zbl = {1483.65103},
     language = {en},
     url = {https://www.numdam.org/articles/10.1051/ro/2021159/}
}
Zheng, Yutao. A new family of Dai-Liao conjugate gradient methods with modified secant equation for unconstrained optimization. RAIRO. Operations Research, Volume 55 (2021) no. 6, pp. 3281-3291. doi: 10.1051/ro/2021159

[1] N. Andrei, An unconstrained optimization test functions collection. Adv. Model. Optim. 10 (2008) 147–161. | MR | Zbl

[2] Y. Cheng, Q. Mou, X. Pan and S. Yao, A sufficient descent conjugate gradient method and its global convergence. Optim. Methods Softw. 31 (2016) 577–590. | MR | Zbl | DOI

[3] Y. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10 (1999) 177–182. | MR | Zbl | DOI

[4] Y. Dai, Nonlinear conjugate gradient methods, in Wiley Encyclopedia of Operations Research and Management Science (2011). | DOI

[5] Y. Dai, Y. Huang and X. Liu, A family of spectral gradient methods for optimization. Comput. Optim. Appl. 74 (2019) 43–65. | MR | Zbl | DOI

[6] Y. Dai and L. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43 (2001) 87–101. | MR | Zbl | DOI

[7] E. Dolan and J. Moré, Benchmarking optimization software with performance profiles. Math. Program. 91 (2002) 201–213. | MR | Zbl | DOI

[8] R. Fletcher, Function minimization by conjugate gradients. Comput. J. 7 (1964) 149–154. | MR | Zbl | DOI

[9] J. Ford, Y. Narushima and H. Yabe, Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40 (2008) 191–216. | MR | Zbl | DOI

[10] J. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2 (1992) 21–42. | MR | Zbl | DOI

[11] N. Gould, D. Orban and Ph. Toint, CUTEst: a Constrained and Unconstrained Testing Environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60 (2015) 545–557. | MR | Zbl | DOI

[12] W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16 (2005) 170–192. | MR | Zbl | DOI

[13] W. Hager and H. Zhang, A survey of nonlinear conjugate gradient methods. Pacific J. Optim. 2 (2006) 35–58. | MR | Zbl

[14] M. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49 (1952) 409–436. | MR | Zbl | DOI

[15] Y. Huang, Y. Dai, X. Liu and H. Zhang, Gradient methods exploiting spectral properties. Optim. Methods Softw. 35 (2020) 681–705. | MR | Zbl | DOI

[16] C. Kou and Y. Dai, A modified self-scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for unconstrained optimization. J. Optim. Theory Appl. 165 (2015) 209–224. | MR | Zbl | DOI

[17] G. Li, C. Tang and Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202 (2007) 523–539. | MR | Zbl | DOI

[18] E. Polak and G. Ribiere, Note sur la convergence de méthodes de directions conjuguées. ESAIM: M2AN 3 (1969) 35–43. | MR | Zbl | Numdam

[19] B. Polyak, The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9 (1969) 94–112. | Zbl | DOI

[20] H. Yabe and M. Takano, Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28 (2004) 203–225. | MR | Zbl | DOI

[21] J. Zhang and C. Xu, Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J. Comput. Appl. Math. 137 (2001) 269–278. | MR | Zbl | DOI

[22] K. Zhang, H. Liu and Z. Liu, A new Dai-Liao conjugate gradient method with optimal parameter choice. Numer. Funct. Anal. Optim. 40 (2019) 194–215. | MR | Zbl | DOI

[23] Y. Zheng and B. Zheng, Two new Dai-Liao–type conjugate gradient methods for unconstrained optimization problems. J. Optim. Theory Appl. 175 (2017) 502–509. | MR | Zbl | DOI
