New iterative conjugate gradient method for nonlinear unconstrained optimization
RAIRO. Operations Research, Volume 56 (2022) no. 4, pp. 2315-2327

Conjugate gradient (CG) methods are an important class of methods for solving unconstrained optimization problems, especially large-scale ones, and they have been studied extensively in recent years. In this paper, we propose a new conjugate gradient method for unconstrained optimization, defined as a convex combination of the Fletcher–Reeves (FR), Polak–Ribière–Polyak (PRP) and Dai–Yuan (DY) methods. The new method, combined with the Wolfe line search, is shown to ensure the descent property of each search direction. General convergence results are also established for this method. Numerical experiments testing the efficiency of the proposed method confirm its promising potential.
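The abstract describes the method only at a high level; the paper's exact rule for choosing the convex-combination weights is not reproduced here. As an illustrative sketch only (not the authors' algorithm), the following Python code shows the general shape of such a hybrid CG iteration: a β parameter formed as a fixed convex combination of the FR, PRP and DY formulas, paired with a weak Wolfe line search. The fixed weights `theta`, the bisection line search, and the steepest-descent restart safeguard are all assumptions made for the sketch.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, alpha=1.0, max_iter=60):
    """Bisection search for a step length satisfying the (weak) Wolfe conditions."""
    fx, slope = f(x), grad(x) @ d          # slope < 0 along a descent direction
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:    # Armijo condition fails
            hi = alpha
        elif grad(x + alpha * d) @ d < c2 * slope:        # curvature condition fails
            lo = alpha
        else:
            break                                         # both Wolfe conditions hold
        alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
    return alpha

def hybrid_cg(f, grad, x0, theta=(1/3, 1/3, 1/3), tol=1e-8, max_iter=500):
    """Nonlinear CG with beta = t1*beta_FR + t2*beta_PRP + t3*beta_DY, sum(t) = 1."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_fr = (g_new @ g_new) / (g @ g)               # Fletcher-Reeves
        beta_prp = (g_new @ y) / (g @ g)                  # Polak-Ribiere-Polyak
        dy_denom = d @ y
        beta_dy = (g_new @ g_new) / dy_denom if abs(dy_denom) > 1e-16 else beta_fr  # Dai-Yuan
        beta = theta[0] * beta_fr + theta[1] * beta_prp + theta[2] * beta_dy
        d = -g_new + beta * d
        if g_new @ d >= 0:                                # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: the Rosenbrock function, a standard nonlinear test problem.
rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
x_star = hybrid_cg(rosen, rosen_grad, [-1.2, 1.0])
```

With equal weights each component formula contributes a third of β; the restart safeguard is one simple way to enforce the descent property that, per the abstract, the Wolfe line search guarantees for the authors' actual weight choice.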

DOI: 10.1051/ro/2022109
Classification: 65K05, 90C25, 90C26, 90C27, 90C30
Keywords: Unconstrained optimization, hybrid conjugate gradient method, sufficient descent, convex combination, global convergence
@article{RO_2022__56_4_2315_0,
     author = {Ben Hanachi, Sabrina and Sellami, Badreddine and Belloufi, Mohammed},
     title = {New iterative conjugate gradient method for nonlinear unconstrained optimization},
     journal = {RAIRO. Operations Research},
     pages = {2315--2327},
     year = {2022},
     publisher = {EDP-Sciences},
     volume = {56},
     number = {4},
     doi = {10.1051/ro/2022109},
     mrnumber = {4458849},
     language = {en},
     url = {https://www.numdam.org/articles/10.1051/ro/2022109/}
}

[1] A. Abubakar, P. Kumam, M. Malik, P. Chaipunya and A. Ibrahim, A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection. AIMS Math. 66 (2021) 126. | MR

[2] M. Al-Baali, Descent property and global convergence of the Fletcher-Reeves method with inexact line search. IMA J. Numer. Anal. 5 (1985) 121–124. | MR | DOI

[3] N. Andrei, Another nonlinear conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 24 (2008) 89–104. | MR | DOI

[4] N. Andrei, An unconstrained optimization test functions collection. Adv. Model. Optim. 10 (2008) 147–161. | MR

[5] N. Andrei, Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization. Numer. Algorithms 54 (2010) 23–46. | MR | DOI

[6] B. Balaram, M. Narayanan and P. Rajendrakumar, Optimal design of multi-parametric nonlinear systems using a parametric continuation based genetic algorithm approach. Nonlinear Dyn. 67 (2012) 2759–2777. | MR | DOI

[7] I. Bongartz, A. R. Conn, N. Gould and P. L. Toint, CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. (TOMS) 21 (1995) 123–160. | DOI

[8] Y. H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10 (1999) 177–182. | MR | DOI

[9] S. S. Djordjević, New hybrid conjugate gradient method as a convex combination of FR and PRP methods. Filomat 30 (2016) 3083–3100. | MR | DOI

[10] S. S. Djordjević, New hybrid conjugate gradient method as a convex combination of LS and FR conjugate gradient methods. Acta Math. Sci. 39 (2019) 214–228. | MR | DOI

[11] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles. Math. Program. 91 (2002) 201–213. | MR | DOI

[12] R. Fletcher, Practical Methods of Optimization, Unconstrained Optimization. Vol. 1. John Wiley and Sons, New York (1980). | MR

[13] R. Fletcher and C. Reeves, Function minimization by conjugate gradients. Comput. J. 7 (1964) 149–154. | MR | DOI

[14] W. W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16 (2005) 170–192. | MR | DOI

[15] M. R. Hestenes and E. L. Stiefel, Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Standards 49 (1952) 409–436. | MR | DOI

[16] J. K. Liu and S. J. Li, New hybrid conjugate gradient method for unconstrained optimization. Appl. Math. Comput. 245 (2014) 36–43. | MR

[17] D. Liu and C. Storey, Efficient generalized conjugate gradient algorithms, Part 1: theory. J. Optim. Theory Appl. 69 (1991) 129–137. | MR | DOI

[18] P. K.-H. Phua, W. Fan and Y. Zeng, Parallel algorithms for large-scale nonlinear optimization. Int. Trans. Oper. Res. 5 (1998) 67–77.

[19] E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées. Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge 3 (1969) 35–43. | MR | Numdam

[20] B. T. Polyak, The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9 (1969) 94–112. | DOI

[21] Z. Salleh and A. Alhawarat, An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property. J. Inequal. Appl. 2016 (2016) 1–14. | MR

[22] Z. Wei, G. Li and L. Qi, New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems. Appl. Math. Comput. 179 (2006) 407–430. | MR

[23] P. Wolfe, Convergence conditions for ascent methods. SIAM Rev. 11 (1969) 226–235. | MR | DOI

[24] P. Wolfe, Convergence conditions for ascent methods. II: some corrections. SIAM Rev. 13 (1971) 185–188. | MR | DOI

[25] H. Yan, L. Chen and B. Jiao, HS-LS-CD hybrid conjugate gradient algorithm for unconstrained optimization. In: Second International Workshop on Computer Science and Engineering. Vol. 1. IEEE (2009) 264–268.

[26] G. Yuan, Z. Wei and Q. Zhao, A modified Polak–Ribiere–Polyak conjugate gradient algorithm for large scale optimization problems. IIE Trans. 46 (2014) 397–413. | DOI

[27] X. Y. Zheng, X. L. Dong, J. R. Shi and W. Yang, Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei. Numer. Algorithms 84 (2019) 603–608. | MR | DOI
