A new class of nonlinear conjugate gradient coefficients for unconstrained optimization

dc.contributor.author: Amina Boumediene
dc.contributor.author: Tahar Bechouat
dc.contributor.author: Rachid Benzine
dc.contributor.author: Ghania Hadji
dc.date.accessioned: 2023-09-14T15:45:44Z
dc.date.available: 2023-09-14T15:45:44Z
dc.date.issued: 2022
dc.description.abstract: The nonlinear conjugate gradient method (CGM) is a very effective approach to solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient, denoted $\beta_k^{NPRP}$. They proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient satisfies the sufficient descent condition and possesses global convergence properties under the exact line search.
dc.identifier.uri: https://dspace.univ-soukahras.dz/handle/123456789/1723
dc.language.iso: en
dc.publisher: Asian-European Journal of Mathematics
dc.title: A new class of nonlinear conjugate gradient coefficients for unconstrained optimization
dc.type: Article
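
The abstract concerns a nonlinear conjugate gradient coefficient analysed under the exact line search. The paper's modified coefficient $\beta_k^{NPRP}$ is not reproduced here; the sketch below is a generic nonlinear CG iteration using the classical PRP coefficient on a quadratic test problem, where the exact line search has a closed form. The test matrix, the PRP+ restart, and all names in the code are illustrative assumptions, not the authors' method.

```python
import numpy as np

def cg_prp(A, b, x0, tol=1e-8, max_iter=200):
    # Generic nonlinear CG sketch with the classical PRP coefficient
    # (an assumption; the paper studies a modified coefficient).
    x = x0.copy()
    g = A @ x - b                  # gradient of f(x) = 0.5 x^T A x - b^T x
    d = -g                         # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)        # exact line search (quadratic case)
        x = x + alpha * d
        g_new = A @ x - b
        beta = g_new @ (g_new - g) / (g @ g)  # PRP coefficient
        d = -g_new + max(beta, 0.0) * d       # PRP+ truncation to keep descent
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(cg_prp(A, b, np.zeros(2)))   # converges to the minimizer A^{-1} b
```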

Files

Original bundle: A new class of nonlinear 2021-1.pdf (76.08 KB, Adobe Portable Document Format)
