Department of Mathematics

Permanent URI for this community: https://dspace.univ-soukahras.dz/handle/123456789/40


Search Results

Now showing 1 - 3 of 3
  • A new hybrid conjugate gradient method of unconstrained optimization methods
    (Asian-European Journal of Mathematics, 2022) Chenna Nasreddine; Badreddine Sellami; Belloufi Mohammed
    In this paper, we present a new hybrid method for solving nonlinear unconstrained optimization problems, built as a convex combination of the Liu–Storey (LS) and Hager–Zhang (HZ) conjugate gradient methods. The method possesses the sufficient descent property and is globally convergent under the strong Wolfe line search. At the end of the paper, we illustrate the method with some numerical examples. (A sketch of this kind of hybrid iteration appears after this list.)
  • New iterative conjugate gradient method for nonlinear unconstrained optimization
    (RAIRO-Operations Research, 2022) Sabrina Ben Hanachi; Badreddine Sellami; Mohammed Belloufi
    Conjugate gradient (CG) methods are an important class of methods for solving unconstrained optimization problems, especially large-scale ones, and have recently been studied extensively. In this paper, we propose a new conjugate gradient method for unconstrained optimization, built as a convex combination of the Fletcher–Reeves (FR), Polak–Ribière–Polyak (PRP), and Dai–Yuan (DY) methods. With the Wolfe line search, the new method is shown to ensure the descent property of each search direction. Some general convergence results are also established for this method. Numerical experiments testing the efficiency of the proposed method confirm its promise. (The three parameters and their convex combination are sketched after this list.)
  • A new hybrid CG method as convex combination
    (Mathematical Foundations of Computing, 2023) Amina Hallal; Mohammed Belloufi; Badreddine Sellami
    Conjugate gradient methods are among the most efficient methods for solving optimization models. In this paper, a new conjugate gradient method is proposed as a convex combination of the Hager–Zhang and Dai–Yuan nonlinear conjugate gradient methods; it satisfies the sufficient descent condition and has global convergence properties under the strong Wolfe conditions. Numerical results on some benchmark problems demonstrate the efficiency of the proposed method. (A sketch of this combination also appears below.)
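
Below is a minimal Python sketch of the kind of hybrid iteration the first item describes: the CG parameter beta is a convex combination of the Liu–Storey and Hager–Zhang parameters, with a strong Wolfe line search. The function name hybrid_ls_hz_cg and the fixed mixing weight theta are illustrative assumptions; the paper derives its combination parameter adaptively, which is not reproduced here.

    import numpy as np
    from scipy.optimize import line_search

    def hybrid_ls_hz_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
        # Hybrid CG sketch: beta = (1 - theta) * beta_LS + theta * beta_HZ.
        # theta is a fixed placeholder; the paper chooses it adaptively.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # SciPy's line_search enforces the (strong) Wolfe conditions
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:
                alpha = 1e-4  # crude fallback if the line search fails
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g
            beta_ls = (g_new @ y) / max(-(g @ d), 1e-12)  # Liu-Storey
            dty = d @ y
            if abs(dty) < 1e-12:
                dty = 1e-12  # safeguard against division by zero
            beta_hz = ((y - 2.0 * d * (y @ y) / dty) @ g_new) / dty  # Hager-Zhang
            beta = (1.0 - theta) * beta_ls + theta * beta_hz  # convex combination
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Example use on SciPy's Rosenbrock test function:
    from scipy.optimize import rosen, rosen_der
    x_star = hybrid_ls_hz_cg(rosen, rosen_der, np.zeros(10))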
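
The second item combines three classical parameters instead of two. Here is a sketch of those parameters and a two-weight convex combination; the weights mu and nu are placeholders (the paper selects them adaptively, subject to mu, nu >= 0 and mu + nu <= 1).

    import numpy as np

    def beta_fr_prp_dy(g_new, g, d, mu=1/3, nu=1/3):
        # Convex combination of three classical CG parameters;
        # the weights here are placeholders, not the paper's rule.
        y = g_new - g
        beta_fr = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves
        beta_prp = (g_new @ y) / (g @ g)      # Polak-Ribiere-Polyak
        beta_dy = (g_new @ g_new) / (d @ y)   # Dai-Yuan
        return mu * beta_fr + nu * beta_prp + (1 - mu - nu) * beta_dy

This beta would replace the LS/HZ combination in the loop sketched above; the search direction update d = -g_new + beta * d is unchanged.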
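
The third item mixes the Hager–Zhang and Dai–Yuan parameters. A corresponding sketch, again with a placeholder weight theta standing in for the paper's adaptive choice:

    import numpy as np

    def beta_hz_dy(g_new, g, d, theta=0.5):
        # Convex combination of Hager-Zhang and Dai-Yuan; theta in [0, 1]
        # is a placeholder for the adaptive parameter derived in the paper.
        y = g_new - g
        dty = d @ y
        beta_hz = ((y - 2.0 * d * (y @ y) / dty) @ g_new) / dty
        beta_dy = (g_new @ g_new) / dty
        return (1.0 - theta) * beta_hz + theta * beta_dy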