Journal Articles

Permanent URI for this collection: https://dspace.univ-soukahras.dz/handle/123456789/216

Search Results

Now showing 1 - 10 of 23
  • Item
    On the numerical solution of weakly singular integral equations of the first kind using a regularized projection method
    (International Journal of Computer Mathematics, 2024-05-28) Bechouat Tahar; Boussetila Nadjib
    This study investigates a numerical method based on the Jacobi–Gauss quadrature for solving Fredholm integral equations of the first kind with a weakly singular kernel by combining the Tikhonov regularization and projection methods. This numerical method reduces the solution of the weakly singular integral equations of the first kind to the solution of a linear system of algebraic equations. The theoretical analysis of the proposed technique is provided. Finally, several tests are presented to show the validity and efficiency of this approach.
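As a concrete illustration of reducing a first-kind equation to a regularized linear system, here is a minimal sketch. It assumes a toy kernel |s - t|^(-1/2) on [-1, 1], Gauss-Legendre quadrature standing in for the paper's Jacobi-Gauss rule, a manufactured solution, and an arbitrary Tikhonov parameter; it is not the authors' implementation.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

# Toy weakly singular first-kind Fredholm equation on [-1, 1]:
# kernel K(s, t) = |s - t|^(-1/2) (illustrative, not the paper's benchmark).
n = 80
t, w = leggauss(n)                    # Gauss-Legendre nodes and weights
s = np.linspace(-1, 1, n) + 1e-3      # collocation points, offset so s_i != t_j

A = (np.abs(s[:, None] - t[None, :]) ** -0.5) * w[None, :]  # quadrature discretization

x_true = np.cos(np.pi * t)            # manufactured solution
y = A @ x_true
y += 1e-8 * np.random.default_rng(0).standard_normal(n)     # small data noise

# First-kind equations are ill-posed, so A is severely ill-conditioned;
# Tikhonov regularization solves (A^T A + alpha I) x = A^T y instead.
alpha = 1e-6
x_reg = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)
print("max error:", np.max(np.abs(x_reg - x_true)))
```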
  • Item
    THE BEST SPECTRAL CORRECTION OF DMDY CONJUGATE GRADIENT METHOD
    (Annals – Series on Mathematics and Its Applications, 2024-05-14) Khaoula Meansri; Noureddine Benrabia; Mourad Ghiat; Hamza Guebbai; Imane Hafaidia
    In this paper, we present an enhanced spectral correction for the DMDY conjugate gradient method. Our method integrates a third term into the search direction and determines its parameter in three different ways. The primary objective is to ensure the sufficient descent condition. By applying the Wolfe line search conditions, we establish the global convergence property for all three proposed algorithms. Numerical tests conclusively demonstrate the superior efficiency of our algorithms, surpassing that of existing methods.
    Keywords: spectral correction; conjugate gradient methods; sufficient descent condition; global convergence; numerical tests.
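For orientation, a spectrally corrected (three-term) search direction has the generic shape below; beta_k and theta_k stand in for the paper's DMDY-based choices, which are not reproduced here.

```latex
% Generic three-term (spectrally corrected) CG direction;
% \beta_k and \theta_k are placeholders for the paper's DMDY-based choices.
d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k, \qquad y_k = g_{k+1} - g_k.
% The sufficient descent condition the correction is tuned to guarantee:
g_{k+1}^{\top} d_{k+1} \le -c\,\lVert g_{k+1} \rVert^2, \qquad c > 0.
```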
  • Item
    A novel conformable fractional approach to the Brusselator system with numerical simulation
    (2024-03-12) Mohamed Lamine Merikhi; Hamza Guebbai; Noureddine Benrabia; Mohamed Moumen Bekkouche
    In this study, we present a comprehensive analysis of the Brusselator system, combining analytical and numerical approaches. We first revisit the classic Brusselator system using a conformable fractional derivative. From this reformulation, we obtain a nonlinear Volterra-type equation. This transformation allows us to demonstrate both the existence and uniqueness of the solution, while providing the tools needed to develop an efficient numerical approximation method. Finally, we present a numerical simulation based on the Nyström method.
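A rough numerical illustration of this reformulation: the conformable derivative T_alpha u = t^(1-alpha) u' turns the system into u(t) = u(0) + integral of s^(alpha-1) f(u, v) ds over [0, t], which the sketch below integrates with a plain explicit Euler-type rule in place of the paper's Nyström scheme. The parameters a = 1, b = 3, alpha = 0.9 are illustrative, not taken from the paper.

```python
import numpy as np

# Conformable-fractional Brusselator in its equivalent Volterra form,
# advanced with a simple explicit Euler-type rule (the paper uses a
# Nystrom method; this cruder scheme only illustrates the reformulation).
a, b, alpha = 1.0, 3.0, 0.9              # illustrative parameters
T, N = 20.0, 20000
t = np.linspace(1e-6, T, N + 1)          # avoid t = 0 where s**(alpha-1) blows up
u, v = 1.5, 3.0
traj = [(u, v)]
for n in range(N):
    h = t[n + 1] - t[n]
    fu = a - (b + 1.0) * u + u * u * v   # Brusselator right-hand side
    fv = b * u - u * u * v
    w = t[n] ** (alpha - 1.0)            # conformable weight s**(alpha - 1)
    u, v = u + h * w * fu, v + h * w * fv
    traj.append((u, v))
print(traj[-1])
```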
  • Item
    DEVELOPING A NEW CONJUGATE GRADIENT ALGORITHM WITH THE BENEFIT OF SOME DESIRABLE PROPERTIES OF THE NEWTON ALGORITHM FOR UNCONSTRAINED OPTIMIZATION
    (Journal of Applied Analysis & Computation, 2024-02-15) Naima Hamel; Noureddine Benrabia; Mourad Ghiat; Hamza Guebbai
    The conjugate gradient method and the Newton method are both numerical optimization techniques. In this paper, we aim to combine some desirable characteristics of these two methods while avoiding their drawbacks. More specifically, we develop a new optimization algorithm that preserves some essential features of the conjugate gradient algorithm: simplicity, low memory requirements, the ability to solve large-scale problems, and convergence to the solution regardless of the starting vector (global convergence). At the same time, the new algorithm approaches the quadratic convergence behavior of the Newton method in the numerical sense, while avoiding both the computational cost of evaluating the Hessian matrix directly and the sensitivity to the selected starting vector. To do this, we propose a new hybrid conjugate gradient method that links the (CD) and (WYL) methods in a convex blend; the hybridization parameter is computed so that the new search direction accords with the Newton direction, while the use of the secant equation avoids the computational cost of evaluating the Hessian matrix directly. This makes the proposed algorithm useful for solving large-scale optimization problems. The sufficient descent condition is verified, and global convergence is proved under a strong Wolfe-Powell line search. The numerical tests show that the proposed algorithm exhibits the quadratic convergence behavior and confirm its efficiency, as it outperforms both the (WYL) and (CD) algorithms.
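Schematically, the blended parameter has the form below, using the standard CD (Fletcher) and WYL (Wei-Yao-Liu) formulas from the literature; the paper's secant-based rule for choosing the hybridization parameter theta_k is not reproduced here.

```latex
% Convex blend of the CD and WYL parameters, \theta_k \in [0, 1]:
\beta_k = (1 - \theta_k)\,\beta_k^{CD} + \theta_k\,\beta_k^{WYL},
\qquad
\beta_k^{CD} = \frac{\lVert g_{k+1} \rVert^2}{-d_k^{\top} g_k},
\qquad
\beta_k^{WYL} = \frac{\lVert g_{k+1} \rVert^2 - \frac{\lVert g_{k+1} \rVert}{\lVert g_k \rVert}\, g_{k+1}^{\top} g_k}{\lVert g_k \rVert^2}.
```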
  • Item
    Applying the Powell's Symmetrical Technique to Conjugate Gradient Methods with the Generalized Conjugacy Condition
    (Numerical Functional Analysis and Optimization, 2016-07-02) Noureddine Benrabia; Yamina Laskri; Hamza Guebbai; Mehiddin Al-Baali
    This article proposes a new conjugate gradient method for unconstrained optimization by applying the Powell symmetrical technique in a defined sense. Using the Wolfe line search conditions, the global convergence property of the method is obtained based on the spectral analysis of the conjugate gradient iteration matrix and the Zoutendijk condition for steepest descent methods. Preliminary numerical results for a set of 86 unconstrained optimization test problems verify the performance of the algorithm and show that the Generalized Descent Symmetrical Hestenes-Stiefel algorithm is competitive with the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP+) algorithms.
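For reference, the classical Hestenes-Stiefel parameter that the symmetrized variant builds on is shown below; the paper's generalized-conjugacy modification is not reproduced here.

```latex
% Classical Hestenes-Stiefel parameter, with y_k = g_{k+1} - g_k:
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}.
```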
  • Item
    On the regularization method for Fredholm integral equations with odd weakly singular kernel
    (Computational and Applied Mathematics, 2018-03-23) Noureddine Benrabia; Hamza Guebbai
    In this paper, we propose a numerical method to approximate the solution of a Fredholm integral equation with a weakly singular kernel by applying the convolution product as a regularization operator and the Fourier series as a projection. Preliminary numerical results show that the order of convergence of the method is better than that of conventional projection methods.
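Schematically, and in illustrative notation rather than the paper's: the singular convolution kernel k is smoothed by a mollifier rho_epsilon, and the solution is sought as a truncated Fourier series.

```latex
% Convolution as regularization plus Fourier projection (illustrative notation):
(A_\varepsilon x)(s) = \int_0^{2\pi} (\rho_\varepsilon * k)(s - t)\, x(t)\, dt,
\qquad
x_N(t) = \sum_{|n| \le N} c_n\, e^{i n t}.
```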
  • Item
    A new optimization method based on Perry's idea through the use of the matrix power
    (Journal of Applied Mathematics and Computational Mechanics, 2021-04-03) Imane Hafaidia; Noureddine Benrabia; Mourad Ghiat; Hamza Guebbai
    The purpose of this paper is to present a new conjugate gradient method, based on Perry's idea, for solving unconstrained nonlinear optimization problems. An accelerated adaptive algorithm is proposed, whose search direction satisfies the sufficient descent condition. The global convergence is analyzed using spectral analysis. Numerical results are reported for a set of standard test problems, showing that the performance of the proposed method is better than that of the CG-DESCENT, mBFGS, and SPDOC algorithms.
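For context, Perry's classical direction, which embeds quasi-Newton information into the CG step, is recalled below; the paper's matrix-power generalization of it is not reproduced here.

```latex
% Perry's direction, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k:
d_{k+1} = -g_{k+1} + \frac{g_{k+1}^{\top} (y_k - s_k)}{d_k^{\top} y_k}\, d_k.
```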
  • Item
    A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems
    (Journal of Applied Mathematics and Computing, 2023-03-25) Naima Hamel; Noureddine Benrabia; Mourad Ghiat; Hamza Guebbai
    In this paper, we propose a new hybrid conjugate gradient method to solve unconstrained optimization problems. This new method is defined as a convex combination of the DY and DL conjugate gradient methods. Its special feature is that the search direction respects Newton's direction, but without the need to store or calculate the second derivative (the Hessian matrix), thanks to the secant equation, which removes the troublesome part required by the Newton method. The search direction satisfies not only the descent property but also the sufficient descent condition through the use of the strong Wolfe line search, under which global convergence is proved. The numerical comparison shows the efficiency of the new algorithm, as it outperforms both the DY and DL algorithms.
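The overall structure of such a hybrid can be sketched as follows. The blend weight here is a fixed theta rather than the paper's secant-based choice, the Dai-Liao parameter is set to t = 0.1, and SciPy's Wolfe line search stands in for the exact strong Wolfe procedure; this illustrates the loop, not the authors' algorithm.

```python
import numpy as np
from scipy.optimize import line_search

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    """Convex-combination hybrid CG sketch: beta = (1-theta)*DY + theta*DL."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]      # Wolfe line search
        if alpha is None:                          # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y, s_vec = g_new - g, x_new - x
        dy = d @ y
        if abs(dy) < 1e-12:                        # guard against breakdown
            d = -g_new
        else:
            beta_dy = (g_new @ g_new) / dy                      # Dai-Yuan
            beta_dl = (g_new @ y - 0.1 * (g_new @ s_vec)) / dy  # Dai-Liao, t = 0.1
            beta = (1 - theta) * beta_dy + theta * beta_dl
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(hybrid_cg(f, grad, np.array([-1.2, 1.0])))
```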
  • Item
    A new hybrid conjugate gradient method of unconstrained optimization methods
    (Asian-European Journal of Mathematics, 2022) Chenna Nasreddine; Badreddine Sellami; Belloufi Mohammed
    In this paper, we present a new hybrid conjugate gradient method to solve nonlinear unconstrained optimization problems, defined as a convex combination of the Liu–Storey (LS) and Hager–Zhang (HZ) conjugate gradient methods. This method possesses the sufficient descent property and global convergence under the strong Wolfe line search. Finally, we illustrate the method with some numerical examples.
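For reference, the two blended parameters are the standard ones from the literature; the authors' convex-combination weight is not reproduced here.

```latex
% Liu-Storey and Hager-Zhang parameters, with y_k = g_{k+1} - g_k:
\beta_k^{LS} = \frac{g_{k+1}^{\top} y_k}{-d_k^{\top} g_k},
\qquad
\beta_k^{HZ} = \frac{1}{d_k^{\top} y_k} \left( y_k - 2 d_k \frac{\lVert y_k \rVert^2}{d_k^{\top} y_k} \right)^{\top} g_{k+1}.
```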