Journal Articles
Permanent URI for this collection: https://dspace.univ-soukahras.dz/handle/123456789/216
Item: Applying the Powell's Symmetrical Technique to Conjugate Gradient Methods with the Generalized Conjugacy Condition (Numerical Functional Analysis and Optimization, 2016-07-02)
Authors: Noureddine Benrabia, Yamina Laskri, Hamza Guebbai, Mehiddin Al-Baali
This article proposes a new conjugate gradient method for unconstrained optimization by applying the Powell symmetrical technique in a defined sense. Using the Wolfe line search conditions, the global convergence property of the method is obtained based on the spectral analysis of the conjugate gradient iteration matrix and the Zoutendijk condition for steepest descent methods. Preliminary numerical results for a set of 86 unconstrained optimization test problems verify the performance of the algorithm and show that the Generalized Descent Symmetrical Hestenes-Stiefel algorithm is competitive with the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP+) algorithms.

Item: On the regularization method for Fredholm integral equations with odd weakly singular kernel (Computational and Applied Mathematics, 2018-03-23)
Authors: Noureddine Benrabia, Hamza Guebbai
In this paper, we propose a numerical method to approximate the solution of a Fredholm integral equation with a weakly singular kernel by applying the convolution product as a regularization operator and the Fourier series as a projection. Preliminary numerical results show that the order of convergence of the method is better than that of conventional projection methods.

Item: A Variant of Projection-Regularization Method for Ill-Posed Linear Operator Equations (International Journal of Computational Methods, 2020-09-20)
Authors: Bechouat Tahar; Boussetila Nadjib; Rebbani Faouzia
In the present paper, we report on a strategy for computing the numerical approximate solution for a class of ill-posed operator equations in Hilbert spaces: Kf = g. This approach is a combination of the Tikhonov regularization method and a finite-rank approximation of K.
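The combination just described, Tikhonov regularization applied to a finite-rank approximation of K, can be sketched as follows. This is a minimal illustration, not the paper's scheme: the kernel, the rank r and the parameter alpha below are illustrative assumptions.

```python
import numpy as np

# Sketch: Tikhonov regularization combined with a finite-rank
# (truncated SVD) approximation of a compact operator K, for Kf = g.
# Kernel, rank r and alpha are illustrative, not the paper's choices.

n = 200
t = np.linspace(0.0, 1.0, n)
h = 1.0 / n
# Hypothetical smooth (hence compact) integral kernel, discretized by quadrature.
K = h * np.exp(-np.abs(t[:, None] - t[None, :]))

f_true = np.sin(np.pi * t)
g = K @ f_true
g_noisy = g + 1e-4 * np.random.default_rng(0).standard_normal(n)

# Finite-rank approximation: keep only the r largest singular triplets.
U, s, Vt = np.linalg.svd(K)
r, alpha = 20, 1e-6
# Tikhonov filter applied to the rank-r part: s_i / (s_i^2 + alpha).
filt = s[:r] / (s[:r] ** 2 + alpha)
f_reg = Vt[:r].T @ (filt * (U[:, :r].T @ g_noisy))

print(np.linalg.norm(f_reg - f_true) / np.linalg.norm(f_true))
```

The truncation discards the small singular values that amplify noise, while the Tikhonov filter damps the remaining ones.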
Finally, numerical results are given to show the effectiveness of this method.

Item: A new optimization method based on Perry's idea through the use of the matrix power (Journal of Applied Mathematics and Computational Mechanics, 2021-04-03)
Authors: Imane Hafaidia, Noureddine Benrabia, Mourad Ghiat, Hamza Guebbai
The purpose of this paper is to present a new conjugate gradient method for solving unconstrained nonlinear optimization problems, based on Perry's idea. An accelerated adaptive algorithm is proposed, in which the search direction satisfies the sufficient descent condition. The global convergence is analyzed using spectral analysis. Numerical results are reported for a set of standard test problems, showing that the performance of the proposed method is better than that of the CG-DESCENT, mBFGS and SPDOC methods.

Item: A new class of nonlinear conjugate gradient coefficients for unconstrained optimization (Asian-European Journal of Mathematics, 2022)
Authors: Amina Boumediene; Tahar Bechouat; Rachid Benzine; Ghania Hadji
The nonlinear conjugate gradient method (CGM) is a very effective way of solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient, denoted β_k^{NPRP}. They proved the sufficient descent condition and the global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient possesses the sufficient descent condition and global convergence properties under the exact line search.

Item: Benrabia distribution: properties and applications (2022)
Authors: Mohammed Benrabia; Loai M. A. AlZoubi
In this paper, we propose a new two-parameter continuous distribution, called the Benrabia distribution. Some statistical properties are derived, such as the moment generating function, the moments and related measures, and the reliability analysis and related functions. The distribution of order statistics and the quantile function are also presented, and the Rényi entropy is derived.
The method of maximum likelihood estimation is used to estimate the distribution parameters. A simulation is performed to investigate the performance of the MLE, and real data applications show that the proposed distribution can provide a better fit than several well-known distributions.

Item: A new hybrid conjugate gradient method of unconstrained optimization methods (Asian-European Journal of Mathematics, 2022)
Authors: Chenna Nasreddine; Badreddine Sellami; Belloufi Mohammed
In this paper, we present a new hybrid method to solve nonlinear unconstrained optimization problems using conjugate gradients, defined as a convex combination of the Liu-Storey (LS) and Hager-Zhang (HZ) conjugate gradient methods. This method possesses the sufficient descent property and global convergence under the strong Wolfe line search. At the end of the paper, we illustrate the method with some numerical examples.

Item: Alzoubi Distribution: Properties and Applications (2022)
Authors: Mohammed Benrabia; Loai M. A. Alzoubi
In this article, a new two-parameter distribution named the Alzoubi distribution (AzD) is suggested. Its moments have been obtained. Reliability analysis, including the hazard rate, cumulative hazard rate and reversed hazard rate functions, and the entropy have been discussed; the deviation about the mean and median is derived, and the distribution of order statistics is obtained. A simulation study is performed to estimate the model parameters using the maximum likelihood and the ordinary and weighted least squares methods. The goodness of fit to a real data set shows the superiority of the new distribution.
Keywords: mixing distribution, Alzoubi distribution, moments, reliability analysis, Rényi entropy, maximum likelihood estimation, moment generating function.

Item: Loai Distribution: Properties, Parameters Estimation and Application to Covid-19 Real Data (2022)
Authors: Loai M. A. Alzoubi; Mohammad M. Gharaibeh; Ahmad M. Alkhazaalh; Mohammed Benrabia
In this paper, we propose a new two-parameter continuous distribution, called the Loai distribution. Some statistical properties of this distribution are derived, such as the moment generating function, the moments and related measures, and the reliability analysis and associated functions. The distribution of order statistics and the quantile function are also presented. The Shannon, Rényi and Tsallis entropies are derived. The method of maximum likelihood and some other methods of estimation are used to estimate the distribution parameters. A simulation study is performed to investigate the performance of the different methods of estimation.
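The maximum-likelihood step used in these distribution papers can be sketched generically. Since the Loai pdf is not reproduced in this listing, a two-parameter gamma model stands in below, and the data are simulated; none of this reflects the paper's actual model or data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

# Sketch of maximum-likelihood estimation for a two-parameter model.
# The Loai pdf is not given here, so a gamma(shape a, rate b) model
# stands in; the data below are simulated, not real Covid-19 data.

rng = np.random.default_rng(1)
shape_true, rate_true = 2.5, 1.5
x = rng.gamma(shape_true, 1.0 / rate_true, size=5000)

def neg_log_lik(theta):
    a, b = theta  # shape a > 0, rate b > 0
    if a <= 0 or b <= 0:
        return np.inf
    # Gamma log-density: a*log(b) - log(Gamma(a)) + (a-1)*log(x) - b*x
    return -np.sum(a * np.log(b) - gammaln(a) + (a - 1) * np.log(x) - b * x)

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = res.x
print(a_hat, b_hat)
```

A simulation study of the kind described would repeat this fit over many samples and report bias and mean squared error of the estimates.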
Covid-19 real data applications show that the proposed distribution can provide a better fit than several well-known distributions.

Item: New iterative conjugate gradient method for nonlinear unconstrained optimization (RAIRO-Operations Research, 2022)
Authors: Sabrina Ben Hanachi; Badreddine Sellami; Mohammed Belloufi
Conjugate gradient (CG) methods are an important class of methods for solving unconstrained optimization problems, especially large-scale ones, and they have recently been much studied. In this paper, we propose a new conjugate gradient method for unconstrained optimization, defined as a convex combination of the Fletcher-Reeves (FR), Polak-Ribière-Polyak (PRP) and Dai-Yuan (DY) methods. The new method with the Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for this method. Numerical experiments testing the efficiency of the proposed method confirm its promise.

Item: Exponential decay and numerical solution of nonlinear Bresse-Timoshenko system with second sound (Journal of Thermal Stresses, 2022)
Authors: Salim Adjemi; Ahmed Berkane; Salah Zitouni; Tahar Bechouat
This paper studies the one-dimensional nonlinear Bresse-Timoshenko system with second sound, where the heat conduction given by Cattaneo's law is effective in the second equation. We prove that the system is exponentially stable by using the energy method, which requires constructing a suitable Lyapunov functional through the multipliers method. Furthermore, the result does not depend on any condition on the coefficients of the system.
Finally, we validate our theoretical result by performing numerical approximations based on the standard finite element method for the spatial discretization and the backward Euler scheme for the temporal discretization.

Item: An Implicit Iteration Method for Solving Linear Ill-Posed Operator Equations (Numerical Analysis and Applications, 2023)
Authors: Bechouat Tahar
In this work, we study a new implicit method for computing the solutions of ill-posed linear operator equations of the first kind in the setting of compact operators. Regularization theory can be used to demonstrate the stability and convergence of this scheme. Furthermore, we obtain convergence results and effective stopping criteria according to Morozov's discrepancy principle. Numerical experiments show the validity of our implicit method and demonstrate its applicability to deblurring problems.

Item: A Collocation Method for Fredholm Integral Equations of the First Kind via Iterative Regularization Scheme (Mathematical Modelling and Analysis, 2023)
Authors: Bechouat Tahar
To solve ill-posed integral equations, we use a regularized collocation method. This numerical method combines Legendre polynomials with non-stationary iterated Tikhonov regularization with a fixed parameter. A theoretical justification of the proposed method under the required assumptions is detailed. Finally, numerical experiments demonstrate the efficiency of this method.

Item: Two Modified Conjugate Gradient Methods for Solving Unconstrained Optimization and Application (RAIRO Operations Research, 2023)
Authors: Abd Elhamid Mehamdia; Yacine Chaib; Bechouat Tahar
Conjugate gradient methods are a popular class of iterative methods for solving linear systems of equations and nonlinear optimization problems, as they do not require the storage of any matrices. In order to obtain a theoretically effective and numerically efficient method, two modified conjugate gradient methods (called the MCB1 and MCB2 methods) are proposed.
The coefficient β_k in the two proposed methods is inspired by the structure of the conjugate gradient parameters in some existing conjugate gradient methods. Under the strong Wolfe line search, the sufficient descent property and global convergence of the MCB1 method are proved. Moreover, the MCB2 method generates a descent direction independently of any line search and has good convergence properties when the strong Wolfe line search is employed. Preliminary numerical results show that the MCB1 and MCB2 methods are effective and robust in minimizing some unconstrained optimization problems, and each of these modifications outperforms four well-known conjugate gradient methods. Furthermore, the proposed algorithms were extended to solve the problem of mode function.

Item: Numerical solution of the two-dimensional first kind Fredholm integral equations using a regularized collocation method (Computational and Applied Mathematics, 2023)
Authors: Tahar Bechouat; Nadjib Boussetila
In this study, we present a numerical method for solving 2D Fredholm integral equations of the first kind, which are well known to be ill-posed problems. This numerical approach combines a quadrature formula with the Lavrentiev regularization method. Under some essential assumptions, a comprehensive theoretical analysis of the presented numerical approach is provided. Finally, various numerical examples support the theoretical findings and demonstrate the accuracy of our method.

Item: A new hybrid CG method as convex combination (Mathematical Foundations of Computing, 2023)
Authors: Amina Hallal; Mohammed Belloufi; Badreddine Sellami
Conjugate gradient methods are among the most efficient methods for solving optimization models.
In this paper, a new conjugate gradient method is proposed for solving optimization problems as a convex combination of the Hager-Zhang and Dai-Yuan nonlinear conjugate gradient methods; it satisfies the sufficient descent condition and has global convergence properties under the strong Wolfe conditions. Numerical results on some benchmark problems demonstrate the efficiency of the proposed method.

Item: New Method for Solving the Inverse Thermal Conduction Problem (Θ-Scheme Combined with CG Method under Strong Wolfe Line Search) (Buildings, 2023)
Authors: Rachid Djeffal; Djemoui Lalmi; Sidi Mohammed El Amine Bekkouche; Tahar Bechouat; Zohir Younsi
Most thermal researchers have solved thermal conduction problems (inverse or direct) using several different methods. These include the usual discretization methods, such as finite elements, as well as conventional and special estimation methods. Finite difference methods, i.e., the explicit, implicit or Crank-Nicolson schemes, have also been adopted. Depending on the case, these methods have several disadvantages, such as slow convergence and stability conditions that hold only over a limited range. Accordingly, in this paper, a general scheme for the thermal conduction problem, called the θ-scheme, combined with a conjugate gradient method under strong Wolfe conditions, is used. This approach is useful both for its accuracy (up to 16 significant decimal digits) and for the speed of its solutions and convergence; by overcoming ill-posedness and stability restrictions, it can also have wide applications. We applied two approaches for the control of the boundary conditions: constant and variable. The θ-scheme has rarely been used in the thermal field, although it is unconditionally stable for θ ∈ [0.5, 1].
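As a minimal illustration of the θ-scheme just mentioned, here it is applied to the 1D direct heat equation; this is not the paper's inverse-problem setup, and the grid sizes, diffusivity and initial condition are illustrative assumptions.

```python
import numpy as np

# theta-scheme sketch for the 1D heat equation u_t = kappa * u_xx with
# homogeneous Dirichlet boundaries. theta = 0 is explicit, theta = 1 is
# implicit, theta = 0.5 is Crank-Nicolson; the scheme is unconditionally
# stable for theta in [0.5, 1]. All parameters below are illustrative.

nx, nt = 50, 200
L, T, kappa, theta = 1.0, 0.1, 1.0, 0.5
dx, dt = L / (nx + 1), T / nt
lam = kappa * dt / dx**2

x = np.linspace(dx, L - dx, nx)  # interior nodes
u = np.sin(np.pi * x)            # mode-1 initial condition

# Second-difference matrix on the interior nodes.
A = -2.0 * np.eye(nx) + np.eye(nx, k=1) + np.eye(nx, k=-1)
M_impl = np.eye(nx) - theta * lam * A          # implicit (left-hand) part
M_expl = np.eye(nx) + (1.0 - theta) * lam * A  # explicit (right-hand) part

for _ in range(nt):
    u = np.linalg.solve(M_impl, M_expl @ u)

# Exact mode-1 solution: exp(-kappa * pi^2 * T) * sin(pi * x)
u_exact = np.exp(-kappa * np.pi**2 * T) * np.sin(np.pi * x)
print(np.max(np.abs(u - u_exact)))
```

For the inverse problem, a scheme like this serves as the forward solver inside the CG minimization loop.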
The simulation was carried out using Matlab software.

Item: A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems (Journal of Applied Mathematics and Computing, 2023-03-25)
Authors: Naima Hamel, Noureddine Benrabia, Mourad Ghiat, Hamza Guebbai
In this paper, we propose a new hybrid conjugate gradient method to solve unconstrained optimization problems, defined as a convex combination of the DY and DL conjugate gradient methods. Its special feature is that the search direction respects Newton's direction without the need to store or calculate the second derivative (the Hessian matrix), thanks to the secant equation, which removes the troublesome part required by the Newton method. The search direction satisfies not only the descent property but also the sufficient descent condition through the use of the strong Wolfe line search, and the global convergence is proved. A numerical comparison shows the efficiency of the new algorithm, as it outperforms both the DY and DL algorithms.

Item: Developing a New Conjugate Gradient Algorithm with the Benefit of Some Desirable Properties of the Newton Algorithm for Unconstrained Optimization (Journal of Applied Analysis & Computation, 2024-02-15)
Authors: Naima Hamel, Noureddine Benrabia, Mourad Ghiat, Hamza Guebbai
The conjugate gradient method and the Newton method are both numerical optimization techniques. In this paper, we aim to combine some desirable characteristics of these two methods while avoiding their drawbacks. More specifically, we aim to develop a new optimization algorithm that preserves some essential features of the conjugate gradient algorithm, including simplicity, low memory requirements, the ability to solve large-scale problems and convergence to the solution regardless of the starting vector (global convergence).
At the same time, this new algorithm approaches the quadratic convergence behavior of the Newton method in the numerical sense while avoiding the computational cost of evaluating the Hessian matrix directly, as well as the sensitivity to the selected starting vector. To do this, we propose a new hybrid conjugate gradient method by linking the (CD) and (WYL) methods in a convex blend; the hybridization parameter is computed so that the new search direction agrees with the Newton direction, while the secant equation avoids the computational cost of evaluating the Hessian matrix directly. This makes the proposed algorithm useful for solving large-scale optimization problems. The sufficient descent condition is verified, and the global convergence is proved under a strong Wolfe-Powell line search. The numerical tests show that the proposed algorithm exhibits the quadratic convergence behavior and confirm its efficiency, as it outperforms both the (WYL) and (CD) algorithms.
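Several of the items above build hybrid CG directions as a convex combination of two classical coefficients. The sketch below is generic and rests on illustrative assumptions rather than any one paper's rule: a fixed blend weight t (the papers compute it adaptively), FR and PRP+ as the coefficient pair, SciPy's strong Wolfe line search, and the Rosenbrock function as the test problem.

```python
import numpy as np
from scipy.optimize import line_search

# Generic hybrid nonlinear CG: beta = (1 - t)*beta_FR + t*beta_PRP+.
# Fixed weight t, the FR/PRP+ pair and the Rosenbrock function are
# illustrative assumptions, not any paper's actual hybridization.

def f(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

x = np.array([-1.2, 1.0])
g = grad(x)
d = -g
t = 0.5  # fixed convex-combination weight (hypothetical)

for _ in range(5000):
    if np.linalg.norm(g) < 1e-8:
        break
    alpha = line_search(f, grad, x, d, gfk=g)[0]  # strong Wolfe step
    if alpha is None:          # line search failed:
        d = -g                 # restart with steepest descent
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            break
    x_new = x + alpha * d
    g_new = grad(x_new)
    beta_fr = (g_new @ g_new) / (g @ g)
    beta_prp = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+ truncation
    beta = (1.0 - t) * beta_fr + t * beta_prp
    d = -g_new + beta * d
    x, g = x_new, g_new

print(x, f(x))
```

The convex combination lets the hybrid inherit the strong convergence theory of one coefficient and the practical restart behavior of the other, which is the common motivation across these papers.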