Department of Computer Science
Permanent URI for this community: https://dspace.univ-soukahras.dz/handle/123456789/15
Item: A modified incremental density based clustering algorithm (IEEE, 2022). Author: Aida Chefrour.
Cluster analysis, generally known as clustering, is a technique for separating data into groups (clusters) of similar objects. Unless the system is completely retrained, traditional clustering classifiers cannot learn new information (attributes, examples, or classes). Only incremental learning, which performs well when new data objects are introduced into an existing database, can solve this problem; such evolutionary strategies handle dynamic databases by updating the model as the data change. We study the Incremental Density-Based Spatial Clustering of Applications with Noise (IDBSCAN) algorithm because of its capacity to discover clusters of arbitrary shape and to identify noise. In this study, a modified version of the Incremental Density-Based Clustering Algorithm using an Adaptive Median Filtering technique is proposed. The difference between our previously proposed AMF-IDBSCAN and the algorithm developed in this work lies in the performance evaluation stage: the key idea is to update the database when new data items are introduced into an existing database in order to improve performance. We conducted several experiments on benchmark and synthetic data collected from the University of California Irvine (UCI) repository, measuring the Generalized Dunn Index, the Davies-Bouldin Index, and the change in runtime (in milliseconds) as data are added to the original database. Experiments on datasets of various sizes and dimensions show that the proposed algorithm improves clustering compared with several well-known incremental techniques.

Item: A Novel Incremental Learning Algorithm Based on Incremental Vector Support Machina and Incremental Neural Network Learn++ (Lavoisier, 2019). Authors: Aida Chefrour; Labiba Souici-Meslati; Iness Difi; Nesrine Bakkouche.
Incremental learning refers to learning new information iteratively without having to fully retrain the classifier. However, a single classifier cannot realize incremental learning if the classification problem is too complex or too large. To solve this problem, this paper combines the incremental support vector machine (ISVM) and the incremental neural network Learn++ into a novel incremental learning algorithm called ISVM-Learn++. The two incremental classifiers are merged by parallel combination and weighted sum combination. The proposed algorithm was tested on three datasets: Ionosphere, Haberman's Survival, and Blood Transfusion Service Center. The results show that ISVM-Learn++ achieved a learning rate of 98%, better than that of traditional incremental learning algorithms. The research findings shed new light on incremental supervised machine learning.

Item: AMF-IDBSCAN: Incremental Density Based Clustering Algorithm using Adaptive Median Filtering Technique (Slovene Society Informatika, 2019). Authors: Aida Chefrour; Labiba Souici-Meslati.
Density-based spatial clustering of applications with noise (DBSCAN) is a fundamental density-based clustering algorithm. It can discover clusters of arbitrary shape and size in large amounts of data containing noise and outliers. However, it struggles with large datasets, does not cope well when new objects are inserted into an existing database, does not remove noise points or outliers completely, and cannot handle the local density variation that exists within a cluster. A good clustering method should therefore tolerate significant density variation within a cluster and should handle dynamic and large databases. In this paper, an incremental enhancement of DBSCAN called AMF-IDBSCAN is proposed, which incrementally builds clusters of different shapes and sizes in large datasets and eliminates noise and outliers. AMF-IDBSCAN uses a canopy clustering algorithm to pre-cluster the data and reduce its volume, applies an incremental DBSCAN to cluster the data points, and uses an Adaptive Median Filtering (AMF) technique as a post-clustering step to reduce the number of outliers by replacing noise points with chosen medians. Experiments with AMF-IDBSCAN are performed on University of California Irvine (UCI) repository datasets. The results show that our algorithm performs better than DBSCAN, IDBSCAN, and DMDBSCAN.
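As a companion to the two density-based entries above, here is a minimal sketch of the shared pipeline they describe: density-based clustering, median-based handling of noise points, and internal validation with the Davies-Bouldin index. It uses scikit-learn on synthetic data and is not the published code; the canopy pre-clustering and incremental insertion steps are omitted, and the parameter values are assumptions.

```python
# Illustrative sketch only: DBSCAN clustering, median-based treatment of noise
# points (in the spirit of the AMF post-clustering step), and Davies-Bouldin
# validation. Synthetic blobs stand in for a UCI-style dataset.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import DBSCAN
from sklearn.metrics import davies_bouldin_score

X, _ = make_blobs(n_samples=500, centers=4, cluster_std=0.8, random_state=0)

# Step 1: density-based clustering; points labelled -1 are treated as noise.
labels = DBSCAN(eps=0.6, min_samples=5).fit_predict(X)

# Step 2: median-based post-processing: reassign each noise point to the
# cluster whose per-feature median is closest to it.
medians = {c: np.median(X[labels == c], axis=0) for c in set(labels) if c != -1}
for i in np.where(labels == -1)[0]:
    labels[i] = min(medians, key=lambda c: np.linalg.norm(X[i] - medians[c]))

# Step 3: internal validation (lower Davies-Bouldin values indicate better clustering).
print("Davies-Bouldin index:", davies_bouldin_score(X, labels))
```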
Item: CAE-CNN: Image Classification Using Convolutional AutoEncoder Pre-Training (2022). Authors: Aida Chefrour; Samia Drissi.
The work presented in this paper falls within the general framework of classification using deep learning and, more precisely, convolutional neural networks (CNNs). The convolutional autoencoder offers an alternative for processing high-dimensional data in order to facilitate its classification. In this paper, we propose incorporating convolutional autoencoders as a general unsupervised dimensionality reduction method that creates robust, compressed feature representations to improve CNN performance on image classification tasks. We applied the two methods to the MNIST image database. Using a CNN together with a convolutional autoencoder gives better accuracy than using either of them individually, yielding a good classification of the high-dimensional input data.
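To make the CAE-CNN entry above concrete, the following is a minimal sketch of convolutional autoencoder pre-training followed by a classifier built on the pre-trained encoder. It assumes TensorFlow/Keras and MNIST; the architecture, single training epoch, and hyperparameters are illustrative assumptions, not the configuration reported in the paper.

```python
# Illustrative sketch only: pre-train a convolutional autoencoder, then reuse
# its encoder as the front end of a classifier.
import tensorflow as tf
from tensorflow.keras import layers, models

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0  # shape (60000, 28, 28, 1)

# Encoder: compresses each 28x28x1 image into a 7x7x8 feature map.
encoder = models.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(16, 3, strides=2, padding="same", activation="relu"),
    layers.Conv2D(8, 3, strides=2, padding="same", activation="relu"),
])
# Decoder: reconstructs the input image from the compressed feature map.
decoder = models.Sequential([
    layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(1, 3, strides=2, padding="same", activation="sigmoid"),
])

# Unsupervised pre-training: learn to reconstruct the inputs.
autoencoder = models.Sequential([encoder, decoder])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x_train, x_train, epochs=1, batch_size=256)

# Supervised stage: a small classification head on top of the pre-trained encoder.
classifier = models.Sequential([encoder, layers.Flatten(),
                                layers.Dense(10, activation="softmax")])
classifier.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                   metrics=["accuracy"])
classifier.fit(x_train, y_train, epochs=1, batch_size=256)
```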
Item: Incremental supervised learning: algorithms and applications in pattern recognition (Springer, 2019). Author: Aida Chefrour.
The most effective well-known methods in static machine learning offer no mechanism for evolution or dynamic adaptation to integrate new data or to restructure problems that have already been partially learned. In this area, incremental learning represents an interesting alternative and constitutes an open research field, having become one of the major concerns of the machine learning and classification community. In this paper, we study incremental supervised learning techniques and their applications, especially in the field of pattern recognition. The article presents an overview of the main concepts and supervised algorithms of incremental learning, including a synthesis of research studies in the field, focusing on neural networks, decision trees, and support vector machines.

Item: K-CAE: Image Classification Using Convolutional AutoEncoder Pre Training and K-means Clustering (Slovene Society Informatika, 2023). Authors: Aida Chefrour; Samia Drissi.
The work presented in this paper falls within the general framework of classification using deep learning and, more precisely, convolutional autoencoders. The convolutional autoencoder offers an alternative for processing high-dimensional data in order to facilitate its classification. In this paper, we propose incorporating convolutional autoencoders as a general unsupervised dimensionality reduction method that creates robust, compressed feature representations, better suited to storage and to being passed on to the classification stage, in order to improve k-means performance on image classification tasks. Experimental results on three image databases, MNIST, Fashion-MNIST, and CIFAR-10, show that the proposed method significantly outperforms deep clustering models in terms of clustering quality.

Item: Unsupervised Deep Learning: Taxonomy and Algorithms (Slovene Society Informatika, 2022). Authors: Aida Chefrour; Labiba Souici-Meslati.
Clustering is a fundamental challenge in many data-driven application fields and machine learning techniques. The data distribution determines the quality of the results and has a significant impact on clustering performance. As a result, deep neural networks can be used to learn more accurate data representations for clustering. Many recent studies have focused on employing deep neural networks to develop clustering-friendly representations, which has led to significant improvements in clustering performance. In this study, we present a systematic survey of clustering with deep learning. We then propose a taxonomy of deep clustering, along with representative algorithms for our overview. Finally, we discuss some promising future directions for clustering with deep learning and offer concluding remarks.
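For the K-CAE entry above, the sketch below illustrates the general recipe of clustering autoencoder codes with k-means: train a small convolutional autoencoder, encode the images, run k-means on the codes, and score the result with an external measure. It assumes TensorFlow/Keras and scikit-learn on MNIST; the architecture, brief training, and the use of normalized mutual information are assumptions rather than the paper's exact protocol.

```python
# Illustrative sketch only: convolutional autoencoder codes clustered with k-means,
# scored against the ground-truth digit labels via normalized mutual information.
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.cluster import KMeans
from sklearn.metrics import normalized_mutual_info_score

(x, y), _ = tf.keras.datasets.mnist.load_data()
x = x[..., None].astype("float32") / 255.0

# Encoder ending in a low-dimensional code vector per image.
encoder = models.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(16, 3, strides=2, padding="same", activation="relu"),
    layers.Conv2D(8, 3, strides=2, padding="same", activation="relu"),
    layers.Flatten(),
    layers.Dense(32, activation="relu"),
])
# Decoder mapping the 32-dimensional code back to a 28x28x1 image.
decoder = models.Sequential([
    layers.Dense(7 * 7 * 8, activation="relu"),
    layers.Reshape((7, 7, 8)),
    layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(1, 3, strides=2, padding="same", activation="sigmoid"),
])

# Unsupervised pre-training of the autoencoder (kept short for brevity).
cae = models.Sequential([encoder, decoder])
cae.compile(optimizer="adam", loss="mse")
cae.fit(x, x, epochs=1, batch_size=256)

# k-means on the learned codes, then an external clustering-quality measure.
codes = encoder.predict(x, batch_size=256)
clusters = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(codes)
print("NMI vs. ground truth:", normalized_mutual_info_score(y, clusters))
```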