Detailed Programme

Clustering and Unsupervised Learning (CUL)

CHAIR: TATIANA TAMBOURATZIS

Time: Wednesday, March 23rd, 11h00-12h40

Paper ID   Title
CUL-1      Improved clustering by rotation of cluster centres
CUL-2      Hierarchical Growing Neural Gas
CUL-3      A Fuzzy Clustering Algorithm using Cellular Learning Automata based Evolutionary Algorithm
CUL-4      Estimating the number of clusters from distributional results of partitioning a given data set
CUL-5      AUDyC Neural Network using a new Gaussian Densities Merge Mechanism


CUL-1
 
Title: Improved clustering by rotation of cluster centres
Author(s): D. W. Pearson, M. Batton-Hubert
Abstract: In this paper we present a method that improves a subtractive clustering model by modifying its centres. In order to keep a centre within certain bounds, it is modified by rotating it.
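
As an illustration of the kind of centre modification described in this abstract (the paper's own procedure is not reproduced here), the following minimal Python sketch rotates a cluster centre about the data mean in a chosen coordinate plane; preserving the centre's distance from the mean is one simple way of keeping the modified centre within a bounded region. All names and parameters are hypothetical.

import numpy as np

def rotate_centre(centre, data_mean, i, j, theta):
    # Illustrative only: rotate a cluster centre about the data mean by angle
    # theta (radians) in the coordinate plane spanned by dimensions i and j.
    # The rotation preserves the centre's distance from the mean.
    v = centre - data_mean                 # centre relative to the data mean
    c, s = np.cos(theta), np.sin(theta)
    vi, vj = v[i], v[j]
    v = v.copy()
    v[i] = c * vi - s * vj                 # Givens rotation in the (i, j) plane
    v[j] = s * vi + c * vj
    return data_mean + v

# Example: nudge a centre obtained from subtractive clustering by 5 degrees.
X = np.random.default_rng(0).normal(size=(200, 2))
new_centre = rotate_centre(np.array([1.0, 0.5]), X.mean(axis=0),
                           i=0, j=1, theta=np.radians(5.0))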


CUL-2
 
Title: Hierarchical Growing Neural Gas
Author(s): K.A.J. Doherty, R.G. Adams, N. Davey
Abstract: This paper describes TreeGNG, a top-down unsupervised learning method that produces hierarchical classification schemes. TreeGNG is an extension to the Growing Neural Gas algorithm that maintains a time history of the learned topological mapping. TreeGNG is able to correct poor decisions made during the early phases of the construction of the tree, and provides the novel ability to influence the general shape and form of the learned hierarchy.
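
TreeGNG builds on the standard Growing Neural Gas algorithm. As background only (the tree-maintenance logic of TreeGNG is not shown, and all names below are hypothetical), a minimal Python sketch of one GNG adaptation step:

import numpy as np

def gng_step(x, units, edges, ages, eps_b=0.05, eps_n=0.006, age_max=50):
    # One adaptation step of standard Growing Neural Gas: adapt the winner and
    # its topological neighbours, refresh the winner/runner-up edge, and drop
    # stale edges. Periodic node insertion/removal and error tracking omitted.
    # units: (k, d) array; edges: set of frozenset({a, b}); ages: dict edge -> age.
    dists = np.linalg.norm(units - x, axis=1)
    s1, s2 = (int(k) for k in np.argsort(dists)[:2])   # winner and runner-up
    units[s1] += eps_b * (x - units[s1])               # move winner toward the input
    for e in [e for e in edges if s1 in e]:
        ages[e] += 1                                   # age the winner's edges
        other = next(iter(e - {s1}))
        units[other] += eps_n * (x - units[other])     # move neighbours slightly
    e12 = frozenset({s1, s2})
    edges.add(e12)
    ages[e12] = 0                                      # (re)connect winner and runner-up
    for e in [e for e in edges if ages[e] > age_max]:
        edges.discard(e)                               # remove edges that are too old
        ages.pop(e)
    return units, edges, ages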


CUL-3
 
Title: A Fuzzy Clustering Algorithm using Cellular Learning Automata based Evolutionary Algorithm
Author(s): R. Rastegar, A. Hariri, M. Meybodi
Abstract: In this paper, a new fuzzy clustering algorithm that uses cellular learning automata based evolutionary computing (CLA-EC) is proposed. The CLA-EC is a model obtained by combining the concepts of cellular learning automata and evolutionary algorithms. The CLA-EC is used to search for cluster centers in such a way as to minimize the clustering criterion. The simulation results indicate that the proposed algorithm produces clusters of acceptable quality with respect to the clustering criterion and performs better than the C-means algorithm.
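
For context, the clustering criterion referred to above is typically the fuzzy c-means objective. The Python sketch below is purely illustrative (the CLA-EC machinery itself is not reproduced): it evaluates that objective for a candidate set of centres, i.e. the kind of fitness an evolutionary search over centres could minimize.

import numpy as np

def fuzzy_cmeans_criterion(X, centres, m=2.0, eps=1e-12):
    # J_m = sum_i sum_k u_ik^m ||x_i - c_k||^2, with memberships u_ik derived
    # from the squared distances in the usual fuzzy c-means way.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2) + eps  # (n, c)
    u = 1.0 / (d2 ** (1.0 / (m - 1.0)))
    u /= u.sum(axis=1, keepdims=True)      # memberships sum to 1 for each point
    return float(((u ** m) * d2).sum())

# Example: score a random candidate centre set; lower is better.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
print(fuzzy_cmeans_criterion(X, rng.normal(size=(3, 2))))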


CUL-4
 
Title: Estimating the number of clusters from distributional results of partitioning a given data set
Author(s): U. Möller
Abstract: When estimating the optimal value of the number of clusters, C, of a given data set, one typically uses, for each candidate value of C, a single (final) result of the clustering algorithm. If distributional data of size T are used, they typically come from T data sets obtained, e.g., by a bootstrapping technique. Here a new approach is introduced that utilizes distributional data generated by clustering the original data T times, in the framework of cost function optimization and cluster validity indices. Results of this method are reported for model data (100 realizations) and gene expression data. The probability of correctly estimating the number of clusters was often higher than that reported recently for several classical methods and a new statistical approach (Clest).
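
A minimal sketch of the general idea (not the paper's specific method or indices): cluster the data T times for each candidate C, keep the whole distribution of a validity index, and choose C from that distribution. The silhouette index and k-means are used here only as stand-ins, and all parameter choices are hypothetical.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def estimate_n_clusters(X, candidates=range(2, 8), T=20, seed=0):
    # For each candidate C, run the clustering algorithm T times and collect
    # the distribution of a validity index instead of a single final value.
    rng = np.random.default_rng(seed)
    scores = {}
    for c in candidates:
        vals = []
        for _ in range(T):
            labels = KMeans(n_clusters=c, n_init=1,
                            random_state=int(rng.integers(1 << 31))).fit_predict(X)
            vals.append(silhouette_score(X, labels))
        scores[c] = np.asarray(vals)
    # Pick the candidate whose index distribution has the best median.
    best = max(scores, key=lambda c: float(np.median(scores[c])))
    return best, scores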


CUL-5
 
Title: AUDyC Neural Network using a new Gaussian Densities Merge Mechanism
Author(s): Habiboulaye Amadou Boubacar, Stéphane Lecoeuche, Salah Maouche
Abstract: In the context of evolutionary data classification, dynamical modelling techniques are useful for continuously learning cluster models. Dedicated to on-line clustering, the AUDyC (Auto-adaptive and Dynamical Clustering) algorithm is an unsupervised neural network with auto-adaptive abilities in non-stationary environments. These abilities are based on specific learning rules organized into three stages: “Classification”, “Evaluation” and “Fusion”. In this paper, we propose a new density merge mechanism that improves the “Fusion” stage in order to avoid some local optima drawbacks of Gaussian fitting. The novelty of our approach is to use an ambiguity rule from fuzzy modelling together with new merge acceptance criteria. Our approach can be generalized to any fuzzy classification method that uses Gaussian models. Some experiments are presented to show the efficiency of our approach in circumventing the local optima problems of the AUDyC NN.
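
For background on what merging Gaussian densities involves (the ambiguity rule and the acceptance criteria proposed in the paper are not reproduced, and the function below is purely illustrative), a standard moment-matched merge of two weighted Gaussian components in Python:

import numpy as np

def merge_gaussians(w1, m1, S1, w2, m2, S2):
    # Moment-matched merge of two weighted Gaussian components (weight, mean,
    # covariance): the merged component preserves the pair's first two moments.
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1, d2 = (m1 - m)[:, None], (m2 - m)[:, None]
    S = (w1 * (S1 + d1 @ d1.T) + w2 * (S2 + d2 @ d2.T)) / w
    return w, m, S

# Example: merge two overlapping 2-D components.
w, m, S = merge_gaussians(0.4, np.array([0.0, 0.0]), np.eye(2),
                          0.6, np.array([0.5, 0.0]), np.eye(2))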