A4 Refereed article in a conference publication
Unsupervised Multi-Class Regularized Least-Squares Classification
Authors: Pahikkala T, Airola A, Gieseke F, Kramer O
Editors: Zaki MJ, Siebes A, Yu JX, Goethals B, Webb G, Wu X
Publisher: IEEE Computer Society
Publication year: 2012
Journal: IEEE International Conference on Data Mining
Book title: The 12th IEEE International Conference on Data Mining (ICDM 2012)
First page: 585
Last page: 594
Number of pages: 10
ISBN: 978-1-4673-4649-8
ISSN: 1550-4786
DOI: https://doi.org/10.1109/ICDM.2012.71
Regularized least-squares classification is one of the most promising
alternatives to standard support vector machines, with the desirable
property of closed-form solutions that can be computed analytically and
efficiently. While the supervised, mostly binary case has received
tremendous attention in recent years, unsupervised multi-class settings
have not yet been considered. In this work we present an efficient
implementation of the unsupervised extension of the multi-class
regularized least-squares classification framework, which is, to the
best of the authors' knowledge, the first in the literature to address
this task. The resulting kernel-based framework efficiently combines
steepest-descent strategies with powerful meta-heuristics for avoiding
local minima. The computational efficiency of the overall approach is
ensured through matrix-algebra shortcuts that make efficient updates of
the intermediate candidate solutions possible. Our experimental
evaluation indicates the potential of the novel method and demonstrates
its superior clustering performance over a variety of competing methods
on real-world data sets.
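The closed-form property the abstract refers to can be illustrated with the standard supervised multi-class kernel RLS solution, where the dual coefficients are obtained by a single linear solve, A = (K + λI)^{-1} Y. This is a minimal sketch of that supervised building block only, not the authors' unsupervised algorithm; the kernel choice, regularization value, and toy data below are assumptions for illustration:

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between two sets of points
    d = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def rls_fit(K, Y, lam=0.1):
    # Closed-form dual coefficients: A = (K + lam * I)^{-1} Y
    return np.linalg.solve(K + lam * np.eye(K.shape[0]), Y)

# Toy data: two well-separated classes (hypothetical example)
X = np.array([[0.1, 0.1], [0.2, 0.3], [5.0, 5.0], [5.1, 4.9]])
y = np.array([0, 0, 1, 1])
Y = np.eye(2)[y]                  # one-vs-all label matrix
K = rbf_kernel(X, X)
A = rls_fit(K, Y)
pred = (K @ A).argmax(axis=1)     # predicted classes on the training points
```

Because training reduces to one linear system, re-evaluating candidate label assignments (as the unsupervised search must do repeatedly) can exploit matrix-algebra update rules rather than solving from scratch each time.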
This is an electronic reprint of the original article.