A4 Refereed article in a conference publication

Unsupervised Multi-Class Regularized Least-Squares Classification




Authors Pahikkala T, Airola A, Gieseke F, Kramer O

Editors Zaki MJ, Siebes A, Yu JX, Goethals B, Webb G, Wu X

PublisherIEEE Computer Society

Publication year 2012

Journal IEEE International Conference on Data Mining

Book title The 12th IEEE International Conference on Data Mining (ICDM 2012)

First page 585

Last page 594

Number of pages 10

ISBN 978-1-4673-4649-8

ISSN 1550-4786

DOI https://doi.org/10.1109/ICDM.2012.71


Abstract

Regularized least-squares classification is one of the most promising
alternatives to standard support vector machines, with the desirable
property of closed-form solutions that can be obtained analytically and
efficiently. While the supervised, and mostly binary, case has received
tremendous attention in recent years, unsupervised multi-class settings
have not yet been considered. In this work we present an efficient
implementation of the unsupervised extension of the multi-class
regularized least-squares classification framework, which is, to the
best of the authors' knowledge, the first in the literature to address
this task. The resulting kernel-based framework efficiently combines
steepest descent strategies with powerful meta-heuristics for avoiding
local minima. The computational efficiency of the overall approach is
ensured through matrix algebra shortcuts that make efficient updates of
the intermediate candidate solutions possible. Our experimental
evaluation indicates the potential of the novel method and demonstrates
its superior clustering performance over a variety of competing methods
on real-world data sets.
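
The closed-form solution mentioned in the abstract is the standard kernel regularized least-squares solution A = (K + lambda*I)^{-1} Y, where K is the kernel matrix and Y a one-vs-all target matrix. The sketch below is not the authors' implementation; it is a minimal Python/NumPy illustration, assuming a Gaussian kernel and a naive greedy reassignment search (names such as greedy_label_search, rls_objective and the parameter lam are illustrative). In particular, it re-solves the linear system for every candidate labeling, which is exactly the cost the matrix algebra shortcuts described in the abstract are designed to avoid.

import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix for data X of shape (n_samples, n_features).
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * sq_dists)

def labels_to_targets(labels, n_classes):
    # One-vs-all encoding: +1 for the assigned class, -1 for all other classes.
    return np.where(np.arange(n_classes)[None, :] == labels[:, None], 1.0, -1.0)

def rls_dual_coefficients(K, Y, lam=1.0):
    # Closed-form dual solution of multi-class regularized least-squares:
    # A = (K + lambda * I)^{-1} Y.
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), Y)

def rls_objective(K, Y, lam=1.0):
    # RLS training objective for a candidate labeling Y:
    # ||K A - Y||_F^2 + lambda * trace(A^T K A), with A the closed-form solution.
    A = rls_dual_coefficients(K, Y, lam)
    residual = K @ A - Y
    return np.sum(residual**2) + lam * np.sum(A * (K @ A))

def greedy_label_search(K, n_classes, lam=1.0, n_restarts=5, seed=None):
    # Illustrative unsupervised search: start from random class assignments and
    # greedily reassign single points to whichever class lowers the RLS objective,
    # restarting several times to reduce the risk of poor local minima.
    # NOTE: each candidate evaluation re-solves the linear system from scratch,
    # which is the naive approach the paper's matrix algebra shortcuts avoid.
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    best_labels, best_obj = None, np.inf
    for _ in range(n_restarts):
        labels = rng.integers(0, n_classes, size=n)
        cur_obj = rls_objective(K, labels_to_targets(labels, n_classes), lam)
        improved = True
        while improved:
            improved = False
            for i in range(n):
                for c in range(n_classes):
                    if c == labels[i]:
                        continue
                    candidate = labels.copy()
                    candidate[i] = c
                    obj = rls_objective(K, labels_to_targets(candidate, n_classes), lam)
                    if obj < cur_obj:
                        cur_obj, labels, improved = obj, candidate, True
        if cur_obj < best_obj:
            best_obj, best_labels = cur_obj, labels.copy()
    return best_labels, best_obj

# Example usage on hypothetical data:
# X = np.random.default_rng(0).normal(size=(60, 2))
# K = rbf_kernel(X, gamma=0.5)
# labels, obj = greedy_label_search(K, n_classes=3, lam=1.0, seed=0)
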


Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.




