A1 Peer-reviewed original article in a scientific journal

On Learning and Cross-Validation with Decomposed Nyström Approximation of Kernel Matrix




Authors: Antti Airola, Tapio Pahikkala, Tapio Salakoski

Publisher: SPRINGER

Publication year: 2011

Journal: Neural Processing Letters

Journal name in database: NEURAL PROCESSING LETTERS

Journal acronym: NEURAL PROCESS LETT

Number in series: 1

Volume: 33

Issue: 1

First page: 17

Last page: 30

Number of pages: 14

ISSN: 1370-4621

DOI: https://doi.org/10.1007/s11063-010-9159-4


Abstract

The high computational cost of training kernel methods to solve nonlinear tasks limits their applicability. However, several fast training methods have recently been introduced for solving linear learning tasks. These can be used to solve nonlinear tasks by mapping the input data nonlinearly to a low-dimensional feature space. In this work, we consider the mapping induced by decomposing the Nyström approximation of the kernel matrix. We collect together prior results and derive new ones to show how to efficiently train, make predictions with, and perform cross-validation on reduced set approximations of learning algorithms, given an efficient linear solver. Specifically, we present an efficient method for removing basis vectors from the mapping, which we show to be important when performing cross-validation.
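As a concrete illustration of the mapping described above, the following minimal Python/NumPy sketch was written for this record and is not code from the paper. Given a reduced set of basis vectors, it decomposes the Nyström approximation K ≈ K_nm K_mm^{-1} K_mn into an explicit feature matrix Phi with Phi Phi^T equal to that approximation, on which any fast linear solver can then be trained. The function names, the RBF kernel choice, and the closed-form ridge regression solver are illustrative assumptions; the paper's efficient basis-vector removal and cross-validation computations are not reproduced here.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Y.
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_feature_map(X, basis, gamma=1.0, tol=1e-12):
    # Decompose the Nystrom approximation K ~ K_nm K_mm^{-1} K_mn.
    # With K_mm = U diag(s) U^T, the map Phi = K_nm U diag(s)^{-1/2}
    # satisfies Phi Phi^T = K_nm K_mm^{-1} K_mn, so a linear learner
    # trained on Phi approximates the full kernel method.
    K_mm = rbf_kernel(basis, basis, gamma)       # m x m kernel on the basis vectors
    K_nm = rbf_kernel(X, basis, gamma)           # n x m cross-kernel
    s, U = np.linalg.eigh(K_mm)                  # eigendecomposition of K_mm
    keep = s > tol                               # drop numerically zero directions
    return K_nm @ U[:, keep] / np.sqrt(s[keep])  # n x m' explicit feature matrix

# Usage sketch: sample m landmark rows as the reduced basis set, then
# train a fast linear solver (here closed-form ridge regression) on Phi.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
basis = X[rng.choice(500, size=50, replace=False)]
Phi = nystrom_feature_map(X, basis, gamma=0.5)
w = np.linalg.solve(Phi.T @ Phi + 1.0 * np.eye(Phi.shape[1]), Phi.T @ y)
predictions = Phi @ w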




Last updated on 2024-11-26 at 23:41