A1 Peer-reviewed original article in a scientific journal
On Learning and Cross-Validation with Decomposed Nyström Approximation of Kernel Matrix
Authors: Antti Airola, Tapio Pahikkala, Tapio Salakoski
Publisher: SPRINGER
Publication year: 2011
Journal: Neural Processing Letters
Journal name in database: NEURAL PROCESSING LETTERS
Journal acronym: NEURAL PROCESS LETT
Number in series: 1
Volume: 33
Issue: 1
First page: 17
Last page: 30
Number of pages: 14
ISSN: 1370-4621
DOI: https://doi.org/10.1007/s11063-010-9159-4
The high computational cost of training kernel methods to solve nonlinear tasks limits their applicability. However, several fast training methods have recently been introduced for solving linear learning tasks. These can be used to solve nonlinear tasks by mapping the input data nonlinearly to a low-dimensional feature space. In this work, we consider the mapping induced by decomposing the Nyström approximation of the kernel matrix. We collect prior results and derive new ones to show how to efficiently train, make predictions with, and perform cross-validation for reduced set approximations of learning algorithms, given an efficient linear solver. Specifically, we present an efficient method for removing basis vectors from the mapping, which we show to be important when performing cross-validation.
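For intuition, the following is a minimal NumPy sketch of the kind of feature map the abstract refers to: the Nyström approximation K ~ K_nm K_mm^{-1} K_mn is decomposed as Phi = K_nm K_mm^{-1/2}, so that Phi Phi^T approximates K, and a linear learner is then trained on Phi. This is an illustration under stated assumptions, not the paper's exact algorithms: the Gaussian kernel, the uniformly sampled basis vectors, the eigendecomposition-based square root, and the plain ridge regression solver are all illustrative choices.

import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel between the rows of X and Z.
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_feature_map(X, basis, gamma=1.0, reg=1e-8):
    # Decompose the Nystrom approximation K ~ K_nm K_mm^{-1} K_mn into
    # an explicit low-dimensional map Phi = K_nm K_mm^{-1/2},
    # so that Phi @ Phi.T approximates the full kernel matrix K.
    K_mm = gaussian_kernel(basis, basis, gamma)
    # Symmetric eigendecomposition; a small ridge guards against a
    # numerically singular K_mm.
    w, V = np.linalg.eigh(K_mm + reg * np.eye(len(basis)))
    K_mm_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    K_nm = gaussian_kernel(X, basis, gamma)
    return K_nm @ K_mm_inv_sqrt

# Toy regression: ridge regression in the Nystrom feature space,
# solved with a cheap linear solver in the m-dimensional space.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(500)

basis = X[rng.choice(len(X), size=20, replace=False)]  # reduced set of basis vectors
Phi = nystrom_feature_map(X, basis, gamma=0.5)

lam = 1e-2  # ridge regularization parameter
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
print("training MSE:", np.mean((Phi @ w - y) ** 2))

Because training happens in the m-dimensional feature space (m = number of basis vectors, here 20) rather than the n-dimensional kernel space, the linear solve costs O(nm^2 + m^3) instead of the O(n^3) of exact kernel training.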