A4 Peer-reviewed article in conference proceedings

Efficient Hold-Out for Subset of Regressors




Authors: Pahikkala T, Suominen H, Boberg J, Salakoski T

Editors: Kolehmainen Mikko, Toivanen Pekka, Beliczynski Bartlomiej

Conference name: 9th International Conference on Adaptive and Natural Computing Algorithms

Publication year: 2009

Journal: Lecture Notes in Computer Science

Book title: Proceedings of the 9th International Conference on Adaptive and Natural Computing Algorithms (ICANNGA'09)

Journal name in database: ADAPTIVE AND NATURAL COMPUTING ALGORITHMS

Journal acronym: LECT NOTES COMPUT SC

Volume: 5495

First page: 350

Last page: 359

Number of pages: 10

ISBN: 978-3-642-04920-0

ISSN: 0302-9743


Abstract
Hold-out and cross-validation are among the most useful methods for model selection and performance assessment of machine learning algorithms. In this paper, we present a computationally efficient algorithm for calculating the hold-out performance for sparse regularized least-squares (RLS) in case the method is already trained with the whole training set. The computational complexity of performing the hold-out is O(|H|^3 + |H|^2 n), where |H| is the size of the hold-out set and n is the number of basis vectors. The algorithm can thus be used to calculate various types of cross-validation estimates effectively. For example, when m is the number of training examples, the complexities of N-fold and leave-one-out cross-validation are O(m^3/N^2 + m^2 n/N) and O(mn), respectively. Further, since sparse RLS can be trained in O(mn^2) time for several regularization parameter values in parallel, the fast hold-out algorithm enables efficient selection of the optimal parameter value.
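The efficient cross-validation result described above is related to the classical leave-one-out shortcut for (dense, non-sparse) RLS, in which the leave-one-out predictions are obtained from a single full-data fit rather than by retraining m times. The following NumPy sketch illustrates that standard identity only; it is not the paper's subset-of-regressors hold-out algorithm, and the kernel, regularization value, and data are illustrative assumptions:

```python
import numpy as np

def loo_predictions(K, y, lam):
    # Classical RLS leave-one-out shortcut: with the "hat" matrix
    # S = K (K + lam*I)^{-1} and in-sample predictions yhat = S y,
    # the leave-one-out prediction for example i is
    #   yhat_{-i} = (yhat_i - S_ii * y_i) / (1 - S_ii),
    # computed from one full-data fit (no retraining loop).
    n = K.shape[0]
    S = K @ np.linalg.inv(K + lam * np.eye(n))
    yhat = S @ y
    d = np.diag(S)
    return (yhat - d * y) / (1.0 - d)

def loo_brute_force(K, y, lam):
    # Reference implementation: retrain with example i removed.
    n = len(y)
    out = np.empty(n)
    for i in range(n):
        idx = np.delete(np.arange(n), i)
        a = np.linalg.solve(K[np.ix_(idx, idx)] + lam * np.eye(n - 1), y[idx])
        out[i] = K[i, idx] @ a
    return out

# Small synthetic problem with a linear kernel (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
K = X @ X.T
y = rng.normal(size=20)

fast = loo_predictions(K, y, 1.0)
slow = loo_brute_force(K, y, 1.0)
print(np.allclose(fast, slow))
```

The shortcut and the brute-force loop agree because removing example i and refitting is algebraically equivalent to the rescaled residual formula; the same kind of matrix-identity reasoning, extended to whole hold-out blocks and to the sparse (subset-of-regressors) setting, is what yields the O(|H|^3 + |H|^2 n) complexity stated in the abstract.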



Last updated on 2024-26-11 at 14:38