A1 Peer-reviewed original article in a scientific journal

Supervised invariant coordinate selection




Authors: Liski E, Nordhausen K, Oja H

Publisher: Taylor & Francis LTD

Publication year: 2014

Journal: Statistics

Journal name in database: STATISTICS

Journal acronym: Statistics

Volume: 48

Issue: 4

First page: 711

Last page: 731

Number of pages: 21

ISSN: 0233-1888

DOI: https://doi.org/10.1080/02331888.2013.800067


Abstract

Dimension reduction plays an important role in high-dimensional data analysis. Principal component analysis, independent component analysis, and sliced inverse regression (SIR) are well known but very different analysis tools for dimension reduction. It appears that these three approaches can all be seen as a comparison of two different scatter matrices S1 and S2. The components for dimension reduction are then given by the eigenvectors of S1^{-1}S2. In SIR, the second scatter matrix is supervised, and therefore the choice of the components is based on the dependence between the observed random vector and a response variable. Based on these notions, we extend invariant coordinate selection (ICS), allowing the second scatter matrix S2 to be supervised; supervised ICS can then be used in supervised dimension reduction. It is remarkable that many supervised dimension reduction methods proposed in the literature, such as linear discriminant analysis, canonical correlation analysis, SIR, sliced average variance estimation, directional regression, and principal Hessian directions, can be reformulated in this way. Several families of supervised scatter matrices are discussed, and their use in supervised dimension reduction is illustrated with a real data example and simulations.
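The comparison of two scatter matrices described in the abstract can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: S1 is the ordinary covariance matrix, S2 is a simple SIR-style supervised scatter (the weighted covariance of the slice means of the centred data after slicing on the response), and the candidate directions are the eigenvectors of S1^{-1}S2. The function name `supervised_ics` and the slicing scheme are choices made here for illustration only.

```python
import numpy as np

def supervised_ics(X, y, n_slices=5):
    """Sketch of supervised ICS: compare the ordinary covariance S1 with a
    supervised, SIR-style scatter S2 and return the eigenvalues and
    eigenvectors of S1^{-1} S2 (illustrative only, not the paper's code)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    Xc = X - X.mean(axis=0)
    S1 = np.cov(Xc, rowvar=False)              # ordinary covariance matrix
    # SIR-style supervised scatter: slice the observations on y, then take
    # the weighted covariance of the within-slice means of X.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    means = np.array([Xc[s].mean(axis=0) for s in slices])
    weights = np.array([len(s) for s in slices]) / len(y)
    S2 = (means.T * weights) @ means
    # Eigenvectors of S1^{-1} S2, sorted by decreasing eigenvalue, give the
    # candidate directions for supervised dimension reduction.
    evals, evecs = np.linalg.eig(np.linalg.solve(S1, S2))
    idx = np.argsort(evals.real)[::-1]
    return evals.real[idx], evecs.real[:, idx]
```

For data in which the response depends on only one linear combination of X, the leading eigenvector should roughly recover that direction, mirroring how SIR finds effective dimension reduction directions.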




Last updated on 2024-11-26 at 19:28