Sliced Inverse Regression in Metric Spaces

Joni Virta, Kuang-Yao Lee, Lexin Li

Publisher: Statistica Sinica
Year: 2022
Journal: Statistica Sinica (STAT SINICA)
Volume: 32
Issue: SI
Pages: 2315-2337
Number of pages: 23
ISSN: 1017-0405
eISSN: 1996-8507
DOI: https://doi.org/10.5705/ss.202022.0097

https://www3.stat.sinica.edu.tw/statistica/j32n31/J32n3102/J32n3102.html

https://research.utu.fi/converis/portal/detail/Publication/176797275

https://arxiv.org/abs/2206.11511



In this article, we propose a general nonlinear sufficient dimension reduction (SDR) framework for settings in which both the predictor and the response lie in general metric spaces. We construct reproducing kernel Hilbert spaces with kernels that are fully determined by the distance functions of the metric spaces, and then leverage the inherent structures of these spaces to define a nonlinear SDR framework. We adapt classical sliced inverse regression within this framework to metric-space data. Next, we build an estimator based on the corresponding linear operators and show that it recovers the regression information in an unbiased manner. We derive the estimator both at the operator level and under a coordinate system, and establish its convergence rate. Lastly, we illustrate the proposed method using synthetic and real data sets that exhibit non-Euclidean geometry.
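The estimator described in the abstract is defined at the level of linear operators between distance-induced reproducing kernel Hilbert spaces. Purely as an illustration of the general idea, the sketch below shows one way a sample-level, distance-kernel version of sliced inverse regression could be set up in Python. Every concrete choice here is an assumption of this sketch and not the authors' implementation: the Gaussian-of-distance kernel with a median-bandwidth heuristic, the ridge regularization, slicing by precomputed labels (for example, from clustering the pairwise response distances), and the hypothetical function names distance_kernel and metric_kernel_sir.

    import numpy as np

    def distance_kernel(D, sigma=None):
        """Gaussian-of-distance kernel built from a pairwise distance matrix.
        (Assumption of this sketch: the paper only requires the kernel to be
        fully determined by the metric, not this particular choice.)"""
        if sigma is None:
            sigma = np.median(D[D > 0])          # median heuristic for bandwidth
        return np.exp(-D ** 2 / (2 * sigma ** 2))

    def metric_kernel_sir(DX, slice_labels, n_components=2, ridge=1e-6):
        """Hypothetical sample-level sketch of a distance-kernel SIR step.

        DX           : (n, n) pairwise distances between the predictors.
        slice_labels : length-n array assigning each observation to a slice,
                       e.g. obtained by clustering pairwise response distances.
        Returns the expansion coefficients (n, n_components) of the leading
        directions in the span of the centered kernel features.
        """
        n = DX.shape[0]
        K = distance_kernel(DX)
        H = np.eye(n) - np.ones((n, n)) / n
        Kc = H @ K @ H                            # doubly centered kernel matrix

        # Between-slice covariance of the slice means of the kernel features.
        M = np.zeros((n, n))
        for s in np.unique(slice_labels):
            idx = np.where(slice_labels == s)[0]
            m = Kc[:, idx].mean(axis=1)           # mean feature within slice s
            M += (len(idx) / n) * np.outer(m, m)

        # Generalized eigenproblem M v = lambda (Kc Kc + ridge I) v,
        # a kernelized analogue of comparing Cov(E[X | slice]) with Cov(X).
        A = Kc @ Kc + ridge * np.eye(n)
        eigvals, eigvecs = np.linalg.eig(np.linalg.solve(A, M))
        order = np.argsort(-eigvals.real)[:n_components]
        return eigvecs[:, order].real

For a scalar response, the slice labels reduce to the usual binning of the response range; for a metric-space response, clustering the response distance matrix plays the same role in this sketch.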

Last updated on 2024-11-26 at 12:24