A1 Refereed original article in a scientific journal

Different Coefficients for Studying Dependence




Authors: Rainio Oona

Publisher: SPRINGER

Publication year: 2022

Journal: Sankhya B: The Indian Journal of Statistics

Journal name in the database: SANKHYA-SERIES B-APPLIED AND INTERDISCIPLINARY STATISTICS

Journal acronym: SANKHYA SER B

Number of pages: 20

ISSN: 0976-8386

eISSN: 0976-8394

DOI: https://doi.org/10.1007/s13571-022-00295-0

Web address: https://doi.org/10.1007/s13571-022-00295-0

Self-archived copy's web address: https://research.utu.fi/converis/portal/detail/Publication/176490066


Abstract
Through computer simulations, we study several different measures of dependence, including Pearson's and Spearman's correlation coefficients, the maximal correlation, the distance correlation, a function of the mutual information called the information coefficient of correlation, and the maximal information coefficient (MIC). We compare how well these coefficients fulfill the criteria of generality, power, and equitability. Furthermore, we consider how the exact type of dependence, the amount of noise, and the number of observations affect their performance. According to our results, the maximal correlation is often the best choice among these measures of dependence because it can recognize both functional and non-functional types of dependence, fulfills a certain definition of equitability relatively well, and retains very high statistical power as the noise grows, provided there are enough observations. While Pearson's correlation cannot detect symmetric non-monotonic dependence, it has the highest statistical power for recognizing linear and non-linear but monotonic dependence. The MIC is very sensitive to noise and therefore has the weakest statistical power.
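The following Python sketch (not the article's simulation code) illustrates the last point in a minimal way: for a symmetric, non-monotonic relationship such as y = x^2 + noise, Pearson's and Spearman's coefficients stay near zero while the distance correlation clearly detects the dependence. The sample size, noise level, and the hand-rolled distance-correlation implementation are illustrative assumptions, not values or code from the paper.

import numpy as np
from scipy.stats import pearsonr, spearmanr

def distance_correlation(x, y):
    # Biased sample distance correlation of two 1-D samples
    # (Szekely, Rizzo and Bakirov, 2007).
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])                # pairwise distance matrices
    b = np.abs(y[:, None] - y[None, :])
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()  # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()                             # squared distance covariance
    return np.sqrt(dcov2 / np.sqrt((A * A).mean() * (B * B).mean()))

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
y = x**2 + rng.normal(0.0, 0.1, 500)  # symmetric, non-monotonic dependence

print("Pearson: ", round(pearsonr(x, y)[0], 3))           # near 0 despite dependence
print("Spearman:", round(spearmanr(x, y)[0], 3))          # near 0 as well
print("dCor:    ", round(distance_correlation(x, y), 3))  # clearly positive

Because the relationship is symmetric about x = 0, the positive and negative contributions to the covariance cancel, so both rank-based and linear correlations stay near zero; the distance correlation, by construction, is zero only under independence and therefore picks the dependence up.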

Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.




