A1 Refereed original research article in a scientific journal

Different Coefficients for Studying Dependence




Authors: Rainio, Oona

Publisher: SPRINGER

Publication year: 2022

Journal: Sankhya B: The Indian Journal of Statistics

Journal name in source: SANKHYA-SERIES B-APPLIED AND INTERDISCIPLINARY STATISTICS

Journal acronym: SANKHYA SER B

Number of pages: 20

ISSN: 0976-8386

eISSN: 0976-8394

DOI: https://doi.org/10.1007/s13571-022-00295-0

Web address: https://doi.org/10.1007/s13571-022-00295-0

Self-archived copy’s web address: https://research.utu.fi/converis/portal/detail/Publication/176490066


Abstract
Through computer simulations, we study several different measures of dependence, including Pearson's and Spearman's correlation coefficients, the maximal correlation, the distance correlation, a function of the mutual information called the information coefficient of correlation, and the maximal information coefficient (MIC). We compare how well these coefficients fulfill the criteria of generality, power, and equitability. Furthermore, we consider how the exact type of dependence, the amount of noise, and the number of observations affect their performance. According to our results, the maximal correlation is often the best choice among these measures of dependence because it can recognize both functional and non-functional types of dependence, fulfills a certain definition of equitability relatively well, and retains very high statistical power as the noise grows, provided there are enough observations. While Pearson's correlation cannot detect symmetric non-monotonic dependence, it has the highest statistical power for recognizing linear and non-linear but monotonic dependence. The MIC is very sensitive to noise and therefore has the weakest statistical power.
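
To illustrate the abstract's point that Pearson's (and Spearman's) correlation misses symmetric non-monotonic dependence while the distance correlation detects it, here is a minimal Python sketch. It is not from the paper: the parabolic relationship, sample size, and noise level are illustrative assumptions, and the distance correlation is implemented directly from the Székely–Rizzo double-centering formula rather than taken from any library used by the author.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def distance_correlation(x, y):
    """Sample distance correlation, computed from double-centered
    pairwise distance matrices (Székely–Rizzo definition)."""
    def double_centered(z):
        d = np.abs(z[:, None] - z[None, :])               # pairwise distances
        return d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()
    a, b = double_centered(x), double_centered(y)
    dcov2 = (a * b).mean()                                # squared distance covariance
    denom = np.sqrt((a * a).mean() * (b * b).mean())      # squared distance variances
    return np.sqrt(dcov2 / denom) if denom > 0 else 0.0

rng = np.random.default_rng(0)
n, noise = 500, 0.3                                       # illustrative choices
x = rng.uniform(-1.0, 1.0, n)
y = x**2 + noise * rng.standard_normal(n)                 # symmetric, non-monotonic

print("Pearson :", round(pearsonr(x, y)[0], 3))           # near 0
print("Spearman:", round(spearmanr(x, y)[0], 3))          # near 0
print("dCor    :", round(distance_correlation(x, y), 3))  # clearly positive
```

On this setup, both rank-based and linear correlation coefficients come out near zero while the distance correlation is clearly positive, matching the behaviour described in the abstract.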

Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.





Last updated on 2024-11-26 at 22:13