A1 Refereed original research article in a scientific journal
Different Coefficients for Studying Dependence
Authors: Rainio Oona
Publisher: SPRINGER
Publication year: 2022
Journal: Sankhya B: The Indian Journal of Statistics
Journal name in source: SANKHYA-SERIES B-APPLIED AND INTERDISCIPLINARY STATISTICS
Journal acronym: SANKHYA SER B
Number of pages: 20
ISSN: 0976-8386
eISSN: 0976-8394
DOI: https://doi.org/10.1007/s13571-022-00295-0
Web address: https://doi.org/10.1007/s13571-022-00295-0
Self-archived copy’s web address: https://research.utu.fi/converis/portal/detail/Publication/176490066
Through computer simulations, we study several different measures of dependence, including Pearson's and Spearman's correlation coefficients, the maximal correlation, the distance correlation, a function of the mutual information called the information coefficient of correlation, and the maximal information coefficient (MIC). We compare how well these coefficients fulfill the criteria of generality, power, and equitability. Furthermore, we consider how the exact type of dependence, the amount of noise, and the number of observations affect their performance. According to our results, the maximal correlation is often the best choice among these measures of dependence because it recognizes both functional and non-functional types of dependence, fulfills a certain definition of equitability relatively well, and retains very high statistical power as the noise grows, provided there are enough observations. While Pearson's correlation does not detect symmetric non-monotonic dependence, it has the highest statistical power for recognizing linear and non-linear but monotonic dependence. The MIC is very sensitive to noise and therefore has the weakest statistical power.
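For readers who want to experiment with the coefficients compared in the article, the following is a minimal sketch, not the paper's simulation code: it evaluates four of the measures on two simulated relationships, a linear one and a symmetric non-monotonic one. The sample size, the noise level, and the use of SciPy/scikit-learn estimators are assumptions made here for illustration, and the maximal correlation and the MIC are omitted because they require specialised estimators (e.g., the ACE algorithm and the minepy package).

```python
# Sketch comparing dependence coefficients on simulated noisy data.
# Assumes NumPy, SciPy, and scikit-learn are installed; the sample
# size and noise level below are illustrative choices, not the paper's.
import numpy as np
from scipy import stats
from sklearn.feature_selection import mutual_info_regression

def distance_correlation(x, y):
    """Sample distance correlation of two 1-D arrays (Szekely et al., 2007)."""
    a = np.abs(x[:, None] - x[None, :])        # pairwise distance matrices
    b = np.abs(y[:, None] - y[None, :])
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()   # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()                     # squared distance covariance
    return np.sqrt(dcov2 / np.sqrt((A * A).mean() * (B * B).mean()))

def information_coefficient(x, y):
    """Linfoot's information coefficient r_I = sqrt(1 - exp(-2 I(X,Y))),
    with the mutual information I(X,Y) estimated by scikit-learn."""
    mi = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]
    return np.sqrt(1.0 - np.exp(-2.0 * mi))

rng = np.random.default_rng(0)
n, noise = 500, 0.3                            # assumed sample size and noise
x = rng.uniform(-1.0, 1.0, n)
examples = {
    "linear":    x + noise * rng.standard_normal(n),
    "quadratic": x**2 + noise * rng.standard_normal(n),  # symmetric, non-monotonic
}
for name, y in examples.items():
    print(f"{name}:")
    print(f"  Pearson   {stats.pearsonr(x, y)[0]: .3f}")
    print(f"  Spearman  {stats.spearmanr(x, y)[0]: .3f}")
    print(f"  dCor      {distance_correlation(x, y): .3f}")
    print(f"  r_I       {information_coefficient(x, y): .3f}")
```

On the quadratic example, Pearson's and Spearman's coefficients should land near zero while the distance correlation and the information coefficient remain clearly positive, illustrating the generality gap described in the abstract.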
Downloadable publication: This is an electronic reprint of the original article.