Generalized vec trick for fast learning of pairwise kernel models




Viljanen Markus, Airola Antti, Pahikkala Tapio

Publisher: Springer
Year: 2022
Journal: Machine Learning (MACH LEARN)
Volume: 111, Issue: 2, Pages: 543-573 (31 pages)
ISSN: 0885-6125 (print), 1573-0565 (online)
DOI: https://doi.org/10.1007/s10994-021-06127-y

https://link.springer.com/article/10.1007/s10994-021-06127-y

https://research.utu.fi/converis/portal/detail/Publication/69083033

https://arxiv.org/abs/2009.01054



Pairwise learning corresponds to the supervised learning setting where the goal is to make predictions for pairs of objects. Prominent applications include predicting drug-target or protein-protein interactions, or customer-product preferences. In this work, we present a comprehensive review of pairwise kernels that have been proposed for incorporating prior knowledge about the relationship between the objects. Specifically, we consider the standard, symmetric and anti-symmetric Kronecker product kernels, the metric-learning, Cartesian and ranking kernels, as well as the linear, polynomial and Gaussian kernels. Recently, an O(nm + nq) time generalized vec trick algorithm, where n, m, and q denote the number of pairs, drugs and targets, was introduced for training kernel methods with the Kronecker product kernel. This was a significant improvement over previous O(n²) training methods, since in most real-world applications m, q << n. In this work we show how all the reviewed kernels can be expressed as sums of Kronecker products, allowing the use of the generalized vec trick for speeding up their computation. In the experiments, we demonstrate how the introduced approach allows scaling pairwise kernels to much larger data sets than previously feasible, and we provide an extensive comparison of the kernels on a number of biological interaction prediction tasks.
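The complexity claim in the abstract can be illustrated with a small sketch of the matrix-vector product that the generalized vec trick accelerates. The sketch below is a minimal NumPy illustration of the underlying idea, not the authors' implementation; the names gvt_matvec, drug_idx and target_idx are chosen here for illustration only. For n labeled drug-target pairs, the Kronecker product kernel matrix has entries Kd[d_i, d_j] * Kt[t_i, t_j], so a naive matrix-vector product costs O(n²); aggregating the pairs' contributions into an m x q table first brings the cost down to O(nm + nq).

```python
import numpy as np

def gvt_matvec(Kd, Kt, drug_idx, target_idx, v):
    """Compute u[i] = sum_j Kd[d_i, d_j] * Kt[t_i, t_j] * v[j]
    in O(n*m + n*q) time, without forming the n x n pairwise kernel.

    Kd: (m, m) drug kernel, Kt: (q, q) target kernel,
    drug_idx, target_idx: length-n index vectors defining the labeled pairs,
    v: length-n vector (e.g. dual coefficients).
    """
    m, q, n = Kd.shape[0], Kt.shape[0], len(v)
    # Step 1, O(n*m): scatter each pair's contribution into an m x q table,
    # A[a, b] = sum over pairs j with t_j == b of Kd[a, d_j] * v[j].
    A = np.zeros((m, q))
    for j in range(n):
        A[:, target_idx[j]] += Kd[:, drug_idx[j]] * v[j]
    # Step 2, O(n*q): read out u[i] = sum_b Kt[t_i, b] * A[d_i, b].
    u = np.empty(n)
    for i in range(n):
        u[i] = Kt[target_idx[i]] @ A[drug_idx[i]]
    return u

# Sanity check against the explicit O(n^2) pairwise kernel on toy data.
rng = np.random.default_rng(0)
m, q, n = 5, 7, 20
Xd, Xt = rng.normal(size=(m, 3)), rng.normal(size=(q, 3))
Kd, Kt = Xd @ Xd.T, Xt @ Xt.T                      # linear base kernels
drug_idx = rng.integers(0, m, size=n)
target_idx = rng.integers(0, q, size=n)
v = rng.normal(size=n)

K_pairs = Kd[np.ix_(drug_idx, drug_idx)] * Kt[np.ix_(target_idx, target_idx)]
assert np.allclose(gvt_matvec(Kd, Kt, drug_idx, target_idx, v), K_pairs @ v)
```

Per the abstract, the other reviewed pairwise kernels can be written as sums of Kronecker products, so a product with such a kernel matrix can be computed by applying a routine of this form to each summand and adding up the results.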
