A1 Peer-reviewed original article in a scientific journal

Efficient regularized least-squares algorithms for conditional ranking on relational data




Authors: Pahikkala T, Airola A, Stock M, De Baets B, Waegeman W

Publisher: SPRINGER

Publication year: 2013

Journal: Machine Learning

Journal name in database: MACHINE LEARNING

Journal acronym: MACH LEARN

Number in series: 2-3

Volume: 93

Issue: 2-3

First page: 321

Last page: 356

Number of pages: 36

ISSN: 0885-6125

DOI: https://doi.org/10.1007/s10994-013-5354-7


Abstract
In domains such as bioinformatics, information retrieval and social network analysis, many learning tasks aim to infer a ranking of objects conditioned on a particular target object. We present a general kernel framework for learning conditional rankings from various types of relational data, where rankings can be conditioned on unseen data objects. We propose efficient algorithms for conditional ranking that optimize squared regression and ranking loss functions. We show theoretically that learning with the ranking loss is likely to generalize better than learning with the regression loss. Further, we prove that symmetry or reciprocity properties of relations can be efficiently enforced in the learned models. Experiments on synthetic and real-world data illustrate that the proposed methods deliver state-of-the-art performance in terms of predictive power and computational efficiency. Moreover, we show empirically that incorporating symmetry or reciprocity properties can improve the generalization performance.
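As a rough illustration of the setting described above, the following minimal sketch (not the authors' reference implementation) learns relation values with a Kronecker product pairwise kernel and a plain squared regression loss, then ranks objects conditioned on a previously unseen query object. All variable names, kernel choices and the regularization value are illustrative assumptions; the explicit Kronecker matrix is only workable for toy sizes, whereas the article's efficiency results rest on exploiting the product structure rather than materializing it.

```python
# Illustrative sketch: conditional ranking via regularized least-squares
# with a Kronecker product pairwise kernel (toy-sized, dense computation).
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel between the rows of A and the rows of B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
n_cond, n_obj, d = 20, 15, 5          # conditioning objects, ranked objects, features
U = rng.normal(size=(n_cond, d))      # conditioning (query) objects
V = rng.normal(size=(n_obj, d))       # objects to be ranked
Y = U @ V.T + 0.1 * rng.normal(size=(n_cond, n_obj))  # observed relation values

# Kronecker product pairwise kernel: K((u, v), (u', v')) = k_U(u, u') * k_V(v, v')
K_U = rbf_kernel(U, U)
K_V = rbf_kernel(V, V)
K = np.kron(K_U, K_V)                 # (n_cond * n_obj) x (n_cond * n_obj)

# Regularized least-squares in the dual: (K + lam * I) a = y
lam = 1.0
a = np.linalg.solve(K + lam * np.eye(K.shape[0]), Y.ravel())

# Rank all objects in V conditioned on an unseen query object u_new
u_new = rng.normal(size=(1, d))
k_u = rbf_kernel(u_new, U)            # 1 x n_cond
scores = np.kron(k_u, K_V) @ a        # predicted relation value for each object in V
ranking = np.argsort(-scores)
print("Conditional ranking of objects for the new query:", ranking)
```

Replacing the squared regression objective with a pairwise ranking loss, or enforcing symmetry or reciprocity of the learned relation, would change the system being solved but keep the same dual, kernel-based structure sketched here.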



Last updated on 2024-11-26 at 17:50