A1 Journal article – refereed
Matrix representations, linear transformations, and kernels for disambiguation in natural language

List of Authors: Pahikkala T, Pyysalo S, Boberg J, Järvinen J, Salakoski T
Publisher: SPRINGER
Publication year: 2009
Journal: Machine Learning
Journal name in source: MACHINE LEARNING
Journal acronym: MACH LEARN
Volume number: 74
Number of pages: 26
ISSN: 0885-6125

In the application of machine learning methods to natural language inputs, the words and their positions in the input text are among the most important features. In this article, we introduce a framework based on a word-position matrix representation of text, linear feature transformations of the word-position matrices, and kernel functions constructed from the transformations. We consider two categories of transformations, one based on word similarities and the other on positional similarities, which can be applied simultaneously in the framework in an elegant way. We show how word and positional similarities obtained with previously proposed techniques, such as latent semantic analysis, can be incorporated as transformations in the framework. We also introduce novel ways to determine word and positional similarities. We further present efficient algorithms for computing kernel functions that incorporate the transformations on the word-position matrices and, more importantly, introduce a highly efficient method for prediction. The framework is particularly well suited to natural language disambiguation tasks in which the aim is to select, for a single word, a particular property from a set of candidates based on the context of the word. We demonstrate the applicability of the framework to this type of task using context-sensitive spelling error correction on the Reuters News corpus as a model problem.
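The idea in the abstract can be sketched informally: a text is encoded as a word-position matrix, a word-similarity transformation acts on its rows and a positional-similarity transformation on its columns, and a kernel between two texts is the Frobenius inner product of the transformed matrices. Below is a minimal NumPy sketch assuming one-hot word-position matrices, an identity word-similarity matrix, and a hypothetical diagonal distance-decay positional weighting; these are illustrative choices, not the authors' exact formulation.

```python
import numpy as np

def word_position_matrix(tokens, vocab, num_positions):
    """One-hot word-position matrix: rows index vocabulary words, columns index positions."""
    M = np.zeros((len(vocab), num_positions))
    for pos, tok in enumerate(tokens[:num_positions]):
        if tok in vocab:  # out-of-vocabulary tokens are simply skipped in this sketch
            M[vocab[tok], pos] = 1.0
    return M

def transformed_kernel(A, B, S, P):
    """Kernel as the Frobenius inner product of the transformed matrices S A P and S B P.

    S (word similarity) mixes rows; P (positional similarity) mixes columns.
    """
    return float(np.sum((S @ A @ P) * (S @ B @ P)))

# Illustrative use: a tiny vocabulary and a decaying positional weighting.
vocab = {"the": 0, "cat": 1, "sat": 2}
A = word_position_matrix(["the", "cat"], vocab, 3)
B = word_position_matrix(["the", "sat"], vocab, 3)
S = np.eye(len(vocab))          # identity word similarity: exact matches only
P = np.diag([1.0, 0.5, 0.25])   # hypothetical decay by position
k = transformed_kernel(A, B, S, P)
```

With one-hot matrices and these diagonal choices, the kernel simply sums the position weights at which the two texts share the same word; a richer (non-identity) word-similarity matrix S would let related words contribute to the kernel value as well.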

Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.

Last updated on 2019-01-29 at 10:36