A4 Refereed article in a conference publication

Multilingual and Zero-Shot is Closing in on Monolingual Web Register Classification
Authors: Rönnqvist Samuel, Skantsi Valtteri, Oinonen Miika, Laippala Veronika

Editors: Simon Dobnik, Lilja Øvrelid

Conference name: Nordic Conference on Computational Linguistics

Publication year: 2021

Journal: Linköping Electronic Conference Proceedings

Book title: Proceedings of the 23rd Nordic Conference on Computational Linguistics (NoDaLiDa)

Series title: Linköping Electronic Conference Proceedings

Number in series: 178

First page: 157

Last page: 165

ISBN: 978-91-7929-614-8

ISSN: 1650-3686

Web address: https://ep.liu.se/en/conference-article.aspx?series=ecp&issue=178&Article_No=16

Self-archived copy's web address: https://research.utu.fi/converis/portal/detail/Publication/56911747


Abstract

This article studies register classification of documents from the unrestricted web, such as news articles or opinion blogs, in a multilingual setting, exploring both the benefit of training on multiple languages and the capabilities for zero-shot cross-lingual transfer. While the wide range of linguistic variation found on the web poses challenges for register classification, recent studies have shown that good levels of cross-lingual transfer from the extensive English CORE corpus to other languages can be achieved. In this study, we show that training on multiple languages 1) benefits languages with limited amounts of register-annotated data, 2) on average achieves performance on par with monolingual models, and 3) greatly improves upon previous zero-shot results in Finnish, French and Swedish. The best results are achieved with the multilingual XLM-R model. As data, we use the CORE corpus series featuring register-annotated data from the unrestricted web.


Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.





Last updated on 2024-11-26 at 21:46