A4 Refereed article in a conference publication

Poro 34B and the Blessing of Multilinguality




Authors Luukkonen, Risto; Burdge, Jonathan; Zosa, Elaine; Talman, Aarne; Komulainen, Ville; Hatanpää, Väinö; Sarlin, Peter; Pyysalo, Sampo

Editors Johansson, Richard; Stymne, Sara

Conference name Nordic Conference on Computational Linguistics and Baltic Conference on Human Language Technologies

Publication year 2025

Journal NEALT proceedings series

Book title Proceedings of the Joint 25th Nordic Conference on Computational Linguistics and 11th Baltic Conference on Human Language Technologies (NoDaLiDa/Baltic-HLT 2025)

Volume 57

First page 367

Last page 382

ISBN 978-9908-53-109-0

ISSN 1736-8197

eISSN 1736-6305

Publication's open availability at the time of reporting Open Access

Publication channel's open availability Open Access publication channel

Web address https://aclanthology.org/2025.nodalida-1.40/

Self-archived copy’s web address https://research.utu.fi/converis/portal/detail/Publication/506554658


Abstract

The pretraining of state-of-the-art large language models now requires trillions of words of text, which is orders of magnitude more than available for the vast majority of languages. While including text in more than one language is an obvious way to acquire more pretraining data, multilinguality is often seen as a curse, and most model training efforts continue to focus near-exclusively on individual large languages. We believe that multilinguality can be a blessing: when the lack of training data is a constraint for effectively training larger models for a target language, augmenting the dataset with other languages can offer a way to improve over the capabilities of monolingual models for that language. In this study, we introduce Poro 34B, a 34 billion parameter model trained for 1 trillion tokens of Finnish, English, and programming languages, and demonstrate that a multilingual training approach can produce a model that substantially advances over the capabilities of existing models for Finnish and excels in translation, while also achieving competitive performance in its class for English and programming languages. We release the model parameters, scripts, and data under open licenses at https://huggingface.co/LumiOpen/Poro-34B.
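The abstract notes that the model parameters, scripts, and data are released openly at https://huggingface.co/LumiOpen/Poro-34B. As a minimal sketch (not part of the published article), the checkpoint can presumably be loaded with the Hugging Face transformers library roughly as follows; the dtype, device placement, and the Finnish example prompt are illustrative assumptions rather than details taken from the paper.

# Minimal sketch: loading the openly released Poro 34B checkpoint from the
# Hugging Face Hub with the transformers library. The model ID comes from the
# release URL in the abstract; dtype, device placement, and the example prompt
# are assumptions for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "LumiOpen/Poro-34B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: reduced precision to fit a 34B model in GPU memory
    device_map="auto",           # assumption: shard the model across available devices
)

# Example prompt in Finnish ("The capital of Finland is"); any Finnish or English prompt works.
prompt = "Suomen pääkaupunki on"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

A 34-billion-parameter model requires substantial GPU memory even in bfloat16, so in practice quantized or multi-GPU setups may be needed; the sketch above only illustrates the standard loading pattern for a Hub-hosted causal language model.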


Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.




Funding information in the publication
This project has received funding from the European Union’s Horizon Europe research and innovation programme under Grant agreement No 101070350.

