A4 Refereed article in a conference publication
Poro 34B and the Blessing of Multilinguality
Authors: Luukkonen, Risto; Burdge, Jonathan; Zosa, Elaine; Talman, Aarne; Komulainen, Ville; Hatanpää, Väinö; Sarlin, Peter; Pyysalo, Sampo
Editors: Johansson, Richard; Stymne, Sara
Conference name: Nordic Conference on Computational Linguistics and Baltic Conference on Human Language Technologies
Publication year: 2025
Journal: NEALT proceedings series
Book title: Proceedings of the Joint 25th Nordic Conference on Computational Linguistics and 11th Baltic Conference on Human Language Technologies (NoDaLiDa/Baltic-HLT 2025)
Volume: 57
First page: 367
Last page: 382
ISBN: 978-9908-53-109-0
ISSN: 1736-8197
eISSN: 1736-6305
Publication's open availability at the time of reporting: Open Access
Publication channel's open availability: Open Access publication channel
Web address: https://aclanthology.org/2025.nodalida-1.40/
Self-archived copy’s web address: https://research.utu.fi/converis/portal/detail/Publication/506554658
Abstract: The pretraining of state-of-the-art large language models now requires trillions of words of text, which is orders of magnitude more than available for the vast majority of languages. While including text in more than one language is an obvious way to acquire more pretraining data, multilinguality is often seen as a curse, and most model training efforts continue to focus near-exclusively on individual large languages. We believe that multilinguality can be a blessing: when the lack of training data is a constraint for effectively training larger models for a target language, augmenting the dataset with other languages can offer a way to improve over the capabilities of monolingual models for that language. In this study, we introduce Poro 34B, a 34 billion parameter model trained for 1 trillion tokens of Finnish, English, and programming languages, and demonstrate that a multilingual training approach can produce a model that substantially advances over the capabilities of existing models for Finnish and excels in translation, while also achieving competitive performance in its class for English and programming languages. We release the model parameters, scripts, and data under open licenses at https://huggingface.co/LumiOpen/Poro-34B.
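The released checkpoint can be loaded with standard tooling. Below is a minimal sketch, assuming the Hugging Face transformers and accelerate libraries are installed; the model ID comes from the release URL in the abstract, while the prompt and generation settings are illustrative assumptions rather than the authors' documented setup.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Model ID taken from the release URL above; all other settings are
# illustrative assumptions, not the authors' evaluation configuration.
model_name = "LumiOpen/Poro-34B"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # shard the 34B parameters across available GPUs (requires accelerate)
)

# Example Finnish prompt, since Finnish is the model's primary target language.
prompt = "Suomen pääkaupunki on"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))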
Downloadable publication: This is an electronic reprint of the original article.
Funding information in the publication:
This project has received funding from the European Union’s Horizon Europe research and innovation programme under Grant agreement No 101070350.