A4 Refereed article in a conference publication

FinerWeb-10BT: Refining Web Data with LLM-Based Line-Level Filtering




Authors: Henriksson, Erik; Tarkka, Otto; Ginter, Filip

Editors: Johansson, Richard; Stymne, Sara

Conference name: Nordic Conference on Computational Linguistics and Baltic Conference on Human Language Technologies

Publication year: 2025

Journal: NEALT Proceedings Series

Book title: Proceedings of the Joint 25th Nordic Conference on Computational Linguistics and 11th Baltic Conference on Human Language Technologies (NoDaLiDa/Baltic-HLT 2025)

Volume: 57

First page: 258

Last page: 268

ISBN: 978-9908-53-109-0

ISSN: 1736-8197

eISSN: 1736-6305

Publication's open availability at the time of reporting: Open Access

Publication channel's open availability: Open Access publication channel

Web address: https://aclanthology.org/2025.nodalida-1.27/

Self-archived copy's web address: https://research.utu.fi/converis/portal/detail/Publication/506553763


Abstract

Data quality is crucial for training Large Language Models (LLMs). Traditional heuristic filters often miss low-quality text or mistakenly remove valuable content. In this paper, we introduce an LLM-based line-level filtering method to enhance training data quality. We use GPT-4o mini to label a 20,000-document sample from FineWeb at the line level, allowing the model to create descriptive labels for low-quality lines. These labels are grouped into nine main categories, and we train a DeBERTa-v3 classifier to scale the filtering to a 10B-token subset of FineWeb. To test the impact of our filtering, we train GPT-2 models on both the original and the filtered datasets. The results show that models trained on the filtered data achieve higher accuracy on the HellaSwag benchmark and reach their performance targets faster, even with up to 25% less data. This demonstrates that LLM-based line-level filtering can significantly improve data quality and training efficiency for LLMs. We release our quality-annotated dataset, FinerWeb-10BT, and the codebase to support further work in this area.
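
To make the filtering step concrete, the sketch below shows how a fine-tuned line-level quality classifier could be applied to a web document with Hugging Face Transformers. This is an illustration only, not the released FinerWeb codebase: the model path, the "Clean" label, and the score threshold are placeholder assumptions, and the released classifier's nine-category label scheme would be used instead in practice.

# Minimal sketch of line-level filtering with a fine-tuned classifier.
# The model path and label scheme below are placeholders, not the paper's artifacts.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="path/to/deberta-v3-line-quality",  # hypothetical fine-tuned checkpoint
    truncation=True,
)

def filter_document(text, keep_label="Clean", threshold=0.5):
    """Keep only the lines the classifier scores as high quality."""
    lines = [line for line in text.split("\n") if line.strip()]
    if not lines:
        return ""
    predictions = classifier(lines, batch_size=32)
    kept = [
        line
        for line, pred in zip(lines, predictions)
        if pred["label"] == keep_label and pred["score"] >= threshold
    ]
    return "\n".join(kept)

example = (
    "Click HERE to win a FREE prize!!!\n"
    "The experiment compared two tokenization strategies on web text.\n"
    "Cookies | Privacy | Terms of Service"
)
print(filter_document(example))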


Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.




Funding information in the publication
This project has received funding from the European Union’s Horizon Europe research and innovation programme under Grant agreement No 101070350 and from UK Research and Innovation (UKRI) under the UK government’s Horizon Europe funding guarantee [grant number 10052546]. This work was supported by the Research Council of Finland.

