A4 Refereed article in a conference publication

Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction




Authors Papaluca Andrea, Krefl Daniel, Suominen Hanna, Lenskiy Artem

Editors Samuel Louvan, Andrea Madotto, Brielen Madureira

Conference name Annual Meeting of the Association for Computational Linguistics

Publication year 2022

Book title Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop

Journal name in sourcePROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022): STUDENT RESEARCH WORKSHOP

First page 373

Last page 382

Number of pages 10

ISBN 978-1-955917-23-0

DOI https://doi.org/10.18653/v1/2022.acl-srw.29

Web address https://aclanthology.org/2022.acl-srw.29/

Self-archived copy’s web address https://research.utu.fi/converis/portal/detail/Publication/176250533


Abstract
In this work, we propose combining pre-trained knowledge base graph embeddings with transformer-based language models to improve performance on the sentential relation extraction task in natural language processing. Our proposed model is a simple variation of existing models that incorporates off-task pre-trained graph embeddings with an on-task fine-tuned BERT encoder. We perform a detailed statistical evaluation of the model on standard datasets. We provide evidence that the added graph embeddings improve performance, making this simple approach competitive with state-of-the-art models that perform explicit on-task training of the graph embeddings. Furthermore, we observe for the underlying BERT model an interesting power-law scaling behavior between the variance of the F1 score obtained for a relation class and its support in terms of training examples.

Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.

Last updated on 2024-11-26 at 23:53