Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction




Andrea Papaluca, Daniel Krefl, Hanna Suominen, Artem Lenskiy

Editors: Samuel Louvan, Andrea Madotto, Brielen Madureira

Annual Meeting of the Association for Computational Linguistics

2022

Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop


Pages: 373–382 (10 pages)

ISBN: 978-1-955917-23-0

DOI: https://doi.org/10.18653/v1/2022.acl-srw.29

https://aclanthology.org/2022.acl-srw.29/

https://research.utu.fi/converis/portal/detail/Publication/176250533



In this work, we propose combining pretrained knowledge base graph embeddings with transformer-based language models to improve performance on the sentential Relation Extraction task in natural language processing. Our model is a simple variation of existing architectures that incorporates off-task pretrained graph embeddings alongside an on-task fine-tuned BERT encoder. We perform a detailed statistical evaluation of the model on standard datasets and provide evidence that the added graph embeddings improve performance, making this simple approach competitive with state-of-the-art models that explicitly train the graph embeddings on-task. Furthermore, we observe for the underlying BERT model an interesting power-law scaling behavior between the variance of the F1 score obtained for a relation class and its support in terms of training examples.
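To make the described combination concrete, the following is a minimal sketch of the general technique the abstract names: frozen, off-task pretrained knowledge base embeddings for the two linked entities are concatenated with the on-task fine-tuned BERT representations of their mentions before relation classification. This is not the authors' released implementation; all class, argument, and dimension names are illustrative assumptions.

    import torch
    import torch.nn as nn
    from transformers import BertModel

    class KBRelationClassifier(nn.Module):
        """Illustrative sketch only: fuse frozen pretrained KB graph embeddings
        with fine-tuned BERT mention vectors for sentential relation extraction."""

        def __init__(self, num_relations: int, graph_emb: torch.Tensor):
            # graph_emb: (num_entities, graph_dim) matrix pretrained off-task,
            # e.g. with a knowledge-graph embedding method; kept frozen here.
            super().__init__()
            self.bert = BertModel.from_pretrained("bert-base-cased")  # fine-tuned on-task
            self.graph_emb = nn.Embedding.from_pretrained(graph_emb, freeze=True)
            hidden = self.bert.config.hidden_size
            graph_dim = graph_emb.size(1)
            # Classifier over [head BERT; head graph; tail BERT; tail graph].
            self.classifier = nn.Linear(2 * (hidden + graph_dim), num_relations)

        def forward(self, input_ids, attention_mask,
                    head_pos, tail_pos, head_kb_id, tail_kb_id):
            out = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask).last_hidden_state
            batch = torch.arange(input_ids.size(0), device=input_ids.device)
            head_ctx = out[batch, head_pos]      # contextual vector at head mention
            tail_ctx = out[batch, tail_pos]      # contextual vector at tail mention
            head_g = self.graph_emb(head_kb_id)  # frozen graph embedding, head entity
            tail_g = self.graph_emb(tail_kb_id)  # frozen graph embedding, tail entity
            features = torch.cat([head_ctx, head_g, tail_ctx, tail_g], dim=-1)
            return self.classifier(features)     # relation logits

Freezing the graph embeddings is what makes the approach "off-task": only the BERT encoder and the classifier head receive task gradients.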
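The power-law relation reported in the abstract can be written schematically as below, where N_c denotes the support (number of training examples) of relation class c and alpha is a fitted exponent; the exponent's sign and value are not given in this record, so the form is indicative only.

    \mathrm{Var}\bigl(\mathrm{F1}_c\bigr) \;\propto\; N_c^{\,\alpha}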
