FinEst BERT

finestbert

Persistent Identifier of this resource:

http://urn.fi/urn:nbn:fi:lb-2020061201

Access location:

FinEst BERT is a multilingual Bidirectional Encoder Representations from Transformers (BERT) model trained from scratch on three languages: Finnish, Estonian, and English. It can be used for various NLP classification tasks in these three languages, supporting both monolingual and multilingual/cross-lingual (knowledge transfer) settings. Whole-word masking was used during data preparation and training; the model was trained for 40 epochs with a sequence length of 128 and for a further 4 epochs with a sequence length of 512. The FinEst BERT model published here is in PyTorch format.
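
Since the published checkpoint is a standard BERT model in PyTorch format, it can be loaded with the Hugging Face transformers library for feature extraction or fine-tuning. The following is a minimal sketch only: the model identifier "EMBEDDIA/finest-bert" and the example sentences are assumptions for illustration, and a locally downloaded copy of the checkpoint can be loaded the same way by passing its directory path instead.

    # Minimal sketch: load FinEst BERT and extract contextual embeddings.
    # The hub identifier "EMBEDDIA/finest-bert" is an assumption; a local
    # directory containing the published PyTorch checkpoint also works.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("EMBEDDIA/finest-bert")
    model = AutoModel.from_pretrained("EMBEDDIA/finest-bert")
    model.eval()

    # One example sentence per covered language; the same model handles all three.
    sentences = [
        "Helsinki on Suomen pääkaupunki.",    # Finnish
        "Tallinn on Eesti pealinn.",          # Estonian
        "London is the capital of England.",  # English
    ]
    inputs = tokenizer(sentences, padding=True, return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)

    # Contextual token embeddings, shape: (batch, sequence_length, hidden_size)
    print(outputs.last_hidden_state.shape)

For a classification task, the same checkpoint would typically be loaded with a task-specific head (e.g. AutoModelForSequenceClassification) and fine-tuned on labeled data in any of the three languages.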

Corpora used:
Finnish - STT articles, CoNLL 2017 shared task, Ylilauta downloadable version;
Estonian - Ekspress Meedia articles, CoNLL 2017 shared task;
English - English Wikipedia.

More information can be found in the article "FinEst BERT and CroSloEngual BERT: less is more in multilingual models" by Matej Ulčar and Marko Robnik-Šikonja, published in the proceedings of the TSD 2020 conference.

"FinEst BERT" model by Matej Ulčar and Marko Robnik-Šikonja is published under Creative Commons Attribution 4.0 International (CC BY 4.0) license.
