RobBERTje: a Distilled Dutch BERT Model (Extended Abstract)

  • Authors: Pieter Delobelle, Thomas Winters, Bettina Berendt
  • Publication Date: 2021-08
  • Publication Venue: 31st Meeting of Computational Linguistics in The Netherlands (CLIN 31)
  • Abstract: Pre-trained large-scale language models such as BERT have gained a lot of attention thanks to their outstanding performance on a wide range of natural language tasks. However, due to their large number of parameters, they are resource-intensive both to deploy and to fine-tune. Researchers have created several methods for distilling language models into smaller ones to increase efficiency, with a small performance trade-off. In this paper, we create several different distilled versions of the state-of-the-art Dutch RobBERT model and call them RobBERTje. The distillations differ in their distillation corpus, namely whether or not it is shuffled and whether subsequent sentences are merged. We found that the performance of the models using the shuffled versus non-shuffled datasets is similar for most tasks and that randomly merging subsequent sentences in a corpus creates models that train faster and perform better on tasks with long sequences. Upon comparing distillation architectures, we found that the larger DistilBERT architecture worked significantly better than the Bort hyperparametrization. Interestingly, we also found that the distilled models exhibit less gender-stereotypical bias than their teacher model. Since smaller architectures decrease the time to fine-tune, these models allow for more efficient training and more lightweight deployment for many Dutch downstream language tasks.
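For readers unfamiliar with the distillation setup the abstract refers to: DistilBERT-style distillation trains a smaller student to match the teacher's softened output distribution. The following is a minimal illustrative sketch in plain Python of the soft-target cross-entropy term with a temperature, not the paper's actual training code (the real objective combines this with further loss terms, and the temperature value here is arbitrary):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened distribution and the
    student's, scaled by T^2 so gradient magnitudes stay comparable
    across temperatures (as in standard knowledge distillation)."""
    p = softmax(teacher_logits, temperature)   # soft targets
    q = softmax(student_logits, temperature)   # student predictions
    ce = -sum(pi * math.log(qi) for pi, qi in zip(p, q))
    return temperature ** 2 * ce
```

The loss is minimized when the student reproduces the teacher's distribution exactly; higher temperatures expose more of the teacher's "dark knowledge" in the near-zero probabilities.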

Citation

APA

Delobelle, P., Winters, T., & Berendt, B. (2021). RobBERTje: a Distilled Dutch BERT Model.

Harvard

Delobelle, P., Winters, T. and Berendt, B. (2021) “RobBERTje: a Distilled Dutch BERT Model.” Ghent, Belgium.

Vancouver

1. Delobelle P, Winters T, Berendt B. RobBERTje: a Distilled Dutch BERT Model. Ghent, Belgium; 2021.


Related project

  • RobBERT (2020): State-of-the-art Dutch language model (project collaborator)
