RobBERTje: a Distilled Dutch BERT Model

  • Authors: Pieter Delobelle, Thomas Winters, Bettina Berendt
  • Publication Date: 2021-12-31
  • Publication Venue: Computational Linguistics in the Netherlands Journal
  • Abstract: Pre-trained large-scale language models such as BERT have gained a lot of attention thanks to their outstanding performance on a wide range of natural language tasks. However, due to their large number of parameters, they are resource-intensive both to deploy and to fine-tune. Researchers have created several methods for distilling language models into smaller ones to increase efficiency, with a small performance trade-off. In this paper, we create several different distilled versions of the state-of-the-art Dutch RobBERT model and call them RobBERTje. The distillations differ in their distillation corpus, namely whether or not it is shuffled and whether subsequent sentences are merged. We found that the performance of the models using the shuffled versus non-shuffled datasets is similar for most tasks and that randomly merging subsequent sentences in a corpus creates models that train faster and perform better on tasks with long sequences. Upon comparing distillation architectures, we found that the larger DistilBERT architecture worked significantly better than the Bort hyperparametrization. Interestingly, we also found that the distilled models exhibit less gender-stereotypical bias than their teacher model. Since smaller architectures decrease the time to fine-tune, these models allow for more efficient training and more lightweight deployment for many Dutch downstream language tasks.
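
As a rough illustration of the corpus variants described in the abstract, the sketch below shows one way to randomly merge subsequent sentences into longer training sequences before distillation. This is a minimal sketch: the function name, the merge probability, and the example sentences are illustrative assumptions, not the exact procedure or settings used in the paper.

    import random

    def merge_subsequent_sentences(sentences, merge_prob=0.5, seed=0):
        """Randomly concatenate neighbouring sentences to obtain longer
        training sequences (illustrative; merge_prob is an assumption)."""
        rng = random.Random(seed)
        merged, i = [], 0
        while i < len(sentences):
            # With probability merge_prob, glue sentence i to sentence i+1.
            if i + 1 < len(sentences) and rng.random() < merge_prob:
                merged.append(sentences[i] + " " + sentences[i + 1])
                i += 2
            else:
                merged.append(sentences[i])
                i += 1
        return merged

    corpus = [
        "RobBERT is een Nederlands taalmodel.",
        "Het is gebaseerd op RoBERTa.",
        "Distillatie maakt modellen kleiner.",
        "Kleinere modellen zijn sneller te fine-tunen.",
    ]
    print(merge_subsequent_sentences(corpus))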

Citation

APA

Delobelle, P., Winters, T., & Berendt, B. (2021). RobBERTje: A Distilled Dutch BERT Model. Computational Linguistics in the Netherlands Journal, 11, 125–140. https://www.clinjournal.org/clinj/article/view/131

Harvard

Delobelle, P., Winters, T. and Berendt, B. (2021) “RobBERTje: A Distilled Dutch BERT Model,” Computational Linguistics in the Netherlands Journal, 11, pp. 125–140. Available at: https://www.clinjournal.org/clinj/article/view/131.

Vancouver

1. Delobelle P, Winters T, Berendt B. RobBERTje: A Distilled Dutch BERT Model. Computational Linguistics in the Netherlands Journal [Internet]. 2021;11:125–40. Available from: https://www.clinjournal.org/clinj/article/view/131

BibTeX
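
The entry below is reconstructed from the citation details above; the citation key is illustrative.

    @article{delobelle2021robbertje,
      author  = {Delobelle, Pieter and Winters, Thomas and Berendt, Bettina},
      title   = {{RobBERTje}: A Distilled Dutch {BERT} Model},
      journal = {Computational Linguistics in the Netherlands Journal},
      year    = {2021},
      volume  = {11},
      pages   = {125--140},
      url     = {https://www.clinjournal.org/clinj/article/view/131}
    }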

Related project

RobBERT (2020): the state-of-the-art Dutch language model. Role: project collaborator.
