
DistilBERT

Editorial Staff

Hugging Face

DistilBERT was introduced in 2019 by Hugging Face as an efficient alternative to BERT for natural language processing tasks. It is a smaller, faster version of BERT created through knowledge distillation, retaining 97% of BERT's language-understanding capability while being 40% smaller and 60% faster at inference. It is available as part of the Hugging Face Transformers library.
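As a minimal sketch of how it can be used (assuming the `transformers` library is installed and using the publicly released `distilbert-base-uncased` checkpoint), DistilBERT loads through the same API as any other BERT-style model:

```python
from transformers import pipeline

# Fill-mask pipeline backed by DistilBERT; the checkpoint name is the
# standard distilled BERT-base model published by Hugging Face.
unmasker = pipeline("fill-mask", model="distilbert-base-uncased")

# The model predicts the most likely tokens for the [MASK] position.
for prediction in unmasker("DistilBERT is a [MASK] version of BERT."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Because DistilBERT keeps BERT's tokenizer and overall architecture (with half the layers), it can generally be dropped into existing BERT workflows with little more than a change of checkpoint name.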

