
DistilBERT



DistilBERT was introduced in 2019 by Hugging Face as an efficient alternative to BERT for natural language processing tasks. It is a smaller, faster version of BERT created through knowledge distillation, in which a compact "student" model is trained to reproduce the behavior of a larger "teacher" model. DistilBERT retains 97% of BERT's language understanding capabilities while being 40% smaller and 60% faster, and it is available as part of the Hugging Face Transformers library.
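
Since the post notes that DistilBERT ships with the Transformers library, here is a minimal usage sketch. It assumes the standard `distilbert-base-uncased` checkpoint and a PyTorch backend, neither of which is specified in the post:

```python
# A minimal sketch of loading DistilBERT via the Hugging Face Transformers
# library (pip install transformers torch). "distilbert-base-uncased" is
# the standard pretrained checkpoint; swap in another if needed.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

# Tokenize a sentence and run it through the encoder.
inputs = tokenizer(
    "DistilBERT is a distilled version of BERT.",
    return_tensors="pt",
)
outputs = model(**inputs)

# One hidden vector per input token:
# shape is (batch_size, sequence_length, 768).
print(outputs.last_hidden_state.shape)
```

The same checkpoint can also be dropped into higher-level Transformers pipelines (for example, fine-tuned variants for text classification), which is where its smaller size and faster inference pay off.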

