Our research contributions
We’re on a journey to advance and democratize NLP for everyone. Along the way, we contribute to the development of technology for the better.
HMTL
Hierarchical Multi-Task Learning
Our paper was accepted to AAAI 2019, and we have open-sourced the code and a demo.
Read more
Thomas Wolf et al.
Meta-learning for language modeling
Our workshop paper on Meta-Learning a Dynamical Language Model was accepted to ICLR 2018. We use our implementation to power 🤗.
Read more
Auto-complete your thoughts
Write with Transformers
This web app, built by the Hugging Face team, is the official demo of the Transformers repository's text generation capabilities.
Start writing
State of the art
Neuralcoref
Our coreference resolution module is now the leading open-source library for coreference resolution. You can train it on your own dataset and language.
Read more
Victor Sanh et al., 2019
DistilBERT
Distillation: a smaller, faster, lighter, cheaper version of BERT. Code and weights are available through Transformers.
Read more