The AI community building the future.
The most powerful natural-language processing, powered by open source.
Model Hub
Featured models
Browse the model hub to discover and use state-of-the-art models shared by the community.
Explore models
On demand
Inference API
Use your favorite models with no hassle: run them on demand through a hosted API, without managing any infrastructure.
See pricing
fill mask
Mask token: [MASK]
🔥 This model is currently loaded and running on the Inference API.
happiness 0.036 · survival 0.031 · salvation 0.017 · freedom 0.017 · unity 0.015
Open Source
Transformers
Use the transformers library to download, run, and fine-tune state-of-the-art pretrained models.
Check documentation
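A minimal sketch of the same fill-mask task with the library itself; the model name is illustrative, and the pipeline call (shown in the comments) downloads weights on first use.

```python
def top_predictions(predictions, k=5):
    """Return the k highest-scoring predictions, best first.

    `predictions` is a list of dicts with "token_str" and "score"
    keys, which is the shape the fill-mask pipeline returns.
    """
    return sorted(predictions, key=lambda p: p["score"], reverse=True)[:k]

# Example usage (requires `pip install transformers`; the model is
# downloaded on first run):
#
#   from transformers import pipeline
#   fill_mask = pipeline("fill-mask", model="bert-base-uncased")
#   top_predictions(fill_mask("The goal of life is [MASK]."))
```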
Our Science contributions
We’re on a journey to advance and democratize NLP for everyone. Along the way, we contribute to the development of technology for the better.
📚
HMTL
Hierarchical Multi-Task Learning
Our paper has been accepted to AAAI 2019. We have open-sourced code and demo.
Read more
🐸
Thomas Wolf et al.
Meta-learning for language modeling
Our workshop paper on Meta-Learning a Dynamical Language Model was accepted to ICLR 2018. We use our implementation to power 🤗.
Read more
🦄
Auto-complete your thoughts
Write with Transformers
This web app, built by the Hugging Face team, is the official demo of the Transformers repository's text generation capabilities.
Start writing
🤖
State of the art
Neuralcoref
Our coreference resolution module is now the top open source library for coreference. You can train it on your own dataset and language.
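To illustrate what a coreference module produces, here is a toy sketch of mention replacement over pre-computed clusters; the span format is an assumption for illustration only, while real usage goes through spaCy (shown in the comments).

```python
def resolve_coreferences(tokens, clusters):
    """Replace each mention with the main mention of its cluster.

    `clusters` is a list of lists of (start, end) half-open token
    spans; the first span in each cluster is the main mention. This
    mimics the kind of resolved text neuralcoref exposes as
    `doc._.coref_resolved`.
    """
    replacements = {}
    for cluster in clusters:
        main_start, main_end = cluster[0]
        main = tokens[main_start:main_end]
        for start, end in cluster[1:]:
            replacements[start] = (end, main)
    resolved, i = [], 0
    while i < len(tokens):
        if i in replacements:
            end, main = replacements[i]
            resolved.extend(main)  # swap in the main mention's tokens
            i = end
        else:
            resolved.append(tokens[i])
            i += 1
    return resolved

# Real usage goes through spaCy (requires `pip install neuralcoref`
# and an English spaCy model):
#
#   import spacy, neuralcoref
#   nlp = spacy.load("en")
#   neuralcoref.add_to_pipe(nlp)
#   nlp("My sister has a dog. She loves him.")._.coref_resolved
```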
Read more
🐎
Victor Sanh et al., 2019
DistilBERT
Distillation. A smaller, faster, lighter, cheaper version of BERT. Code and weights are available through Transformers.
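The distillation behind DistilBERT trains the small student model to match the teacher's softened output distribution. Below is a minimal sketch of that softened cross-entropy; the temperature value is illustrative, and the paper's full training objective combines this signal with other losses.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with a temperature; higher temperatures soften the
    distribution so the teacher's smaller probabilities still carry
    signal."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's and student's softened
    distributions: the core signal in knowledge distillation."""
    teacher = softmax(teacher_logits, temperature)
    student = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher, student))
```

The loss is smallest when the student reproduces the teacher's distribution exactly, so minimizing it transfers the larger model's behaviour into the smaller one.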
Read more
Website
Company
Resources