Hugging Face Raises $100M In Series C


Hugging Face announced that it has closed a new round of funding: a $100 million Series C that values the company at $2 billion.

Lux Capital led the round, with Sequoia and Coatue investing in the company for the first time. Several of the startup’s existing investors participated once again, including Addition, Betaworks, AIX Ventures, Cygni Capital, Kevin Durant and Olivier Pomel.

Hugging Face released the Transformers library on GitHub and instantly attracted a ton of attention; it currently has 62,000 stars and 14,000 forks on the platform. With Transformers, you can leverage popular NLP models such as BERT, GPT-2, T5 or DistilBERT and use them to work with text in many ways: classifying text, extracting information, answering questions automatically, summarising text, generating text, and more.
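As a rough illustration of that workflow, the sketch below uses the library's high-level pipeline API for sentiment classification and summarisation. The task names are standard Transformers usage, but the default model checkpoints it downloads, and the exact scores shown in the comments, will vary by library version.

# A minimal sketch of the Transformers pipeline API.
# Requires: pip install transformers torch
from transformers import pipeline

# Text classification (sentiment analysis) with a library-chosen default checkpoint.
classifier = pipeline("sentiment-analysis")
print(classifier("Hugging Face just raised a $100 million Series C."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]

# Summarisation with another default checkpoint.
summarizer = pipeline("summarization")
article = (
    "Hugging Face closed a $100 million Series C led by Lux Capital, with Sequoia "
    "and Coatue joining as new investors. The company is now valued at $2 billion "
    "and hosts thousands of models and datasets on its platform."
)
print(summarizer(article, max_length=40, min_length=10)[0]["summary_text"])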

Due to the success of this library, Hugging Face quickly became the main repository for all things related to machine learning models, not just natural language processing. On the company’s website, one can browse thousands of pre-trained machine-learning models, share their own models with the developer community, download datasets and more.
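To give a sense of how the same catalogue can be used programmatically, here is a small sketch that assumes the huggingface_hub and datasets client libraries; the text-classification filter and the imdb dataset are purely illustrative choices, and attribute names such as modelId can differ between library versions.

# Requires: pip install huggingface_hub datasets
from huggingface_hub import HfApi
from datasets import load_dataset

api = HfApi()

# List a few of the most-downloaded text-classification models hosted on the Hub.
for model in api.list_models(filter="text-classification", sort="downloads", direction=-1, limit=5):
    print(model.modelId)

# Download a public dataset (the IMDB movie-review corpus, as an example)
# and look at the first training record.
dataset = load_dataset("imdb", split="train")
print(dataset[0]["text"][:200])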

Essentially, Hugging Face is building the GitHub of machine learning: a community-driven platform with a ton of repositories, where developers can create, discover and collaborate on ML models, datasets and ML apps. Hugging Face also offers hosted services, such as the Inference API, which lets you run thousands of models via a programming interface, and the ability to automatically train your own models.
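As a sketch of what calling the Inference API looks like, the snippet below sends text to a hosted sentiment model over HTTP. The api-inference.huggingface.co endpoint pattern follows Hugging Face's documented usage, but the model name is just an example and YOUR_HF_API_TOKEN is a placeholder you would replace with a real access token.

# Requires: pip install requests
import requests

# Any hosted model ID can be substituted here; this one is a common sentiment classifier.
API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
headers = {"Authorization": "Bearer YOUR_HF_API_TOKEN"}  # placeholder access token

def query(payload):
    # POST the input text to the hosted model and return the parsed JSON response.
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

print(query({"inputs": "Hugging Face is building the GitHub of machine learning."}))
# e.g. [[{'label': 'POSITIVE', 'score': 0.99}, {'label': 'NEGATIVE', 'score': 0.01}]]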