Hugging Face co-founder and Chief Science Officer, Transformers library creator
Thomas Wolf
Profile
Thomas Wolf is co-founder and Chief Science Officer of Hugging Face — and the person most directly responsible for the library that put Hugging Face on every ML engineer’s laptop. If you’ve ever typed from transformers import AutoModel, you’ve used his work. The Transformers library didn’t just wrap BERT and GPT-2 — it standardized how the entire open-source community ships and consumes models, and that standardization is what made Hugging Face the GitHub of AI rather than just another startup.
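The AutoModel pattern mentioned above is worth seeing concretely. A minimal sketch of the standardized loading interface (using the public distilbert-base-uncased checkpoint as an illustrative example; any Hub checkpoint works the same way):

```python
# One uniform interface loads any architecture from the Hugging Face Hub.
from transformers import AutoModel, AutoTokenizer

# AutoTokenizer and AutoModel infer the right classes from the checkpoint name.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

# Tokenize text and run a forward pass.
inputs = tokenizer("Hello, Transformers!", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```

The point of the design is that swapping BERT for GPT-2 or any newer model is a one-string change, which is exactly the standardization the paragraph above credits to the library.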
Wolf’s path into ML is unusual. He did a PhD in statistical physics on superconducting materials, then became a patent attorney in Paris working on deep learning patents — which is how he ended up learning the field from the outside in. He co-founded Hugging Face in 2016 with Clement Delangue and Julien Chaumond, originally as a chatbot company. When transformers arrived in 2018, Wolf bet the company on the library, and that bet is the reason Hugging Face exists today in the form it does.
What he spends his time on now is worth paying attention to. He led the BLOOM multilingual model effort, pushes the Open-R1 project (an open replication of DeepSeek-R1), and drives LeRobot — the open-source push into robotics. He’s also the rare AI leader who publicly pushes back on the “AI will compress the 21st century” narrative. His 2025 essay arguing that LLMs are producing “yes-men on servers” — very obedient students rather than the B-student skeptics science actually needs — is one of the more thoughtful takes from inside a major lab.
For developers learning AI, Wolf is a useful person to follow because he writes for builders, not executives. The Ultra-Scale Playbook his team published is the kind of document you’d previously only get by working at a frontier lab. He’s optimistic about open source but sober about what current architectures can actually do.
Books
Natural Language Processing with Transformers, Revised Edition
Hands-on guide to building with the Transformers library from three of its core maintainers, and the closest thing to an official manual.
Key Articles & Papers
Transformers: State-of-the-Art Natural Language Processing
DistilBERT, a Distilled Version of BERT
Datasets: A Community Library for Natural Language Processing
The Ultra-Scale Playbook
The Einstein AI Model
Some Notes on DeepSeek and Export Control
100 Times Faster Natural Language Processing in Python
Spotify Podcasts