Clem Delangue 🤗

@ClemDelangue

Co-founder & CEO at Hugging Face 🤗. We teach computers to understand human language.

Similar Users

Morph (@morph_labs)
Replicate (@replicate)
Lysandre (@LysandreJik)
Ofir Press (@OfirPress)
Julien Chaumond (@julien_c)
Been Kim (@_beenkim)
Lewis Tunstall (@_lewtun)
Caiming Xiong (@CaimingXiong)
Clément (@clmt)
Wenhu Chen (@WenhuChen)
Lamini (@LaminiAI)
Victor Zhong (@hllo_wrld)
Alexis Conneau (@alex_conneau)
Weijia Shi (@WeijiaShi2)
Sabrina J. Mielke (@sjmielke)

Clem Delangue 🤗 Reposted

The first full paper on @pytorch after 3 years of development. It describes our goals, design principles, and technical details up to v0.4. Catch the poster at #NeurIPS2019. Authored by @apaszke, @colesbury, et al. arxiv.org/abs/1912.01703


Clem Delangue 🤗 Reposted

Interesting work (and a nice, large, clean dataset as well; looking forward to seeing it released): "Compressive Transformers for Long-Range Sequence Modelling" by Jack W. Rae, Anna Potapenko, Siddhant M. Jayakumar, Timothy P. Lillicrap (at DeepMind). Paper: arxiv.org/abs/1911.05507

Tweet Image 1

Clem Delangue 🤗 Reposted

Some more results. It now fully supports all kinds of models and vocabs. Using @huggingface with @SlackHQ has been a good experience, and it looks pretty smart.

Tweet Image 1
Tweet Image 2

Clem Delangue 🤗 Reposted

The @SustaiNLP2020 workshop at #EMNLP2020 will try to remove a little bit of the SOTA addiction from NLP research 😉 We'll promote sensible trade-offs between performance & models that are: - computationally more efficient - conceptually simpler ... [1/2] twitter.com/DoingJobs/stat…


Clem Delangue 🤗 Reposted

Perhaps a great opportunity to use @huggingface's TF 2.0 Transformer implementations :)


Clem Delangue 🤗 Reposted

Happy to have a small PR accepted to the HuggingFace Transformers library demonstrating substantial mixed-precision speed-up with @NVIDIA Tensor Core #GPU even at small batch size in the demo script: github.com/huggingface/tr…
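
The demo script itself isn't reproduced here; as a rough illustration of the same idea, here is a minimal sketch of FP16 inference on a Tensor Core GPU, written against today's transformers API and torch.cuda.amp (an assumption; the original PR predates this API and likely used a different mixed-precision path).

# Hedged sketch: mixed-precision BERT inference on a Tensor Core GPU.
# Illustrative only; not the PR's demo script.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").cuda().eval()

# Small batch, as in the tweet's claim about small-batch speed-ups.
batch = tokenizer(["Mixed precision is fast."] * 8,
                  padding=True, return_tensors="pt").to("cuda")

with torch.no_grad(), torch.cuda.amp.autocast():  # run matmuls in FP16 on Tensor Cores
    outputs = model(**batch)

print(outputs.last_hidden_state.shape)  # (8, sequence_length, 768)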


Clem Delangue 🤗 Reposted

GPT-2 on device is blazing fast on iPhone 11 ⚡️ Core ML 3 is officially out, so we can do state-of-the-art text generation on mobile (117M parameters running ~3 times per second on the Neural Engine!). We put together a small video benchmark ⬇️
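
The benchmark itself is a Swift/Core ML app; as a rough Python-side analogue, here is a sketch of generating text with the same 117M-parameter GPT-2 checkpoint via today's transformers API (the class and checkpoint names are assumptions relative to the 2019 setup).

# Sketch only: the tweet's demo runs GPT-2 through Core ML on iPhone;
# this shows the equivalent 117M checkpoint server-side in Python.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")        # the 117M-parameter model
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

input_ids = tokenizer.encode("On-device text generation", return_tensors="pt")
output_ids = model.generate(input_ids, max_length=40, do_sample=True, top_k=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))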


Clem Delangue 🤗 Reposted

DistilBERT (Hugging Face): distilled from BERT-base down to 6 layers (40% smaller). Inference is 60% faster while retaining about 95% of the accuracy on GLUE. Trained in roughly 3.5 days on 8×16GB V100 GPUs. The hidden size stays at 768; reducing the number of layers is reportedly more effective for speed than shrinking the hidden size. GitHub: github.com/huggingface/py… Blog: medium.com/huggingface/di…
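
As a small illustration of the size claim, here is a sketch that loads both checkpoints with today's transformers API (an assumption; the original code shipped in pytorch-transformers) and compares parameter counts.

# Sketch: compare DistilBERT and BERT-base parameter counts,
# illustrating the ~40% size reduction described above.
from transformers import AutoModel

bert = AutoModel.from_pretrained("bert-base-uncased")
distilbert = AutoModel.from_pretrained("distilbert-base-uncased")

n_bert = sum(p.numel() for p in bert.parameters())
n_distil = sum(p.numel() for p in distilbert.parameters())
print(f"BERT-base:  {n_bert / 1e6:.0f}M parameters")
print(f"DistilBERT: {n_distil / 1e6:.0f}M parameters ({1 - n_distil / n_bert:.0%} smaller)")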


Clem Delangue 🤗 Reposted

1,060 days ago, @Thom_Wolf and I launched a Deep learning for NLP study group: medium.com/huggingface/la…


Clem Delangue 🤗 Reposted

💃PyTorch-Transformers 1.1.0 is live💃 It includes RoBERTa, the transformer model from @facebookai, current state-of-the-art on the SuperGLUE leaderboard! Thanks to @myleott @julien_c @LysandreJik and all the 100+ contributors!

Tweet Image 1
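
A minimal sketch of loading the newly added RoBERTa weights, shown here with the current transformers package name rather than pytorch-transformers 1.1.0 (a naming assumption, not the release's own snippet).

# Sketch: run roberta-base, the model behind the SuperGLUE result mentioned above.
import torch
from transformers import RobertaModel, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaModel.from_pretrained("roberta-base").eval()

inputs = tokenizer("RoBERTa is now in the library.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)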

Clem Delangue 🤗 Reposted

A question I get from time to time is how to convert a pretrained TensorFlow model to PyTorch easily and reliably. We're starting to be quite familiar with the process, so I've written a short blog post summarizing our workflow and some lessons learned 👇 medium.com/huggingface/fr…
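
The blog post's workflow isn't reproduced here; the sketch below only illustrates the final sanity check any such conversion needs: run the same input through the TensorFlow and PyTorch versions of a checkpoint and compare outputs numerically. The model name and expected tolerance are illustrative assumptions.

# Hedged sketch of the numerical-agreement check after a TF <-> PyTorch conversion.
# Requires both TensorFlow and PyTorch installed.
import numpy as np
import torch
from transformers import BertModel, BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
pt_model = BertModel.from_pretrained("bert-base-uncased").eval()
tf_model = TFBertModel.from_pretrained("bert-base-uncased")

text = "Check that both frameworks agree."
pt_inputs = tokenizer(text, return_tensors="pt")
tf_inputs = tokenizer(text, return_tensors="tf")

with torch.no_grad():
    pt_out = pt_model(**pt_inputs).last_hidden_state.numpy()
tf_out = tf_model(**tf_inputs).last_hidden_state.numpy()

# A converted model should agree to within floating-point noise (roughly 1e-5).
print("max abs difference:", np.abs(pt_out - tf_out).max())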


Clem Delangue 🤗 Reposted

New release of the Transformers repo is shaping up & I'm very excited! Gifts for all: - SOTA lovers: new XLNet & XLM architectures + 6 new BERT/GPT trained checkpoints - Research lovers: unified model API, attention/hidden-state outputs to swap & study models - Speed lovers: TorchScript & head pruning!

Tweet Image 1
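
A short sketch of two of the listed features, attention outputs and head pruning, written against today's transformers API (an assumption; the 2019 release exposed them somewhat differently, e.g. via the model config).

# Sketch: inspect per-layer attention maps, then prune attention heads.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased").eval()

inputs = tokenizer("Inspect attentions, then prune heads.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_attentions=True, output_hidden_states=True)
# One attention tensor per layer, shaped (batch, heads, seq_len, seq_len).
print(len(outputs.attentions), outputs.attentions[0].shape)

# Head pruning: remove heads 0 and 1 in layer 0, and head 2 in layer 11.
model.prune_heads({0: [0, 1], 11: [2]})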

Clem Delangue 🤗 Reposted

🔥 Thrilled to release our Swift Core ML implementation of BERT for question answering.🔥🔥 Transformers models now also live on the edge. 📱📲 You now CAN do state-of-the-art NLP on mobile devices! github.com/huggingface/sw… Built w/ @LysandreJik and @Thom_Wolf at @huggingface


Best Long Paper at #naacl2019: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, by Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova #NLProc

Tweet Image 1

Clem Delangue 🤗 Reposted

Welcome to Minne-SOTA #NAACL2019

Tweet Image 1

They’re very big fans of @Thom_Wolf here at #NAACL2019

Tweet Image 1

Clem Delangue 🤗 Reposted

Absolutely PACKED room for @seb_ruder, @Thom_Wolf, @swabhz, and @mattthemathman’s tutorial on transfer learning for NLP #NAACL2019

Tweet Image 1
