Pinned
Waking up to your favorite ML author replying to you? Feels like the universe just said, 'You're on the right path, kid.' 😌🔥
Pritam Reposted
Microsoft released LLM2CLIP: a CLIP model with a longer context window for complex text inputs 🤯 TL;DR: they replaced CLIP's text encoder with various LLMs fine-tuned on captioning, getting better top-k retrieval accuracy 🔥 All models under the Apache 2.0 license on @huggingface 😍
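The headline claim above is better top-k retrieval accuracy. As a minimal sketch of what that metric measures, here is the computation in plain PyTorch, assuming you already have paired, L2-normalized image and text embeddings; the function and variable names (`topk_retrieval_accuracy`, `image_embeds`, `text_embeds`) are illustrative and not part of any released LLM2CLIP API.

```python
import torch

def topk_retrieval_accuracy(image_embeds: torch.Tensor,
                            text_embeds: torch.Tensor,
                            k: int = 5) -> float:
    """Fraction of text queries whose paired image appears in the top-k results.

    Assumes row i of `image_embeds` is the ground-truth match for row i of
    `text_embeds`, and that both are L2-normalized so the dot product is
    cosine similarity.
    """
    # (num_texts, num_images) similarity matrix
    sims = text_embeds @ image_embeds.T
    # Indices of the k most similar images for every text query
    topk = sims.topk(k, dim=-1).indices
    # Ground-truth index for text i is image i
    targets = torch.arange(sims.size(0), device=sims.device).unsqueeze(-1)
    hits = (topk == targets).any(dim=-1)
    return hits.float().mean().item()

if __name__ == "__main__":
    # Toy example: random unit vectors standing in for real embeddings
    torch.manual_seed(0)
    img = torch.nn.functional.normalize(torch.randn(100, 512), dim=-1)
    txt = torch.nn.functional.normalize(img + 0.1 * torch.randn(100, 512), dim=-1)
    print(f"text->image top-5 accuracy: {topk_retrieval_accuracy(img, txt, k=5):.3f}")
```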
Pritam Reposted
In this video, I'll be deriving and coding Flash Attention from scratch. Link to the video: youtu.be/zy8ChVd_oTM All the code will be written in Python with Triton, but no prior knowledge of CUDA or Triton is required. I'll also…
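The video builds the kernel in Triton; as a reading aid only, here is a small plain-PyTorch sketch of the tiling-plus-online-softmax trick that Flash Attention is based on. It is not the video's code and not a Triton kernel; `flash_attention_reference` and `block_size` are illustrative names, and masking, multiple heads, and the backward pass are omitted.

```python
import torch

def flash_attention_reference(q, k, v, block_size=64):
    """Tiled attention with an online softmax, the core idea behind Flash Attention.

    Processes K/V in blocks, keeping a running row-max `m` and normalizer `l`
    so the full (seq_len x seq_len) score matrix is never materialized.
    Shapes: q, k, v are (seq_len, head_dim); single head, no masking.
    """
    seq_len, head_dim = q.shape
    scale = head_dim ** -0.5

    out = torch.zeros_like(q)
    m = torch.full((seq_len, 1), float("-inf"))   # running row max
    l = torch.zeros(seq_len, 1)                   # running softmax denominator

    for start in range(0, seq_len, block_size):
        k_blk = k[start:start + block_size]
        v_blk = v[start:start + block_size]

        scores = (q @ k_blk.T) * scale            # (seq_len, block)
        m_new = torch.maximum(m, scores.max(dim=-1, keepdim=True).values)

        # Rescale previous partial results to the new max, then accumulate
        correction = torch.exp(m - m_new)
        p = torch.exp(scores - m_new)
        l = l * correction + p.sum(dim=-1, keepdim=True)
        out = out * correction + p @ v_blk
        m = m_new

    return out / l

if __name__ == "__main__":
    torch.manual_seed(0)
    q, k, v = (torch.randn(256, 64) for _ in range(3))
    ref = torch.softmax((q @ k.T) * 64 ** -0.5, dim=-1) @ v
    print(torch.allclose(flash_attention_reference(q, k, v), ref, atol=1e-5))
```

The design point this illustrates: because softmax can be renormalized on the fly (by rescaling past partial sums whenever a new row maximum appears), attention can be computed block by block in fast on-chip memory, which is what the real Triton/CUDA kernels exploit.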