Kacper Łukawski

@LukawskiKacper

DevRel @qdrant_engine | Founder @AIEmbassy Foundation


It's been over a week since the official launch, so in case you missed it, please check out a short course prepared together with the @DeepLearningAI team! We covered some important topics, including tokenization and optimizing the semantic search layer in your RAG pipelines!

Tokenization -- turning text into a sequence of integers -- is a key part of generative AI, and most API providers charge per million tokens. How does tokenization work? Learn the details of tokenization and RAG optimization in Retrieval Optimization: From Tokenization to Vector…
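The idea in the tweet above can be sketched with a toy vocabulary-based tokenizer (real models use learned BPE or WordPiece vocabularies, and the price used here is a made-up figure, not any provider's actual rate):

```python
# Toy illustration of tokenization: text -> sequence of integer ids.
# The vocabulary is hand-built for this sketch; production tokenizers
# learn theirs from data.
vocab = {"i": 0, "feel": 1, "so": 2, "happy": 3, "today": 4, "[UNK]": 5}

def tokenize(text):
    # Lowercased whitespace split; any word outside the vocab maps to [UNK].
    return [vocab.get(w, vocab["[UNK]"]) for w in text.lower().split()]

ids = tokenize("I feel so happy today")
print(ids)  # [0, 1, 2, 3, 4]

# API providers typically bill per million tokens; at a hypothetical
# $0.50 per 1M tokens, these 5 tokens cost:
price_per_million = 0.50
cost = len(ids) / 1_000_000 * price_per_million
print(f"${cost:.8f}")  # $0.00000250
```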



Kacper Łukawski Reposted

We had a great session about @nvidia NIMs with @ansjin and @MarkMoyou yesterday. 🚀 From Mark: We got lots of info on inferencing with NIMs and different types of NIMs, like embedding models, generative models, reranking models... 🎨 I walked through how you can use NIMs for…


Kacper Łukawski Reposted

Some of the most popular embedding models out there have a particular issue. They do not support emojis out of the box, so the following sentences have identical representations: I feel so 😃 today = I feel so 😢 today
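Why the collapse happens can be shown with a minimal sketch: a tokenizer whose vocabulary contains no emoji entries maps every emoji to the same unknown-token id, so both sentences become identical id sequences, and any embedding computed from those ids is identical too (the tiny vocabulary here is illustrative, not any real model's):

```python
# A toy whitespace tokenizer with no emoji entries: every emoji falls
# back to the shared [UNK] id, erasing the sentiment difference.
vocab = {"i": 0, "feel": 1, "so": 2, "today": 3, "[UNK]": 4}

def tokenize(text):
    return [vocab.get(w, vocab["[UNK]"]) for w in text.lower().split()]

happy = tokenize("I feel so 😃 today")
sad = tokenize("I feel so 😢 today")
print(happy == sad)  # True: both become [0, 1, 2, 4, 3]
```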


We're going to start the fine-tuning from the perspective of a regular @qdrant_engine user with lots of data vectorized who wants to avoid heavy and expensive recomputation of them all! It seems fine-tuning with backward compatibility might be possible in some cases! 🤓
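One way to read "backward compatibility" here can be sketched with synthetic data: keep the already-indexed document vectors frozen and fine-tune only a small adapter on the query side, so nothing stored in the vector database has to be re-embedded. The adapter, loss, and shapes below are illustrative assumptions, not Qdrant's API or the webinar's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 32
docs = rng.normal(size=(100, dim))              # frozen, already indexed
docs /= np.linalg.norm(docs, axis=1, keepdims=True)

query = docs[3] + 0.05 * rng.normal(size=dim)   # query relevant to doc 3
W = np.eye(dim)                                 # adapter starts as identity

def search(q, W):
    qa = W @ q
    qa = qa / np.linalg.norm(qa)
    return int(np.argmax(docs @ qa))            # cosine top-1 over frozen docs

# With the identity adapter, retrieval behaves exactly as before any
# fine-tuning -- that is the backward-compatible starting point.
print(search(query, W))

# Toy fine-tuning: pull the adapted query toward its known-relevant
# document via gradient descent on ||pos - W q||^2 (lr folded in).
pos = docs[3]
for _ in range(50):
    residual = pos - W @ query
    W += 0.02 * np.outer(residual, query)

print(search(query, W))                         # stored vectors stay usable
```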


Join our webinar if you want to adopt semantic search in a new area. @LukawskiKacper will present tips on teaching your dog new tricks, even with backward compatibility of the embeddings 😱 This time, we're joining forces with @origlobalcloud and their GPU Cloud and trying to…



Kacper Łukawski Reposted

Dense embedding models are not giving up yet! Surprisingly, they are also pretty good late interaction models! Please welcome ColBERT-like retrieval with just sentence transformers 🎉


Late interaction models are great for improving the quality of the results. But couldn't we design future embedding models so they can better solve both single and multi-vector search? The existing dense models can do it quite well already 🤔 qdrant.tech/articles/late-…
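The core of ColBERT-style late interaction is the MaxSim score: for every query token embedding, take its best-matching document token embedding, then sum those maxima. A minimal sketch with random vectors standing in for real token embeddings:

```python
import numpy as np

# MaxSim scoring over per-token embeddings (the mechanism behind
# ColBERT-like retrieval). The vectors are random placeholders for the
# token embeddings a sentence transformer would produce.
rng = np.random.default_rng(42)
q_tokens = rng.normal(size=(4, 16))    # 4 query token embeddings
d_tokens = rng.normal(size=(12, 16))   # 12 document token embeddings

def maxsim(q, d):
    # L2-normalise so dot products are cosine similarities.
    q = q / np.linalg.norm(q, axis=1, keepdims=True)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    sims = q @ d.T                     # (num_q_tokens, num_d_tokens)
    # Best document token per query token, summed across query tokens.
    return float(sims.max(axis=1).sum())

print(maxsim(q_tokens, d_tokens))
```

Ranking documents then just means computing `maxsim` against each candidate's token matrix and sorting by score.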

🎉Happy to finally release answerai-colbert-small-v1: the small but mighty @answerdotai ColBERT. It might not be able to count the number of "r"s in words, but it can definitely find the instructions on how to do that. With just 33M params, it beats even `bge-base` on BEIR!


