Good morning everyone! This iteration will be short and sweet. As you saw in the title: did Google really solve the context window challenge, and has RAG become irrelevant? Why am I asking? Because in this iteration, I'm covering Infini-Attention, Google's most recent paper, which claims to work just as well with millions of tokens sent simultaneously.
First of all, this is what the (Learn AI Together) community thinks...
I completely agree. No, it won't be replacing RAG. There are pros and cons to both. First of all, hallucinations will persist even if you send the model a whole book, and doing so will cost way too much anyway. The cost of sending all that text with every single request should be enough to convince us that RAG is here to stay as an efficient way to give more context to our model.
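To make the cost argument concrete, here's a toy back-of-the-envelope sketch. The token counts and the per-token price are purely illustrative assumptions (not any provider's real pricing); the point is only that per-request cost scales with how many tokens you resend each time.

```python
# Hypothetical back-of-the-envelope comparison.
# PRICE_PER_1K_TOKENS, book_tokens, rag_tokens, and requests are all
# illustrative assumptions, not real API pricing or real document sizes.
PRICE_PER_1K_TOKENS = 0.01   # assumed dollars per 1,000 input tokens

book_tokens = 120_000        # sending a whole book with every request
rag_tokens = 3_000           # sending only a few retrieved chunks instead
requests = 1_000             # number of questions asked against that book

def cost(tokens_per_request: int, n_requests: int) -> float:
    """Total dollar cost of sending tokens_per_request tokens, n_requests times."""
    return tokens_per_request / 1_000 * PRICE_PER_1K_TOKENS * n_requests

full_context = cost(book_tokens, requests)  # -> 1200.0 dollars
rag = cost(rag_tokens, requests)            # -> 30.0 dollars
print(f"full context: ${full_context:.2f}, RAG: ${rag:.2f}")
```

Under these made-up numbers, resending the whole book is 40x more expensive than retrieving a few relevant chunks per request, which is the "efficiency" argument for RAG in a nutshell.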
Now, to introduce some context before we dive in... "Context window" is a fancy way of saying how many words you can send to an LLM simultaneously. The bigger the window, the more words you can send, the more context the model can grasp, and thus the better it understands your question and can give an appropriate answer. The problem is that language models' performance dramatically decreases as the context grows. This is what Infini-Attention is about...
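To make "context window" concrete, here's a minimal sketch. It uses naive whitespace splitting as a stand-in for a real tokenizer (actual LLM tokenizers work differently), and the `fit_to_window` helper is a hypothetical name, not a real API:

```python
# Toy illustration of a context window: the model only ever sees
# the most recent N tokens; anything earlier is silently dropped.
# NOTE: whitespace splitting is a stand-in for a real tokenizer.

def fit_to_window(text: str, window: int) -> str:
    """Keep only the last `window` whitespace-separated tokens of the prompt."""
    tokens = text.split()
    return " ".join(tokens[-window:])

prompt = "chapter one " * 10 + "the final question"
truncated = fit_to_window(prompt, 4)
print(truncated)  # older tokens fall outside the window
```

Everything that falls outside the window is simply invisible to the model, which is why a bigger window (or a retrieval step that picks the right tokens to include) matters so much.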
Learn more in the article or the video:
And that's it for this iteration! I'm incredibly grateful that the What's AI newsletter is now read by over 16,000 incredible human beings. Click here to share this iteration with a friend if you learned something new!
Looking for more cool AI stuff? 👇
Looking for AI news, code, learning resources, papers, memes, and more? Follow our weekly newsletter at Towards AI!
Looking to connect with other AI enthusiasts? Join the Discord community: Learn AI Together!
Want to share a product, event or course with my AI community? Reply directly to this email, or visit my Passionfroot profile to see my offers.
Thank you for reading, and I wish you a fantastic week! Be sure to get enough sleep and physical activity next week!
Louis-François Bouchard