Google's release of the Gemini 1.5 Pro model extends the context length to as many as 10 million tokens, sparking industry debate about the future of RAG technology. Some believe that such long-context input could replace RAG, while others argue that RAG will continue to play a significant role. Google's advantage in computational power puts it ahead of other companies in pushing context lengths, which may pose challenges for some startups.
Is a 10-Million-Token Context Length a Death Blow for Another Batch of Startups?

Source: 品玩 (PingWest)
This article is from AIbase Daily