Google's release of the Gemini 1.5 Pro model, which the company reports handling context lengths of up to 10 million tokens in research testing, has sparked industry debate about the future of retrieval-augmented generation (RAG). Some believe that long-context input could replace RAG, while others argue that RAG will continue to play a significant role. Google's advantage in computational resources has put it ahead of other companies in exploring long context lengths, which may pose challenges for some startups.