Mistral AI has recently open-sourced the Mistral 7B v0.2 base model, announced during an event in Cerebral Valley. The updated model supports a 32K context length, removes sliding-window attention, and sets the RoPE theta to 1e6. Separately, Microsoft has invested $16 million in Mistral AI, which released the Mistral Large model to compete directly with GPT-4. Mistral AI continues to evolve rapidly, committed to surpassing established competitors with each new model release.
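To make the v0.2 changes concrete, here is a minimal sketch of how the altered hyperparameters might look side by side in a Hugging Face-style configuration. The field names follow the `transformers` `MistralConfig` convention, and the v0.1 values (4096-token sliding window, RoPE theta of 10000) are assumptions based on the originally published config; only the 32K context and the 1e6 RoPE theta come from the announcement itself.

```python
# Hypothetical sketch: key hyperparameter differences between
# Mistral 7B v0.1 and v0.2, as Hugging Face-style config dicts.
# Field names mirror transformers' MistralConfig; v0.1 values are
# assumptions from the originally released config files.

v0_1 = {
    "max_position_embeddings": 32768,  # nominal max positions
    "sliding_window": 4096,            # sliding-window attention enabled
    "rope_theta": 10000.0,             # original RoPE base frequency
}

v0_2 = {
    "max_position_embeddings": 32768,  # full 32K context supported
    "sliding_window": None,            # sliding-window attention removed
    "rope_theta": 1e6,                 # larger RoPE base for long context
}

# List which fields actually changed between the two versions.
changed = sorted(k for k in v0_1 if v0_1[k] != v0_2[k])
print(changed)  # → ['rope_theta', 'sliding_window']
```

Raising the RoPE base frequency is a common way to keep positional encodings well-behaved at longer context lengths, which is consistent with dropping the sliding window in favor of full 32K attention.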