The Amazon Machine Learning team recently announced that Mistral 7B, a foundation model developed by Mistral AI, is now available on Amazon SageMaker JumpStart, where users can deploy it with a single click. Mistral 7B is a 7-billion-parameter text generation model that supports English text and code generation. It is built on a transformer architecture and supports a context length of 8,000 tokens, with attention optimizations such as sliding-window attention enabling fast inference. Mistral 7B is open-sourced under the Apache 2.0 license, which permits unrestricted use.
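Beyond the one-click console flow, JumpStart models can also be deployed programmatically with the SageMaker Python SDK. The sketch below shows the general shape of that workflow, assuming the SDK's `JumpStartModel` class; the `model_id` string and the request-payload fields are assumptions to verify against the JumpStart model catalog, and deploying requires AWS credentials and will incur charges.

```python
def build_payload(prompt, max_new_tokens=256, temperature=0.6):
    """Request body in the shape JumpStart text-generation containers
    typically expect (assumed format: an "inputs" string plus a
    "parameters" dict of generation settings)."""
    return {
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,
            "temperature": temperature,
        },
    }

def deploy_and_generate(prompt):
    # Import here so the payload helper above stays usable without the
    # sagemaker package installed. Running this function requires AWS
    # credentials with SageMaker permissions and provisions a paid endpoint.
    from sagemaker.jumpstart.model import JumpStartModel

    # model_id is an assumption; look up the current identifier for
    # Mistral 7B in the SageMaker JumpStart catalog.
    model = JumpStartModel(model_id="huggingface-llm-mistral-7b")
    predictor = model.deploy()  # SDK equivalent of the one-click deploy
    return predictor.predict(build_payload(prompt))
```

Remember to delete the endpoint (`predictor.delete_endpoint()`) when finished, since a deployed endpoint bills for as long as it runs.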