Grok-1 is a 314-billion-parameter Mixture-of-Experts (MoE) model trained from scratch by xAI. This is the raw base-model checkpoint from the Grok-1 pre-training phase; it has not been fine-tuned for any specific application, such as dialogue.
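To make the Mixture-of-Experts term concrete, here is a minimal sketch of a generic MoE layer in JAX. This is not Grok-1's actual implementation; all dimensions, parameter names, and the `moe_layer` function are hypothetical, and it uses the simple (dense) formulation in which every expert processes every token before the router's top-k selection is applied.

```python
# Illustrative Mixture-of-Experts layer: a router scores experts per token,
# the top-k experts are selected, and their outputs are mixed by the
# renormalized router probabilities. NOT Grok-1's code; sizes are made up.
import jax
import jax.numpy as jnp

def moe_layer(params, x, top_k=2):
    """Route each token in x [tokens, d_model] to its top_k experts."""
    w_router = params["router"]            # [d_model, n_experts]
    w_in, w_out = params["experts"]        # [n_experts, d_model, d_ff], [n_experts, d_ff, d_model]

    # Router scores -> per-token expert probabilities.
    logits = x @ w_router                  # [tokens, n_experts]
    probs = jax.nn.softmax(logits, axis=-1)

    # Keep only the top_k experts per token; renormalize their weights.
    top_p, top_idx = jax.lax.top_k(probs, top_k)     # each [tokens, top_k]
    top_p = top_p / top_p.sum(axis=-1, keepdims=True)

    # Dense (simple but inefficient) version: run every expert on every
    # token, then select. Real systems dispatch tokens to experts instead.
    def expert_fn(w1, w2):
        return jax.nn.gelu(x @ w1) @ w2    # [tokens, d_model]
    all_out = jax.vmap(expert_fn)(w_in, w_out)       # [n_experts, tokens, d_model]

    # Gather each token's chosen experts and mix by router weight.
    tok = jnp.arange(x.shape[0])[:, None]            # [tokens, 1]
    chosen = all_out[top_idx, tok]                   # [tokens, top_k, d_model]
    return jnp.einsum("tk,tkd->td", top_p, chosen)   # [tokens, d_model]

# Tiny usage example with random weights.
key = jax.random.PRNGKey(0)
d_model, d_ff, n_experts, tokens = 16, 64, 8, 4
k1, k2, k3, k4 = jax.random.split(key, 4)
params = {
    "router": jax.random.normal(k1, (d_model, n_experts)),
    "experts": (jax.random.normal(k2, (n_experts, d_model, d_ff)),
                jax.random.normal(k3, (n_experts, d_ff, d_model))),
}
x = jax.random.normal(k4, (tokens, d_model))
y = moe_layer(params, x)                   # [tokens, d_model]
```

The design point this illustrates is why an MoE model's parameter count (314B here) overstates its per-token compute: only the top-k experts selected by the router are active for any given token.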