Grok-1

The open-sourced Grok-1 model has 314 billion parameters.

Tags: Productivity, Large Language Model, Mixture-of-Experts Model
Grok-1 is a 314-billion-parameter Mixture-of-Experts (MoE) model trained from scratch by xAI. It has not been fine-tuned for any specific application (such as dialogue); it is the raw base-model checkpoint from Grok-1's pre-training phase.
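The defining trait of a Mixture-of-Experts model is that a learned router activates only a few expert sub-networks per token rather than the full network; xAI's release notes describe Grok-1 as using 8 experts with 2 active per token, so only about a quarter of the weights run on any given token. The sketch below illustrates top-2 expert routing in plain NumPy. All layer sizes, the ReLU activation, and the random initializations are illustrative assumptions for the sketch, not Grok-1's actual configuration.

```python
# Minimal sketch of top-2 Mixture-of-Experts routing.
# Sizes and weights are hypothetical; this is not Grok-1's real code or config.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 8, 2  # 8 experts, 2 active per token, as in xAI's description

# Each "expert" is a small feed-forward block: two linear maps with a nonlinearity between.
experts = [
    (rng.standard_normal((d_model, 4 * d_model)) * 0.02,
     rng.standard_normal((4 * d_model, d_model)) * 0.02)
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts)) * 0.02  # learned routing matrix

def moe_layer(x):
    """Send each token to its top-2 experts and mix their outputs by gate weight."""
    logits = x @ router                               # (tokens, n_experts) routing scores
    top = np.argsort(logits, axis=-1)[:, -top_k:]     # indices of the top-k experts per token
    gates = softmax(np.take_along_axis(logits, top, axis=-1), axis=-1)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                       # per-token dispatch; real kernels batch this
        for k in range(top_k):
            w1, w2 = experts[top[t, k]]
            h = np.maximum(x[t] @ w1, 0.0)            # ReLU stands in for the actual activation
            out[t] += gates[t, k] * (h @ w2)
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # (4, 64): output shape matches input, but only 2 of 8 experts ran per token
```

The point of the sketch is the compute saving: the layer holds the parameters of all 8 experts, but each token pays the cost of only 2 forward passes, which is how a 314-billion-parameter model keeps per-token inference cost far below its total size.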

Grok-1 Visit Over Time

Monthly Visits: 2,659,900
Bounce Rate: 38.27%
Pages per Visit: 4.8
Visit Duration: 00:02:32

[Interactive charts omitted: Grok-1 Visit Trend, Visit Geography, Traffic Sources, Alternatives]