Grok-1

The open-sourced Grok-1 model has 314 billion parameters.

Common Product | Productivity | Large Language Model | Mixture-of-Experts Model
Grok-1 is a 314-billion-parameter Mixture-of-Experts (MoE) model trained from scratch by xAI. It has not been fine-tuned for any specific application (such as dialogue); the release is the raw base checkpoint from Grok-1's pre-training phase.
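As an illustration of the Mixture-of-Experts idea (not xAI's implementation), the sketch below shows top-2 expert routing, matching Grok-1's reported configuration of 8 experts with 2 active per token. All names (moe_layer, gate_w, experts) are hypothetical, and the tiny ReLU feed-forward experts stand in for the real expert networks.

```python
import numpy as np

# Minimal top-2 Mixture-of-Experts routing sketch (hypothetical names; not xAI's code).
# Grok-1 reportedly routes each token to 2 of 8 experts, so only a fraction of the
# 314B parameters participates in any single forward pass.

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def moe_layer(tokens, gate_w, experts, top_k=2):
    """tokens: (n, d); gate_w: (d, n_experts); experts: list of (w_in, w_out) pairs."""
    logits = tokens @ gate_w                       # router scores, shape (n, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # indices of the top-k experts per token
    out = np.zeros_like(tokens)
    for i, tok in enumerate(tokens):
        weights = softmax(logits[i, top[i]])       # renormalize over the chosen experts
        for w, e_idx in zip(weights, top[i]):
            w_in, w_out = experts[e_idx]
            # Tiny ReLU feed-forward block standing in for a real expert network.
            out[i] += w * (np.maximum(tok @ w_in, 0.0) @ w_out)
    return out

# Toy usage: 4 tokens, model dim 16, 8 experts, expert hidden dim 32.
rng = np.random.default_rng(0)
d, h, n_experts = 16, 32, 8
tokens = rng.normal(size=(4, d))
gate_w = rng.normal(size=(d, n_experts))
experts = [(rng.normal(size=(d, h)), rng.normal(size=(h, d))) for _ in range(n_experts)]
print(moe_layer(tokens, gate_w, experts).shape)    # -> (4, 16)
```

Because only two of the eight expert networks run for any given token, the active compute per token is far lower than a dense model of the same total parameter count would require.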

Grok-1 Traffic Overview

Monthly Visits: 1,554,169
Bounce Rate: 48.55%
Pages per Visit: 2.2
Average Visit Duration: 00:00:53

[Charts: Grok-1 Visit Trend · Visit Geography · Traffic Sources]
