Grok-1
The open-sourced Grok-1 model has 314 billion parameters.
Tags: Productivity, Large Language Model, Mixture-of-Experts Model
Grok-1 is a 314-billion-parameter Mixture-of-Experts (MoE) model trained from scratch by xAI. It has not been fine-tuned for any specific application (such as dialogue); it is the raw base-model checkpoint from the Grok-1 pre-training stage.
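For readers unfamiliar with the architecture the description names: a Mixture-of-Experts layer replaces a single feed-forward block with several expert networks plus a router that activates only a few of them per token, which is how a 314-billion-parameter model can run with only a fraction of its weights per step (xAI's release notes state 2 of Grok-1's 8 experts are active per token). The sketch below shows the routing idea in JAX, the framework Grok-1 was released in. It is a toy illustration under assumed sizes, not xAI's implementation: every name and dimension here is hypothetical, and it runs all experts densely where a real system would dispatch tokens sparsely.

```python
# Minimal sketch of top-k Mixture-of-Experts routing, the architecture
# family Grok-1 belongs to. All names and sizes are illustrative.
import jax
import jax.numpy as jnp

NUM_EXPERTS = 8   # Grok-1 reportedly uses 8 experts (assumption for this toy)
TOP_K = 2         # 2 experts active per token, per xAI's release notes
D_MODEL = 64      # toy hidden size
D_FF = 256        # toy feed-forward size

def init_params(key):
    k_router, k_w1, k_w2 = jax.random.split(key, 3)
    return {
        # Router: scores each token against every expert.
        "router": jax.random.normal(k_router, (D_MODEL, NUM_EXPERTS)) * 0.02,
        # One two-layer feed-forward network per expert.
        "w1": jax.random.normal(k_w1, (NUM_EXPERTS, D_MODEL, D_FF)) * 0.02,
        "w2": jax.random.normal(k_w2, (NUM_EXPERTS, D_FF, D_MODEL)) * 0.02,
    }

def moe_layer(params, x):
    """x: (tokens, D_MODEL) -> (tokens, D_MODEL).

    Each token is routed to its TOP_K highest-scoring experts, and the
    expert outputs are mixed with the renormalized router weights.
    """
    logits = x @ params["router"]                       # (tokens, NUM_EXPERTS)
    weights, experts = jax.lax.top_k(logits, TOP_K)     # (tokens, TOP_K) each
    weights = jax.nn.softmax(weights, axis=-1)

    # Toy shortcut: run every expert on every token, then gather the
    # chosen outputs. Real MoE systems dispatch tokens so that only the
    # TOP_K selected experts do any work.
    h = jnp.einsum("td,edf->tef", x, params["w1"])      # (tokens, E, D_FF)
    h = jax.nn.gelu(h)
    y = jnp.einsum("tef,efd->ted", h, params["w2"])     # (tokens, E, D_MODEL)

    chosen = jnp.take_along_axis(y, experts[:, :, None], axis=1)
    return jnp.sum(weights[:, :, None] * chosen, axis=1)

key = jax.random.PRNGKey(0)
params = init_params(key)
tokens = jax.random.normal(key, (4, D_MODEL))
print(moe_layer(params, tokens).shape)  # (4, 64)
```

With 2 of 8 experts active, each token touches only the router plus a quarter of the expert weights, which is the trade-off that makes very large MoE checkpoints like Grok-1 practical to serve.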
Grok-1 Website Traffic
Monthly Visits: 1,314,947
Bounce Rate: 46.13%
Pages per Visit: 2.4
Average Visit Duration: 00:01:07