phixtral-2x2_8

A mixture-of-experts model that outperforms its individual expert models.

Phixtral-2x2_8 is the first mixture-of-experts (MoE) model built from two microsoft/phi-2 models, inspired by the mistralai/Mixtral-8x7B-v0.1 architecture, and it outperforms each individual expert model. It performs strongly across benchmark suites including AGIEval, GPT4All, TruthfulQA, and Bigbench. The model was assembled with a customized version of the mergekit library (mixtral branch) using a specific merge configuration. It has 4.46B parameters, uses the F16 tensor type, and can be run at 4-bit precision on free T4 GPUs via Colab notebooks.
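The mergekit workflow mentioned above is driven by a YAML file that names a base model and the expert models to combine. A minimal illustrative sketch of such a config follows; the field names match mergekit's MoE format, but the specific model choices, gate mode, and prompts here are assumptions, not the exact phixtral recipe:

```yaml
# Illustrative mergekit MoE config (mixtral branch).
# Model names and prompts below are placeholders, not the published phixtral settings.
base_model: microsoft/phi-2           # shared backbone the expert layers attach to
gate_mode: cheap_embed                # route tokens by prompt-embedding similarity
experts:
  - source_model: microsoft/phi-2     # first phi-2-based expert (placeholder)
    positive_prompts: ["reasoning"]   # prompts that should route to this expert
  - source_model: microsoft/phi-2     # second phi-2-based expert (placeholder)
    positive_prompts: ["coding"]
```

With a config like this, mergekit's MoE tooling stitches the experts' feed-forward layers into a single Mixtral-style checkpoint that a router then selects between at inference time.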

phixtral-2x2_8 Visit Over Time

Monthly Visits: 17,104,189
Bounce Rate: 44.67%
Pages per Visit: 5.5
Visit Duration: 00:05:49

[Charts omitted: phixtral-2x2_8 visit trend, visit geography, and traffic sources; alternatives list not included.]