Dolphin 2.9.1 Mixtral 1x22b

Advanced AI model based on Dolphin-2.9-Mixtral-8x22b

Common Product | Programming | AI Model | Text Generation
Dolphin 2.9.1 Mixtral 1x22b is a carefully trained and curated AI model from the Cognitive Computations team, based on Dolphin-2.9-Mixtral-8x22b and licensed under Apache-2.0. The model has a 64k context window and was full-weight fine-tuned at a 16k sequence length, which took 27 hours on 8x H100 GPUs. Dolphin 2.9.1 offers a variety of instruction-following, conversational, and coding skills, along with initial agentic abilities and function-calling support. The model is uncensored: its dataset was filtered to remove alignment and bias, which makes it more compliant. It is recommended that you implement your own alignment layer before exposing the model as a public service.
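Dolphin models are trained on the ChatML prompt template, so instruction-following and dialogue use start from a ChatML-formatted prompt. The sketch below builds such a prompt by hand; the helper name and the example system prompt are illustrative, not part of any official API, and in practice you would pass the resulting string to an inference backend (e.g. a Hugging Face `transformers` pipeline or a llama.cpp server) hosting the model weights.

```python
# Minimal sketch: formatting a single-turn ChatML prompt, the template
# the Dolphin model family is trained on. No model is loaded here; the
# returned string is what you would feed to your inference backend.
def build_chatml_prompt(system: str, user: str) -> str:
    """Return a ChatML prompt ending with an open assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # model generates from here
    )

prompt = build_chatml_prompt(
    "You are Dolphin, a helpful AI assistant.",  # hypothetical system prompt
    "Write a one-line Python function that reverses a string.",
)
print(prompt)
```

Note that the prompt deliberately ends with an open `<|im_start|>assistant` turn: the model's completion fills in the assistant message, and generation is typically stopped at the next `<|im_end|>` token.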
Dolphin 2.9.1 Mixtral 1x22b Visit Over Time

Monthly Visits: 17,788,201
Bounce Rate: 44.87%
Pages per Visit: 5.4
Visit Duration: 00:05:32
