Dolphin 2.9.1 Mixtral 1x22b
Advanced AI model based on Dolphin-2.9-Mixtral-8x22b
Categories: Common Product | Programming | AI Model | Text Generation
Dolphin 2.9.1 Mixtral 1x22b is a carefully trained and curated AI model from the Cognitive Computations team, based on Dolphin-2.9-Mixtral-8x22b and licensed under Apache-2.0. The model has a 64k context window and was fine-tuned with full weights at a 16k sequence length, with training taking 27 hours on 8 H100 GPUs. Dolphin 2.9.1 offers a range of instruction-following, conversational, and coding abilities, along with initial agentic capabilities and support for function calling. The model is uncensored: the dataset was filtered to remove alignment and bias, making it more compliant. It is recommended that you implement your own alignment layer before exposing the model as a public service.
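Dolphin models are typically prompted with the ChatML template, which is where a custom alignment layer (e.g., a system prompt with your own guardrails) would be injected. A minimal sketch, assuming the standard ChatML delimiters; the system-prompt text here is purely illustrative:

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-formatted prompt string for Dolphin-style models.

    The <|im_start|>/<|im_end|> delimiters follow the ChatML convention;
    the trailing open assistant turn cues the model to generate a reply.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# Example: a hypothetical alignment layer supplied via the system message.
prompt = chatml_prompt(
    system="You are a helpful assistant. Refuse unsafe requests.",
    user="Write a Python function that reverses a string.",
)
```

The resulting string would be passed to whatever inference stack serves the model (e.g., a local runtime or a hosted endpoint), since the uncensored base model itself applies no such policy.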
Dolphin 2.9.1 Mixtral 1x22b Visits Over Time
Monthly Visits
20,899,836
Bounce Rate
46.04%
Pages per Visit
5.2
Visit Duration
00:04:57