Doubao-1.5-pro

Doubao-1.5-pro is a high-performance sparse Mixture of Experts (MoE) large language model that focuses on achieving an optimal balance between inference performance and model capability.

Tags: Chinese, Selection, Productivity, Large Language Model, Multi-modal
Developed by the Doubao team, Doubao-1.5-pro achieves a strong balance between model capability and inference cost through an integrated training-inference design. It performs well across public evaluation benchmarks and shows clear advantages in inference efficiency and multi-modal capability, making it suitable for scenarios that demand efficient inference and multi-modal interaction, such as natural language processing, image recognition, and speech interaction. Technically, it is built on a sparsely activated MoE architecture: by tuning the ratio of activated to total parameters and refining the training algorithms, it achieves greater performance leverage than a traditional dense model of comparable activated size. It also supports dynamic parameter adjustment to fit diverse application scenarios and cost budgets.
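The core idea behind a sparsely activated MoE layer, as described above, is that a router selects only a few experts per token, so the activated parameter count stays a small fraction of the total. The following is a minimal illustrative sketch of top-k expert routing in NumPy; the function names, shapes, and ReLU experts are assumptions for demonstration and do not reflect Doubao-1.5-pro's actual implementation.

```python
import numpy as np

def moe_forward(x, gate_W, experts, k=2):
    """Sparse MoE forward pass for one token vector x.

    x:       (d,) input token representation
    gate_W:  (d, n_experts) router weights (hypothetical)
    experts: list of (W, b) pairs, each a small ReLU feed-forward expert
    Only the top-k experts by router score are evaluated, so the
    activated parameters are roughly k / n_experts of the total.
    """
    logits = x @ gate_W                        # router score per expert
    top = np.argsort(logits)[-k:]              # indices of the top-k experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                   # softmax over selected experts only
    out = np.zeros_like(experts[0][1], dtype=float)
    for w, i in zip(weights, top):
        W, b = experts[i]
        out += w * np.maximum(x @ W + b, 0.0)  # weighted sum of expert outputs
    return out

rng = np.random.default_rng(0)
d, n_experts = 8, 4
gate_W = rng.normal(size=(d, n_experts))
experts = [(rng.normal(size=(d, d)), rng.normal(size=d)) for _ in range(n_experts)]
y = moe_forward(rng.normal(size=d), gate_W, experts, k=2)
```

With k=2 of 4 experts, only half of the expert parameters are touched per token, which is the mechanism that lets MoE models decouple total capacity from per-token inference cost.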

Doubao-1.5-pro Visit Over Time

Monthly Visits: 58,717
Bounce Rate: 56.58%
Pages per Visit: 2.6
Visit Duration: 00:01:20
