On February 25th, Alibaba announced QwQ-Max-Preview, a reasoning model built on Qwen2.5-Max, and said it plans to fully open-source its latest reasoning models, QwQ-Max and Qwen2.5-Max.

As the name suggests, QwQ-Max-Preview is a preview release. Alibaba stated that the official version will follow soon and will be fully open-sourced under the Apache 2.0 license. Unlike previous releases, the open-sourcing will cover not only the flagship model but also smaller variants, such as QwQ-32B, that can be deployed on local devices, further broadening access to AI technology.

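For readers who want to try a locally deployable Qwen reasoning model, here is a minimal sketch using Hugging Face transformers. It assumes the smaller checkpoints are published under the Qwen organization on Hugging Face; the model identifier below refers to the currently available preview checkpoint and should be swapped for the official QwQ-32B release once it is open-sourced. Hardware requirements and exact identifiers may differ.

```python
# Minimal sketch: run a locally deployable Qwen reasoning model with transformers.
# The model ID is an assumption based on the announcement's naming; check the
# Qwen organization on Hugging Face for the actual released checkpoints.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/QwQ-32B-Preview"  # assumed identifier; substitute the official release

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick an appropriate dtype for the local hardware
    device_map="auto",    # spread the 32B weights across available devices
)

messages = [{"role": "user", "content": "How many prime numbers are there below 30?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Reasoning models tend to emit long chains of thought, so allow a generous token budget.
output_ids = model.generate(inputs, max_new_tokens=2048)
print(tokenizer.decode(output_ids[0][inputs.shape[-1]:], skip_special_tokens=True))
```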

According to the LiveCodeBench evaluation, QwQ-Max-Preview performs on par with OpenAI's o1-medium and surpasses DeepSeek R1. The result indicates that Alibaba's reasoning models have made significant strides, particularly in inference speed and the accuracy of code generation. Alibaba also launched the qwen.ai domain, giving users direct access to its latest reasoning models.

The open-sourced QwQ-Max and Qwen2.5-Max models are expected to give developers and businesses more powerful reasoning capabilities, particularly for code generation, multimodal processing, and complex task solving.

Alibaba's open-source initiative is both a quick response to market demand and a key step in its broader AI strategy. By fully open-sourcing these models, Alibaba gives developers powerful tools and lays the groundwork for the wider adoption of AI technology. With the official release of QwQ-Max and Qwen2.5-Max, Alibaba is expected to maintain its leading position in the AI field and help drive the industry forward.