On February 12, Alibaba Cloud's Bailian platform launched six AI models, including DeepSeek-V3, DeepSeek-R1, and DeepSeek-R1-Distill-Qwen-32B, further enriching its AI model matrix. At the same time, Tongyi Lingma announced a new model selection feature, supporting the full 671B-parameter versions of DeepSeek-V3 and DeepSeek-R1 and injecting new vitality into the field of AI programming.
Tongyi Lingma is an AI coding assistant jointly produced by Alibaba Cloud and Tongyi Laboratory, providing intelligent code generation and smart Q&A capabilities for developers. In January of this year, the Tongyi Lingma AI programmer was fully launched with support for VS Code and JetBrains IDEs, becoming the first AI programmer to be put into real-world use in China. The tool covers both front-end and back-end development, can complete complex coding tasks from scratch, and introduces multi-file code modification, allowing developers to automatically carry out multi-file coding tasks such as requirement implementation, bug fixing, and batch generation of unit tests.
The newly launched model selection feature further enhances Tongyi Lingma's flexibility. Users can search for and download the latest Tongyi Lingma plugin in VS Code or JetBrains IDEs and switch between models directly from the input box. In actual development, users can freely switch among models such as Qwen2.5, DeepSeek-V3, and DeepSeek-R1 based on the scenario at hand. These models can accurately understand requirements and quickly generate high-quality code snippets, handling both complex algorithms and simple logic with ease.
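For readers curious what "switching models" means in practice: large-model platforms of this kind typically expose an OpenAI-compatible chat API, where changing the model amounts to changing a single `model` field in the request. The sketch below is a minimal, hypothetical illustration of that idea; the endpoint URL and model identifiers are assumptions for illustration and are not taken from this announcement.

```python
import json

# Hypothetical endpoint for an OpenAI-compatible chat API
# (an assumption for illustration, not from the announcement).
CHAT_ENDPOINT = "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body.

    Switching models only changes the "model" field; the rest of
    the request stays identical.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# The same prompt can be routed to different models by swapping one field.
# Model names here are illustrative placeholders.
for model in ("qwen2.5", "deepseek-v3", "deepseek-r1"):
    body = build_chat_request(model, "Write a binary search in Python.")
    print(json.dumps(body, ensure_ascii=False))
```

This is why an IDE plugin can offer model switching as a lightweight dropdown: the prompt, conversation history, and response handling are unchanged across models.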
Currently, Tongyi Lingma's intelligent Q&A feature supports the Qwen2.5, DeepSeek-V3, and DeepSeek-R1 models, while the AI programmer supports Qwen2.5 and DeepSeek-V3. By letting users choose models based on their needs, Tongyi Lingma further lowers the barrier to AI programming.