Website Home (ChinaZ.com) June 18th News: DeepSeek has announced DeepSeek-Coder-V2, an open-source model that surpasses GPT-4-Turbo in coding and mathematical capabilities while significantly expanding multilingual support and context length. Built on the DeepSeek-V2 architecture, DeepSeek-Coder-V2 employs a Mixture of Experts (MoE) framework designed specifically to strengthen coding and mathematical reasoning.

DeepSeek-Coder-V2 ranks among the top models globally, excelling particularly in code generation and mathematical reasoning. The model, together with its code and papers, is fully open-sourced and free for commercial use, with no application required. It is available in two sizes, 236B and 16B parameters, to suit different application needs.


In terms of multilingual support, DeepSeek-Coder-V2 expands the number of supported programming languages from 86 to 338, meeting more diverse development needs. Its context length has also been extended from 16K to 128K tokens, allowing it to handle much longer inputs. DeepSeek-Coder-V2 additionally offers an API service with a 32K context, priced the same as DeepSeek-V2.
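As a rough illustration of what using such an API service might look like, the sketch below builds an OpenAI-style chat-completion request. The endpoint URL, model name `deepseek-coder`, and payload shape are assumptions for illustration, not details from the article; consult DeepSeek's official API documentation for the actual contract.

```python
# Minimal sketch of preparing a DeepSeek API call for a coding task.
# NOTE: the endpoint path, the model identifier "deepseek-coder", and
# the OpenAI-style payload below are assumptions, not confirmed values.
import json
import urllib.request

API_URL = "https://api.deepseek.com/chat/completions"  # assumed endpoint

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Construct a chat-completion request without sending it."""
    payload = {
        "model": "deepseek-coder",  # assumed model identifier
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_request("Write a Python function that reverses a string.", "sk-...")
# urllib.request.urlopen(req) would actually send the request; it is
# omitted here to keep the sketch side-effect free.
```

Sending the request with a valid key would return a JSON body containing the model's completion, in the style of other chat-completion APIs.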

In standard benchmark tests, DeepSeek-Coder-V2 outperforms several closed-source models in code generation, code completion, code repair, and mathematical reasoning. Users can download different versions of the model, including base and instruct variants at both parameter sizes.
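The variants mentioned above can be thought of as a small matrix of size and tuning. The sketch below maps those combinations to Hugging Face repository ids; the repo names follow the naming pattern of the `deepseek-ai` organization but are assumptions to verify against the project page before use.

```python
# Sketch: selecting a DeepSeek-Coder-V2 checkpoint by size and variant.
# The repository names below are assumed from the deepseek-ai naming
# pattern on Hugging Face; verify them before downloading.
CHECKPOINTS = {
    ("16B", "base"): "deepseek-ai/DeepSeek-Coder-V2-Lite-Base",
    ("16B", "instruct"): "deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct",
    ("236B", "base"): "deepseek-ai/DeepSeek-Coder-V2-Base",
    ("236B", "instruct"): "deepseek-ai/DeepSeek-Coder-V2-Instruct",
}

def checkpoint(size: str, variant: str) -> str:
    """Return the assumed Hugging Face repo id for a size/variant pair."""
    try:
        return CHECKPOINTS[(size, variant)]
    except KeyError:
        raise ValueError(f"unknown combination: {size}/{variant}") from None

repo_id = checkpoint("16B", "instruct")
# With the `transformers` library installed, the model could then be
# loaded via AutoModelForCausalLM.from_pretrained(repo_id); loading is
# omitted here to keep the sketch dependency-free.
```

The 16B ("Lite") checkpoints are the practical choice for local experimentation, while the 236B checkpoints target large-scale deployments.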

DeepSeek also provides an online demo, GitHub links, and a technical report, making it easier for users to understand and adopt DeepSeek-Coder-V2. The release brings powerful coding and mathematical capabilities to the open-source community and contributes to the advancement and application of related technologies.

Project Link: https://top.aibase.com/tool/deepseek-coder-v2

Online Experience: https://chat.deepseek.com/sign_in