DeepSeek-Coder-V2
An open-source code language model that enhances programming intelligence.
Tags: Premium, New Product, Programming, Code Generation, Programming Assistance
DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo on code-specific tasks. Built upon DeepSeek-Coder-V2-Base, it has been further pre-trained on a high-quality, multi-source corpus of 6 trillion tokens, substantially strengthening its coding and mathematical reasoning capabilities while preserving its performance on general language tasks. Supported programming languages have expanded from 86 to 338, and the context length has grown from 16K to 128K tokens.
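Because DeepSeek-Coder-V2 is released as an open-weight model, it can typically be run locally through the Hugging Face transformers library. The following is a minimal sketch, not an official usage guide: the model ID deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct, the trust_remote_code flag, and the chat-template call are assumptions based on common Hugging Face conventions rather than details confirmed on this page.

```python
# Minimal sketch of code generation with DeepSeek-Coder-V2 via transformers.
# The model ID below is an assumption; check the deepseek-ai Hugging Face
# organization for the exact published checkpoints.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct"  # assumed model ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit the MoE weights
    device_map="auto",
    trust_remote_code=True,
)

# Ask the instruct model for a small coding task via its chat template.
messages = [{"role": "user", "content": "Write a quicksort function in Python."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))
```

In practice, the full MoE checkpoint is large; the smaller "Lite" variant sketched here is the usual choice for single-GPU experimentation, and device_map="auto" lets transformers shard the weights across available devices.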
DeepSeek-Coder-V2 Visits Over Time
Monthly Visits: 494,758,773
Bounce Rate: 37.69%
Pages per Visit: 5.7
Visit Duration: 00:06:29