DeepSeek-Coder-V2-Lite-Instruct

An open-source code language model that supports multiple programming languages.

Categories: Common Product · Programming · Code Generation · Mathematical Reasoning
DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model whose performance on code-specific tasks rivals that of GPT-4-Turbo. Further pre-trained on an additional 6 trillion tokens, it substantially strengthens coding and mathematical reasoning while maintaining comparable performance on general language tasks. Compared to DeepSeek-Coder-33B, it shows significant improvements across code-related tasks, reasoning, and general capabilities. It also expands language support from 86 to 338 programming languages and extends the context length from 16K to 128K tokens.
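Instruct-tuned models like this one are typically queried through a chat-style interface. As a minimal sketch, assuming the model is served behind an OpenAI-compatible chat-completions endpoint (the model name and system prompt below are illustrative, not from the listing), a request payload could be built like this:

```python
import json


def build_chat_request(prompt, model="deepseek-coder", max_tokens=256, temperature=0.0):
    """Assemble an OpenAI-compatible chat-completion payload.

    The model identifier and system prompt are placeholders; substitute
    whatever your serving stack actually exposes.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


# Example: ask for a code-generation task and inspect the payload.
payload = build_chat_request("Write an in-place quicksort in Python.")
print(json.dumps(payload, indent=2))
```

With a temperature of 0.0 the request asks for deterministic output, which is a common choice for code generation; raise it for more varied completions.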

DeepSeek-Coder-V2-Lite-Instruct Visit Over Time

Monthly Visits: 18,200,568
Bounce Rate: 44.11%
Pages per Visit: 5.8
Average Visit Duration: 00:05:46
