DeepSeek-Coder-V2-Lite-Instruct
An open-source code language model that supports multiple programming languages.
Common Product · Programming · Code Generation · Mathematical Reasoning
DeepSeek-Coder-V2 is an open-source Mixture-of-Experts code language model whose performance on code-specific tasks rivals that of GPT-4 Turbo. Further pre-trained with an additional 6 trillion tokens, it strengthens coding and mathematical reasoning abilities while maintaining comparable performance on general language tasks. Compared to DeepSeek-Coder-33B, it shows significant improvements across code-related tasks, reasoning, and general capabilities. It also supports 338 programming languages (up from 86) and extends the context length from 16K to 128K tokens.
DeepSeek-Coder-V2-Lite-Instruct Visits Over Time
Monthly Visits: 17,788,201
Bounce Rate: 44.87%
Pages per Visit: 5.4
Visit Duration: 00:05:32