DeepSeek LLM, China's latest open-source large language model with 67 billion parameters, outperforms the 70-billion-parameter Llama 2 in its class. It stands out for its reasoning, mathematical, and coding abilities, posting strong results on math benchmarks. Released as open source with free commercial use and no application process required, it is available in 7-billion and 67-billion parameter versions. It was developed by DeepSeek (深度求索), a company spun off from the quantitative fund giant High-Flyer (Huanfang), with the stated goal of achieving AGI. DeepSeek has also released DeepSeek Coder, a code-focused large model that leads on code generation tasks.