2023-09-27 16:14:42 · AIbase · 1.7k
Ant Group Releases 4-bit Quantized Version of Open-Source Code Model CodeFuse-CodeLlama-34B
Ant Group has released a 4-bit quantized version of CodeFuse-CodeLlama-34B. The quantized model can be loaded on a single A10 or RTX 4090 GPU and achieves 73.8% pass@1 on the HumanEval benchmark. CodeFuse is Ant Group's self-developed large language model for code generation, designed to boost developer productivity through intelligent code suggestions and real-time assistance.