
llama-moe

LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)

Created: 2023-07-24T14:15:51
Updated: 2025-03-25T19:43:24
Paper: https://arxiv.org/abs/2406.16554
Stars: 945
Stars increase: 0