Training a Chinese Version of LLaMA2 in 15 Hours for a Few Thousand Yuan
站长之家 (Chinaz) reports: The Colossal-AI team has trained a Chinese version of the LLaMA2 large model in just 15 hours, at a compute cost of only a few thousand yuan, using roughly 8.5 billion tokens of training data together with techniques such as vocabulary expansion, data filtering, and a multi-stage training strategy. This low-cost approach allows the Chinese LLaMA2 to match or surpass state-of-the-art models of the same scale on multiple Chinese evaluation tasks. The entire training process, code, and weights have been open-sourced, so the recipe can be readily transferred to other languages and domains to build large models quickly and cheaply, and it has already produced good results in several industries.
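One of the techniques mentioned above, vocabulary expansion, is typically done by adding Chinese tokens to the base LLaMA2 tokenizer and enlarging the model's embedding matrix before continued pretraining on Chinese data. The sketch below illustrates that general idea with the Hugging Face transformers API; the checkpoint name and the sample tokens are placeholder assumptions, and this is not the Colossal-AI team's actual implementation.

```python
# Minimal sketch of vocabulary expansion for a LLaMA-2-style model.
# Assumptions: access to the "meta-llama/Llama-2-7b-hf" checkpoint and a
# hypothetical list of Chinese tokens mined from a Chinese corpus.
from transformers import AutoTokenizer, AutoModelForCausalLM

base_model = "meta-llama/Llama-2-7b-hf"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# Hypothetical extra Chinese tokens; in practice these come from training a
# Chinese tokenizer on a large corpus and merging its vocabulary.
new_tokens = ["你好", "人工智能", "大模型"]
num_added = tokenizer.add_tokens(new_tokens)

# Grow the input/output embeddings to cover the enlarged vocabulary; the new
# rows are randomly initialized and then learned during continued pretraining.
if num_added > 0:
    model.resize_token_embeddings(len(tokenizer))

print(f"Added {num_added} tokens; new vocab size: {len(tokenizer)}")
```

The benefit of expanding the vocabulary is that common Chinese words are encoded as single tokens rather than many byte-level pieces, which shortens sequences and improves both training efficiency and downstream Chinese performance.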
© AIbase 2024. Source: https://www.aibase.com/news/1640