ELYZA Releases Japanese LLM Based on Llama 2, with 7 Billion Parameters, Competing with GPT-3.5

Japanese AI startup ELYZA has released a Japanese language model based on Meta's Llama 2. With 7 billion parameters, it rivals the performance of GPT-3.5. Enhanced through additional pre-training and ELYZA's own post-training, the model achieved the highest score in a 5-point manual evaluation. By carrying the capabilities of LLMs trained in other languages over to Japanese, ELYZA reduced the amount of Japanese-language training data needed.
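For readers who want to try the model, here is a minimal sketch of loading a Llama-2-based checkpoint with the Hugging Face transformers library. The repo id "elyza/ELYZA-japanese-Llama-2-7b" is an assumption based on the announcement, not confirmed by this article; check ELYZA's official release for the exact name.

```python
# Minimal sketch: loading a Llama-2-based Japanese model via Hugging Face
# transformers. The repo id below is an assumption; verify against the
# official ELYZA release.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "elyza/ELYZA-japanese-Llama-2-7b"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 7B model fits on one ~16 GB GPU in fp16
    device_map="auto",
)

prompt = "日本の首都はどこですか？"  # "What is the capital of Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```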