2023-09-11 15:18:27 · AIbase
ELYZA Releases 7-Billion-Parameter Japanese LLM Based on Llama 2, Rivaling GPT-3.5
ELYZA has released 'ELYZA-japanese-Llama-2-7b', a Japanese LLM built on Meta's Llama 2 with 7 billion parameters and performance comparable to GPT-3.5. The model underwent additional pre-training and ELYZA's own post-training, achieving the top score in a 5-point human evaluation. Although it has not yet reached the level of closed-source LLMs, it is already on par with GPT-3.5. ELYZA has also successfully developed LLMs in other languages, including English.