Japanese AI startup ELYZA has released a Japanese large language model based on Meta's Llama 2. With 7 billion parameters, the model rivals GPT-3.5 in performance. Built through additional pre-training and ELYZA's own post-training, it achieved the highest score in a human evaluation on a 5-point scale. By transferring the capabilities that LLMs have acquired in other languages over to Japanese, ELYZA improved the model while reducing the amount of Japanese training data required.