Stability AI Japan has released two Japanese language models: "Japanese Stable LM 3B-4E1T", with 3 billion parameters, and "Japanese Stable LM Gamma 7B", with 7 billion. Both are built on English base models and were further pre-trained on large amounts of Japanese and English data to strengthen their Japanese language capabilities. The release marks a significant step forward for natural language processing in Japan, offering users strong performance in relatively compact, portable models. In performance evaluations, the 3B model performs well across multiple tasks despite its smaller parameter count, while the Gamma 7B model achieves even higher scores, demonstrating notable progress in Japanese natural language processing.