The Japanese government, working with major technology companies such as NEC, Fujitsu, and SoftBank, is investing hundreds of millions of dollars to develop Japanese-language models attuned to the country's culture. These models, trained on the national supercomputer Fugaku, will have at least 30 billion parameters and are intended to address the shortcomings of existing models in the Japanese market. Researchers have also developed the Rakuda ranking system, which scores models on their cultural sensitivity; GPT-3.5 currently ranks first. In addition, the Japanese government plans to build larger models, with at least 100 billion parameters, for scientific applications, to be made publicly available by 2031.