The Aya model is a massively multilingual generative language model that follows instructions in 101 languages. It outperforms mT0 and BLOOMZ across a range of automatic and human evaluations, despite covering twice as many languages. Aya was trained on multiple datasets, including xP3x, the Aya dataset, the Aya collection, a subset of the DataProvenance collection, and ShareGPT-Command, and is released under the Apache-2.0 license to advance multilingual technology.