INTELLECT-1-Instruct is a 10 billion parameter language model trained from scratch on 1 trillion tokens of English text and code by Prime Intellect. The model supports text generation and was trained in a distributed fashion, demonstrating high-performance training across unreliable, globally distributed workers. Training relied on the DiLoCo algorithm, which synchronizes workers only infrequently, together with a custom int8 all-reduce kernel that significantly reduces communication overhead. The run received compute from 30 independent community contributors, with up to 14 nodes training concurrently across three continents.
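For text generation, the model can be loaded with the standard Hugging Face `transformers` API. This is a minimal sketch, assuming the model is published on the Hub under the id `PrimeIntellect/INTELLECT-1-Instruct` and ships a chat template:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub id for the instruct-tuned checkpoint.
model_id = "PrimeIntellect/INTELLECT-1-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Format a single-turn chat prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain distributed training in one paragraph."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```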
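DiLoCo tolerates slow and unreliable links by letting each worker take many local optimizer steps between synchronizations, then averaging only the resulting weight deltas ("pseudo-gradients"). The following is a minimal sketch of that pattern in PyTorch, not Prime Intellect's implementation; the function name and `inner_steps` default are illustrative:

```python
import torch
import torch.distributed as dist

def diloco_outer_step(model, inner_opt, outer_opt, data_iter, loss_fn, inner_steps=500):
    """One DiLoCo round: many local steps, then a single synchronization."""
    world_size = dist.get_world_size()

    # Snapshot the last globally synchronized weights.
    global_params = [p.detach().clone() for p in model.parameters()]

    # Inner loop: ordinary local training with no communication at all.
    for _ in range(inner_steps):
        inputs, targets = next(data_iter)
        loss = loss_fn(model(inputs), targets)
        inner_opt.zero_grad()
        loss.backward()
        inner_opt.step()

    # Outer step: average the pseudo-gradient (synchronized weights minus the
    # locally trained weights) across workers, then step from the old
    # synchronized weights.
    for p, g in zip(model.parameters(), global_params):
        delta = g - p.detach()
        dist.all_reduce(delta, op=dist.ReduceOp.SUM)
        p.grad = delta / world_size
        p.data.copy_(g)  # rewind to the synchronized weights before stepping

    outer_opt.step()  # the DiLoCo paper uses Nesterov momentum SGD here
    outer_opt.zero_grad()
```

Because communication happens once per outer round instead of once per batch, the bandwidth requirement drops by roughly a factor of `inner_steps`.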
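The int8 all-reduce compresses each synchronization further by sending 8-bit values instead of 32-bit floats. The sketch below only emulates the idea, assuming symmetric per-tensor quantization and an `all_gather` exchange; the actual custom kernel fuses quantization into the reduction itself:

```python
import torch
import torch.distributed as dist

def int8_average(t: torch.Tensor, world_size: int) -> torch.Tensor:
    """Average a tensor across ranks while keeping the wire format at 8 bits."""
    # Symmetric per-tensor quantization: map [-max, max] onto [-127, 127].
    scale = (t.abs().amax() / 127.0).clamp(min=1e-12).reshape(1)
    q = (t / scale).round().clamp(-127, 127).to(torch.int8)

    # Exchange the int8 tensors and their scales; a fused all-reduce kernel
    # would instead requantize partial sums per hop.
    gathered_q = [torch.empty_like(q) for _ in range(world_size)]
    gathered_s = [torch.empty_like(scale) for _ in range(world_size)]
    dist.all_gather(gathered_q, q)
    dist.all_gather(gathered_s, scale)

    # Dequantize and average locally on every rank.
    return torch.stack(
        [qi.to(t.dtype) * si for qi, si in zip(gathered_q, gathered_s)]
    ).mean(dim=0)
```

Applied to the pseudo-gradients above, this cuts per-round traffic to roughly a quarter of the float32 size, at the cost of quantization error that the outer optimizer must absorb.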