Amazon recently announced a significant investment: $110 million for artificial intelligence (AI) research, aimed at reducing its reliance on Nvidia and advancing the development of its own chips. The funding will support generative AI research at universities under a program called "Build on Trainium."
The program will give researchers access to Trainium chips, enabling them to develop new AI architectures, machine learning libraries, and performance optimizations for large-scale distributed AWS Trainium UltraClusters.
AWS Trainium is a custom machine learning chip designed specifically for deep learning training and inference. Amazon said the program spans a broad range of research directions, from algorithmic innovation and AI accelerator performance to large-scale distributed systems. As part of the "Build on Trainium" initiative, Amazon has built a research UltraCluster containing up to 40,000 Trainium chips, optimized for unique AI workloads and computational structures.
Amazon added that any AI advances produced under the program will be released as open source, allowing researchers and developers to build on the work. Separately, in August, Amazon announced a $4 billion investment in Anthropic, the developer of Claude and a competitor to OpenAI.
The "Build on Trainium" program will also fund new research and student education. Amazon plans to run multiple rounds of research award selections, with winning proposals receiving AWS training credits and access to large Trainium UltraClusters. The Catalyst research group at Carnegie Mellon University has already joined the program.
Todd C. Mowry, a professor of computer science at the university, said: "AWS's Build on Trainium program gives our faculty and students large-scale access to modern accelerators like AWS Trainium, together with an open programming model. This lets us significantly expand our research in tensor program compilation, machine learning parallelization, and language model serving and tuning."
Key Points:
📈 Amazon is investing $110 million to advance AI research and reduce its dependence on Nvidia.
💻 The "Build on Trainium" program supports university research by providing access to Trainium chips.
🌍 Research outcomes will be open-sourced, fostering continuous development and innovation in AI technology.