Apple Inc. recently announced that, in developing its artificial intelligence models, it opted for Google's Tensor Processing Units (TPUs) rather than the NVIDIA Graphics Processing Units (GPUs) that are widely used across the industry. The disclosure came in a technical paper Apple released on Monday detailing its latest advances in AI capabilities.
Image source: AI-generated image (service provider: Midjourney)
Apple disclosed that it trained its Apple Foundation Model (AFM) on Google Cloud TPU clusters, specifically the v4 and v5p generations. The paper states: "The AFM model was pre-trained on v4 and v5p cloud TPU clusters using the AXLearn framework, a JAX-based deep learning library designed for public cloud environments."
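For context on what a JAX-based training stack looks like in practice, here is a minimal, hypothetical sketch in plain JAX (not Apple's code and not the AXLearn API): a training step is JIT-compiled by XLA and dispatched to whatever accelerators the runtime exposes, whether TPU, GPU, or CPU. The toy model and all function names below are illustrative assumptions.

```python
# Minimal, illustrative JAX sketch (not Apple's code and not the AXLearn API).
# JAX compiles the same Python training step for whichever backend is present,
# which is what makes targeting Cloud TPU v4/v5p largely a configuration choice.
import jax
import jax.numpy as jnp

def init_params(key, d_in=128, d_out=128):
    # Toy single-layer "model"; a real foundation model would be vastly larger.
    w_key, _ = jax.random.split(key)
    return {
        "w": jax.random.normal(w_key, (d_in, d_out)) * 0.02,
        "b": jnp.zeros((d_out,)),
    }

def loss_fn(params, x, y):
    # Simple mean-squared-error loss on a linear layer, purely for illustration.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit  # XLA-compiles the step for the available backend (TPU, GPU, or CPU).
def train_step(params, x, y, lr=1e-3):
    loss, grads = jax.value_and_grad(loss_fn)(params, x, y)
    # Plain SGD update; real pre-training would use an optimizer such as AdamW.
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params, loss

if __name__ == "__main__":
    print("Devices visible to JAX:", jax.devices())  # lists TPU cores on a TPU VM
    key = jax.random.PRNGKey(0)
    params = init_params(key)
    x = jax.random.normal(key, (32, 128))
    y = jax.random.normal(key, (32, 128))
    params, loss = train_step(params, x, y)
    print("toy loss:", float(loss))
```

On a Cloud TPU VM, the same script would list TPU cores under `jax.devices()` and compile for them without modification; that backend portability is one practical reason JAX-based libraries are written to target Cloud TPUs.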
The decision is particularly significant for Apple's upcoming "Apple Intelligence" initiative, which will power a range of new AI features across the Apple ecosystem. It is also notable because NVIDIA dominates AI hardware: its GPUs are the preferred choice for leading AI research worldwide, and the company is generally estimated to hold roughly 95% of the AI chip market.
Apple's choice of Google's TPUs over NVIDIA's GPUs signals that major tech companies are exploring alternatives. It also reflects the growing maturity of Google's custom TPUs, which were originally developed for internal use and are now offered to external cloud customers. Meanwhile, Meta recently partnered with Groq to use its LPUs for serving and inference of Llama 3.1.
In the paper, Apple noted that it used 8,192 TPUv4 chips to train its server-side AI model, while the on-device model intended for iPhones and other Apple hardware was trained on 2,048 TPUv5p chips. Deployments at this scale underscore Apple's ambitions in AI and the computational power required to train advanced language models.
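The article does not describe how thousands of chips are harnessed for a single training run, but the basic data-parallel pattern can be sketched in a few lines of JAX. The toy example below is an assumption for illustration, not Apple's actual training configuration: it shards a batch across the local devices and averages gradients with a collective, which is the core building block behind scaling training over large TPU pods.

```python
# Toy data-parallel step in JAX (illustrative assumption, not Apple's setup):
# each device receives one shard of the batch, and gradients are averaged
# across devices with an all-reduce before the parameter update.
import jax
import jax.numpy as jnp

def loss_fn(w, x, y):
    # Mean-squared error on a single linear layer, purely for illustration.
    return jnp.mean((x @ w - y) ** 2)

def step(w, x, y):
    lr = 1e-3
    loss, grads = jax.value_and_grad(loss_fn)(w, x, y)
    grads = jax.lax.pmean(grads, axis_name="devices")  # all-reduce across chips
    loss = jax.lax.pmean(loss, axis_name="devices")
    return w - lr * grads, loss

# pmap replicates the step across every local device (e.g. TPU cores on one host).
parallel_step = jax.pmap(step, axis_name="devices")

n_dev = jax.local_device_count()
key = jax.random.PRNGKey(0)
w = jnp.zeros((n_dev, 64, 64))               # parameters replicated per device
x = jax.random.normal(key, (n_dev, 16, 64))  # per-device batch shards
y = jax.random.normal(key, (n_dev, 16, 64))
w, loss = parallel_step(w, x, y)
print("devices:", n_dev, "loss per device:", loss)
```

In practice, training across thousands of chips on multiple hosts combines data parallelism with other forms of model partitioning rather than a single `pmap`, but the gradient all-reduce shown here remains the common primitive.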
Industry analysts speculate that the choice may come down to cost, performance, or strategic partnerships, but Apple has not said why it selected Google's TPUs. As Apple begins rolling out early "Apple Intelligence" features to test users, it is clear the company intends to compete more aggressively in AI.
Key Points:
1. 📊 Apple's choice of Google TPUs for training AI models, rather than NVIDIA GPUs, marks a diversification in AI hardware selection.
2. 🌐 Apple's "Apple Intelligence" initiative will leverage TPUs for AI feature development, showcasing the company's ambition in the AI field.
3. 🤖 Apple deployed a large number of TPU chips to meet the training needs of its AI models, demonstrating its commitment to and investment in the technology.