OpenAI's recent moves suggest the AI giant is actively seeking to diversify its cloud computing suppliers, a strategy that could have profound implications for its long-standing partnership with Microsoft.

According to The Information, following OpenAI's recent $6.6 billion fundraising, CEO Sam Altman and CFO Sarah Friar disclosed this strategic shift to employees. Friar informed shareholders that Microsoft could not provide the necessary processing power quickly enough, prompting OpenAI to explore other data center options. Notably, OpenAI's contract with Microsoft permits such actions.

Image source note: The image was generated by AI, with image licensing provided by the service provider Midjourney.

Altman's concerns are driven by the competition. He fears that if Microsoft cannot deliver servers quickly enough, OpenAI's lead over Elon Musk's xAI could erode. Musk plans to release what he claims will be the most powerful AI model, Grok 3, by the end of the year, and xAI is building a massive server cluster in Memphis.

In this context, OpenAI is deepening its partnership with Oracle. In June, OpenAI announced its first collaboration with Oracle, with Microsoft only marginally involved. Despite this, the deal still contributed to Microsoft's Azure business, as OpenAI runs Azure infrastructure on Oracle servers.

Sources from The Information reveal that OpenAI is currently in talks with Oracle to lease an entire data center in Abilene, Texas. By mid-2026, the Abilene facility could reach nearly 1 gigawatt of power, potentially housing hundreds of thousands of Nvidia AI chips. With sufficient energy supply, the data center could expand to 2 gigawatts.
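The reported figures imply a rough ratio between facility power and chip count. Below is a minimal back-of-envelope sketch of that arithmetic; the per-accelerator draw and overhead factor are illustrative assumptions, not numbers from The Information's report.

```python
# Rough check: how many accelerators could a ~1 GW facility power?
# Assumptions (illustrative, not from the report):
#   - ~1.2 kW drawn per GB200-class accelerator
#   - ~1.5x facility overhead for cooling, networking, and host systems

CHIP_POWER_KW = 1.2        # assumed per-accelerator power draw
OVERHEAD_FACTOR = 1.5      # assumed facility overhead multiplier
FACILITY_POWER_GW = 1.0    # ~1 gigawatt, per the report

power_per_chip_kw = CHIP_POWER_KW * OVERHEAD_FACTOR
chips = FACILITY_POWER_GW * 1_000_000 / power_per_chip_kw
print(f"~{chips:,.0f} accelerators")  # roughly 556,000 with these assumptions
```

With these assumed values, 1 gigawatt supports on the order of half a million accelerators, which is consistent with the report's "hundreds of thousands of Nvidia AI chips"; an expansion to 2 gigawatts would scale that figure roughly linearly.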

Meanwhile, Microsoft is also working to meet OpenAI's demands. It plans to supply OpenAI with approximately 300,000 of Nvidia's latest GB200 processors by the end of next year at data centers in Wisconsin and Atlanta. Altman has asked Microsoft to expedite the Wisconsin project, which could partially come online in the second half of 2025.

Facing growing computational demands and cost pressures, OpenAI plans to use more of its own AI chips in the future. The company is collaborating with Broadcom and Marvell to design ASIC chips and has reportedly reserved capacity for TSMC's new A16 Angstrom process, with mass production scheduled to begin in the second half of 2026.

These initiatives reflect OpenAI's determination to maintain technological leadership in the rapidly evolving AI field. However, they also raise some critical questions:

  1. Will OpenAI's partnership with Microsoft be affected? Although the current contract allows OpenAI to seek other suppliers, this diversification strategy could alter the dynamics of the relationship in the long term.
  2. Can Oracle become a reliable partner for OpenAI? Given Oracle's relatively weak position in the cloud computing market, it remains to be seen whether it can meet OpenAI's growing demands.
  3. What impact will OpenAI's strategy of developing its own chips have? This could not only change its relationship with existing hardware suppliers but also have far-reaching implications for the entire AI chip market.
  4. Can this diversification strategy help OpenAI maintain its lead against competitors like xAI? Changes in the competitive landscape could reshape the trajectory of the entire AI industry.
  5. How will issues of energy supply and sustainability be addressed? As data center sizes and energy consumption surge, ensuring sustainable development will become a significant challenge.