Will the Circle Be Unbroken?
Deal will provide access to hundreds of thousands of Nvidia chips that power ChatGPT.
On Monday, OpenAI announced it has signed a seven-year, $38 billion deal to buy cloud services from Amazon Web Services to power products like ChatGPT and Sora. It’s the company’s first big computing deal after a fundamental restructuring last week that gave OpenAI more operational and financial freedom from Microsoft.
The agreement gives OpenAI access to hundreds of thousands of Nvidia graphics processors to train and run its AI models. “Scaling frontier AI requires massive, reliable compute,” OpenAI CEO Sam Altman said in a statement. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
OpenAI will reportedly begin using Amazon Web Services immediately, with all planned capacity set to come online by the end of 2026 and room to expand further in 2027 and beyond. Amazon plans to roll out hundreds of thousands of chips, including Nvidia’s GB200 and GB300 AI accelerators, in data center clusters built to power ChatGPT’s responses, generate AI videos, and train OpenAI’s next wave of models.
Wall Street apparently liked the deal, because Amazon shares hit an all-time high on Monday morning. Meanwhile, shares of long-time OpenAI investor and partner Microsoft briefly dipped following the announcement.
Massive AI compute requirements
It’s no secret that running generative AI models for hundreds of millions of people currently requires a lot of computing power. Amid chip shortages over the past few years, finding sources of that computing muscle has been tricky. OpenAI is reportedly working on its own GPU hardware to help alleviate the strain.
But for now, the company needs to find new sources of Nvidia chips, which accelerate AI computations. Altman has previously said that the company plans to spend $1.4 trillion to develop 30 gigawatts of computing resources, roughly enough to power 25 million US homes, according to Reuters.
Altman has also said that eventually, he would like OpenAI to add 1 gigawatt of compute every week. That ambitious plan is complicated by the fact that one gigawatt of power is roughly equivalent to the output of one typical nuclear power plant, and Reuters reports that each gigawatt of compute build-out currently comes with a capital cost of over $40 billion.
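Those figures roughly hang together. As a quick back-of-envelope sketch (in Python, with the ~1.2 kW average US household power draw as an assumed figure that does not appear in the article), $1.4 trillion spread over 30 gigawatts implies a little under $47 billion per gigawatt, consistent with Reuters' "over $40 billion" estimate, and 30 gigawatts divided by that household draw lands near 25 million homes:

# Back-of-envelope check of the compute build-out figures cited above.
# The ~1.2 kW average household draw is an assumption (typical US figure), not from the article.

total_spend_usd = 1.4e12        # Altman's stated $1.4 trillion plan
planned_capacity_gw = 30        # 30 gigawatts of compute

cost_per_gw = total_spend_usd / planned_capacity_gw
print(f"Implied cost per gigawatt: ${cost_per_gw / 1e9:.0f} billion")  # ~$47 billion

avg_home_draw_kw = 1.2          # assumed average household power draw
homes_powered = planned_capacity_gw * 1e6 / avg_home_draw_kw
print(f"Homes powered by 30 GW: ~{homes_powered / 1e6:.0f} million")   # ~25 million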
These aspirational numbers are far beyond what long-time cloud partner Microsoft can provide, so OpenAI has been seeking further independence from its wealthy corporate benefactor. OpenAI’s restructuring last week moved the company further from its nonprofit roots and removed Microsoft’s right of first refusal to supply compute services in the new arrangement.
Even before last week’s restructuring deal with Microsoft, OpenAI had been forced to look elsewhere for computing power: The firm made a deal with Google in June to supply it with cloud services, and it struck a deal in September with Oracle to buy $300 billion in computing power over about five years. But it’s worth noting that Microsoft’s compute power is still essential for the firm: Last week, OpenAI agreed to purchase $250 billion of Microsoft’s Azure services over time.
While these types of multibillion-dollar deals seem to excite investors in the stock market, not everything is hunky-dory in the world of AI at the moment. OpenAI’s annualized revenue run rate is expected to reach about $20 billion by year’s end, Reuters notes, and the company’s losses are also mounting. Surging valuations of AI companies, oddly circular investments, massive spending commitments (which total more than $1 trillion for OpenAI), and the possibility that generative AI might not be as useful as promised have prompted ongoing speculation among critics and proponents alike that the AI boom is turning into a massive bubble.
Meanwhile, Reuters has reported that OpenAI is laying the groundwork for an initial public offering that could value the company at up to $1 trillion. Whether that prospective $1 trillion valuation makes sense for a company burning through cash faster than it can make it back is another matter entirely.
Benj Edwards is Ars Technica’s Senior AI Reporter and founded the site’s dedicated AI beat in 2022. He’s also a tech historian with almost two decades of experience. In his free time, he writes and records music, collects vintage computers, and enjoys nature. He lives in Raleigh, NC.