OpenAI has entered its contract era. The company behind ChatGPT has stitched together chip supply at an unprecedented scale, widened its cloud footprint, and taught a chatbot how to close a sale. The through-line isn’t mystique but mechanics: lock down what’s scarce, rent what’s flexible, and turn an audience into a checkout line.
The scope finally matches the talk. A letter of intent with Nvidia puts at least 10 gigawatts of systems on OpenAI’s roadmap, with up to $100 billion of Nvidia investment tied to the rollout. AMD arrives as a true second source — with a warrant that could hand OpenAI up to 10% of AMD if milestones are hit — while Broadcom signs on to build OpenAI’s first in-house processor. Add Google Cloud as a supplier, and the single-vendor optics fade.
The other lane for OpenAI is revenue. ChatGPT’s “Instant Checkout” started with Etsy, with Shopify next in line, and is now part of a partnership with Walmart. Customers can complete a purchase directly inside the ChatGPT interface. Stripe provides the payment infrastructure, and OpenAI earns a transaction fee on each sale. That moves ChatGPT from demo to storefront and, if the conversion math holds, turns intent into income without sending users back to a browser.
Money is following the build. As of June, OpenAI’s annualized revenue run rate had hit about $10 billion, nearly double December’s pace. Last year’s losses were heavy; the bet is that unit costs fall as capacity lands and that commerce and enterprise channels widen the margin. OpenAI is now the world’s most valuable startup after a secondary stock sale pegged its valuation at $500 billion.
OpenAI spent 2024 laying the groundwork for this year’s sprint. Apple announced opt-in ChatGPT access across iOS, iPadOS, and macOS as part of Apple Intelligence; Microsoft relinquished its OpenAI board-observer seat under regulatory scrutiny; Oracle, Microsoft, and OpenAI said Azure AI capacity would be extended over Oracle Cloud Infrastructure; PwC became OpenAI’s first ChatGPT Enterprise reseller and bought around 100,000 seats; and OpenAI signed licensing deals with the Financial Times, News Corp, and Reddit to allow attributed answers and data access.
The question right now isn’t whether OpenAI can sign deals. It can. And it will continue to do so. The question is whether these contracts buy real leverage — on delivery windows, on price curves, on distribution — fast enough to make the economics sing.
Nvidia: The 10-gigawatt anchor
The Nvidia pact carries perhaps the loudest number in AI: a letter of intent to deploy at least 10 gigawatts of Nvidia systems for OpenAI’s next-gen infrastructure, with Nvidia intending to invest up to $100 billion as capacity comes online. The structure pairs supply with vendor financing, which buys priority on delivery and line of sight into Nvidia’s roadmap. It also resets negotiating dynamics across the rest of OpenAI’s shopping list — because if you can commit at this scale, you can demand better pricing and timing everywhere else. First deployments are slated to begin in 2026 on Nvidia’s Vera Rubin platform.
AMD: The second source with upside attached
OpenAI’s deal with AMD does two jobs at once: It secures a multiyear supply of AMD’s coming MI-series GPUs and gives OpenAI a warrant for up to 160 million AMD shares — roughly 10% of the company — that vests as performance milestones tied to capacity and purchases are hit. That alignment makes AMD a profit-linked partner whose success lowers OpenAI’s cost of compute. Practically, it means a parallel lane for training and inference, plus credible leverage in talks with Nvidia. The first 1-gigawatt deployment is scheduled to start in the second half of 2026, with a path to 6 gigawatts over the agreement.
Broadcom: Custom silicon as cost control
OpenAI just tapped Broadcom to co-design its first in-house AI processor. Development and deployment will start in the second half of 2026, with plans to scale later into the decade. Custom silicon won’t replace GPUs overnight, but it is a lever on cost per token and a hedge against a single vendor’s supply cadence. Even modest gains at inference scale ricochet through every product that sits on top of the stack. The Broadcom partnership turns an abstract aspiration — “own the cost curve” — into a dated, resourced program.
Google Cloud: The multicloud that actually routes work
OpenAI’s quiet addition of Google Cloud this summer marked the end of the company’s single-tenant life on Microsoft’s servers. The deal is a practical fix for surging capacity needs that also doubles as leverage: When Azure fills up, Google’s infrastructure keeps the lights on, and when contracts come due, competition does the negotiating. By midsummer, Google’s name appeared on OpenAI’s partner list, confirming what insiders already knew: Multicloud isn’t a talking point anymore; it’s the new baseline for scale.
Walmart: A prompt that becomes a shopping cart
The recent Walmart tie-in moves chat-commerce from pilot to mainstream. Shoppers — including Sam’s Club members — will soon be able to search, choose, and check out inside ChatGPT via Instant Checkout. For Walmart, this is a new front door upstream of search. For OpenAI, it ties monetization directly to intent and opens a path to habit formation that doesn’t depend on pushing people to a browser. If the partnership works as intended, the default flow of e-commerce may end up running through a chat window, not a browser tab.
Etsy and Shopify: Where chat meets checkout
OpenAI’s Instant Checkout now lives in the wild: U.S.-based ChatGPT users can tap “Buy” inside a chat for U.S. Etsy listings, while Shopify’s million-plus merchants stand in line to follow. At launch, only single-item purchases are supported; multi-item carts and broader merchant onboarding are next. The move tests the AI’s ability to survive product mess: Every variant, missing image, and odd title now meets the checkout rail. Shopify brings scale; once that faucet opens, the flood could undermine every other funnel in retail. Merchants will pay a small commission per sale; consumers won’t see extra fees. Shopify calls this “bring[ing] commerce to ChatGPT” with “no links or redirects, just seamless commerce.” And OpenAI gets to teach millions that the shortest line from intention to purchase lives in the conversation they were already having.