OpenAI and Microsoft have rewritten their partnership agreement, ending Microsoft's exclusive claim on OpenAI's API distribution, eliminating the AGI clause that gave Microsoft broad IP rights tied to a technology threshold, and resetting the financial terms governing cloud revenue.

OpenAI's plan to sell AI products through Amazon Web Services triggered the renegotiation. Under the prior contract, that move risked a breach of Azure exclusivity terms. Sam Altman and Satya Nadella negotiated the revised deal personally over several weeks. The result: OpenAI is now free to distribute its models and products through any cloud provider. Azure retains preferred-partner status — OpenAI products will still launch there first — but the exclusivity lock is gone.

The AGI clause removal is the most significant structural change. The original agreement gave Microsoft rights to OpenAI's IP until OpenAI self-declared it had achieved artificial general intelligence. That provision created a perverse incentive, since OpenAI could cut off Microsoft's license simply by declaring AGI, and it tied the license's tenure to a milestone nobody has rigorously defined. Under the new terms, Microsoft holds a non-exclusive license to OpenAI's models and products through 2032, with no AGI contingency attached.

The financial terms have also been restructured. Microsoft previously paid OpenAI a 20-percent revenue share on every dollar of OpenAI-model revenue it earned through Azure. That payment is eliminated. OpenAI continues to pay Microsoft a revenue share at the same percentage as before, but that obligation is now capped in total and expires by 2030. Microsoft's upside is now primarily as an equity shareholder: it profits from OpenAI's overall growth rather than from API distribution margin. The two companies also announced plans to collaborate on data centers, chips, and AI applied to cybersecurity.
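The direction of these flows can be sketched numerically. The 20-percent rate comes from the reporting above; the revenue amounts and the cap figure in the example are invented placeholders, since the actual cap is not public.

```python
# Sketch of the before/after revenue flows described in the article.
# RATE is the 20% figure from the reporting; all dollar amounts and the
# cap value used below are hypothetical placeholders.
RATE = 0.20

def old_flow_to_openai(azure_openai_revenue: float) -> float:
    """Old terms: Microsoft paid OpenAI 20% of its Azure OpenAI-model revenue."""
    return RATE * azure_openai_revenue

def new_flow_to_openai(azure_openai_revenue: float) -> float:
    """New terms: that payment from Microsoft is eliminated."""
    return 0.0

def openai_payment_to_microsoft(openai_revenue: float,
                                cap_remaining: float) -> float:
    """OpenAI still pays the same rate, but the total is capped (and the
    obligation expires by 2030)."""
    return min(RATE * openai_revenue, cap_remaining)
```

On a hypothetical $100M of Azure-distributed OpenAI-model revenue, Microsoft's old $20M payment to OpenAI disappears, while OpenAI's own payment continues only until the (undisclosed) cap is exhausted.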

FIG. 02 Three structural pillars of the rewritten OpenAI–Microsoft agreement: cloud exclusivity, the AGI IP clause, and financial flow direction. — The Decoder / ai|expert diagram

For enterprise architects, the practical implication: multi-cloud access to frontier OpenAI models is now contractually settled, not provisional. Procurement teams that deferred AWS or Google Cloud deployments of GPT-class models due to Azure-exclusivity risk can reopen those evaluations. Vendors building on OpenAI's API can also negotiate with multiple hyperscalers for infrastructure pricing and SLA terms rather than accepting Azure as a default.
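As a rough illustration of what reopening those evaluations looks like in practice, the sketch below routes one OpenAI-compatible chat client across providers by swapping the base URL, a common multi-cloud pattern. Every endpoint URL here is an invented placeholder; real Azure, AWS, or Google Cloud endpoints, model names, and auth schemes differ per provider.

```python
# Minimal provider-routing shim for OpenAI-compatible chat endpoints.
# All base URLs are fictional placeholders, not real deployment values.
PROVIDERS = {
    "azure": "https://example-azure.example.com/v1",
    "aws": "https://example-aws.example.com/v1",
    "gcp": "https://example-gcp.example.com/v1",
}

def chat_url(provider: str) -> str:
    """Build the chat-completions URL for the chosen provider, so the
    same request payload can target whichever cloud wins on price/SLA."""
    try:
        base = PROVIDERS[provider]
    except KeyError:
        raise ValueError(f"unknown provider: {provider}")
    return f"{base}/chat/completions"
```

The design point is that the provider choice collapses to a configuration value: once exclusivity is off the table, procurement can treat the endpoint as swappable and negotiate infrastructure terms per cloud.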

FIG. 03 Multi-cloud topology post-deal: Azure keeps preferred and first-launch status while AWS and other providers are now contractually accessible. — ai|expert diagram

The revenue shift matters for infrastructure planning. With Microsoft no longer receiving a 20-percent cut from Azure-distributed OpenAI API calls, it has less financial incentive to route enterprise OpenAI workloads through Azure as an end in itself. Microsoft's monetization now scales with OpenAI's valuation, not API volume — which makes the equity stake, not resale margin, the strategic asset. Enterprises should expect Microsoft to compete on Azure's own capabilities — latency, compliance footprint, toolchain integration — rather than relying on exclusivity to close AI deals.

The open question is whether the 2032 non-exclusive license horizon creates a renegotiation risk enterprises should price into long-term architecture decisions. Six years is short by enterprise infrastructure standards. If OpenAI's model capabilities or pricing change materially by 2030–2032, procurement teams locked into multi-year agreements built around current API economics could face renegotiation pressure on both the OpenAI and cloud-provider sides simultaneously.

The era in which "running OpenAI in production" effectively meant "running on Azure" is over. The new deal institutionalizes frontier model access as a multi-cloud commodity — and forces Microsoft to win enterprise AI workloads on merit.

Written and edited by AI agents