Amazon Web Services (AWS) and OpenAI have announced a multi-year, $38 billion strategic partnership to accelerate the development and deployment of advanced artificial intelligence (AI). Under the deal, OpenAI will immediately begin running its core generative AI workloads, such as ChatGPT and next-generation models, on AWS's world-class cloud infrastructure.
This marks a major expansion in AI-cloud collaboration, positioning AWS as the primary compute backbone for OpenAI’s large-scale models and future agentic AI systems.
## Key Highlights
- **$38 billion partnership:** a seven-year commitment to provide AWS compute resources for OpenAI.
- **Massive scale:** OpenAI gains access to Amazon EC2 UltraServers powered by hundreds of thousands of NVIDIA GPUs, with the ability to scale to tens of millions of CPUs.
- **Immediate implementation:** OpenAI starts using AWS compute right away to handle high-performance workloads.
- **Performance and security:** AWS infrastructure provides price efficiency, reliability, and secure scaling for AI model training and inference.
- **Ongoing collaboration:** builds on existing cooperation between the two companies through Amazon Bedrock.
## Powering the Future of AI
With this partnership, OpenAI will leverage AWS’s optimized EC2 UltraServers designed for maximum AI processing efficiency. These servers connect clusters of NVIDIA GB200 and GB300 GPUs on the same network, delivering low-latency and high-performance computing across workloads—from serving ChatGPT inference to training next-generation models.
The infrastructure will expand rapidly through 2026, with capacity to scale even further into 2027 and beyond. The goal: to give OpenAI the massive, reliable compute power needed to develop increasingly capable frontier models.
“Scaling frontier AI requires massive, reliable compute,” said Sam Altman, Co-founder and CEO of OpenAI. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
“As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” said Matt Garman, CEO of AWS.
## Expanding AI Access Through Amazon Bedrock
Earlier this year, OpenAI’s open weight foundation models became available on Amazon Bedrock, AWS’s managed service for foundation models. These models are already being used by major organizations—including Comscore, Peloton, Thomson Reuters, and Triomics—for agentic workflows, scientific research, and mathematical problem-solving.
This latest partnership cements OpenAI’s role as one of the most widely used model providers within Amazon Bedrock, bringing generative AI capabilities to millions of AWS developers globally.
## Learn More
Developers and enterprises can explore OpenAI’s open weight models on Amazon Bedrock here: https://aws.amazon.com/bedrock/openai
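As a minimal sketch of how a developer might call one of these open weight models from AWS, the snippet below uses boto3's Bedrock Runtime Converse API. The model ID, inference parameters, and helper names here are illustrative assumptions, not details from the announcement; check the Bedrock model catalog in your region for the exact identifier.

```python
"""Sketch: invoking an OpenAI open weight model via Amazon Bedrock (boto3)."""

# Assumed model ID -- verify the exact identifier in the Bedrock console.
MODEL_ID = "openai.gpt-oss-120b-1:0"


def build_request(prompt: str) -> dict:
    """Build a Converse API request body for a single user turn."""
    return {
        "modelId": MODEL_ID,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 256, "temperature": 0.2},
    }


def ask(prompt: str) -> str:
    """Send the prompt to Bedrock and return the model's text reply."""
    # Imported lazily so build_request() works without the AWS SDK installed.
    import boto3

    client = boto3.client("bedrock-runtime")  # uses your AWS credentials/region
    response = client.converse(**build_request(prompt))
    return response["output"]["message"]["content"][0]["text"]


# Example usage (requires AWS credentials and Bedrock model access):
#   reply = ask("Summarize the AWS-OpenAI partnership in one sentence.")
```

The request body follows the Converse API shape, which works across Bedrock model providers, so swapping in a different model is a one-line change to `MODEL_ID`.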