On January 14, 2026, OpenAI announced a major multi-year computing infrastructure agreement with AI chip startup Cerebras Systems, valued at more than $10 billion. The deal will see Cerebras supply up to 750 megawatts of computing power to OpenAI’s growing AI platforms through 2028, marking one of the largest partnerships in the AI hardware space to date.
Purpose of the Agreement
OpenAI said the partnership with Cerebras is designed to expand and diversify its AI computing capabilities, particularly for real-time inference workloads — the stage of AI operations in which trained models process user queries and generate responses. By integrating Cerebras’s specialized wafer-scale AI processors, OpenAI aims to reduce latency and improve performance across services such as ChatGPT and other advanced AI tools.
According to the companies, this move will help OpenAI meet rapidly increasing demand for AI compute power and ensure its infrastructure can scale efficiently as usage grows. The compute capacity from Cerebras will be deployed in phases through 2028 and integrated into OpenAI’s systems alongside its other compute resources.
Why It Matters
- Scale and Scope: The agreement provides a substantial expansion of computing capacity — up to 750 MW deployed over roughly three years, comparable to the output of a small power plant, all of it dedicated to AI workloads.
- Strategic Diversification: OpenAI has historically relied heavily on GPU-based compute built around chips from vendors like Nvidia. Partnering with Cerebras — a specialist in wafer-scale AI chips — reduces its dependence on any single vendor and broadens its hardware ecosystem.
- Faster AI Responses: Cerebras’s architecture is optimized for low-latency inference, which means AI models can respond more quickly and efficiently, especially under heavy load.
- Market Impact: The deal reinforces Cerebras as a significant competitor to traditional chipmakers like Nvidia and AMD and highlights the escalating importance of compute infrastructure in the global AI race.
About Cerebras Systems
Founded in 2015, Cerebras Systems is known for its wafer-scale AI processors, which integrate massive compute and memory resources on a single giant chip. This approach aims to accelerate both training and inference for large AI models. Prior to this deal, Cerebras had been preparing for an initial public offering (IPO) and has raised substantial capital while diversifying its customer base.
Looking Ahead
The OpenAI-Cerebras contract reflects the escalating emphasis on AI infrastructure investments as demand for powerful, efficient computation surges. With compute capacity being a key competitive advantage in AI development, this deal is likely to influence future hardware partnerships, industry valuations, and the broader technology ecosystem.