
OpenAI signs $10B AI computing deal with Cerebras

A major shift in AI infrastructure as OpenAI secures large-scale computing capacity from Cerebras. February 3, 2023. (Reuters Photo)
January 15, 2026 03:57 PM GMT+03:00

OpenAI, the developer behind ChatGPT and other artificial intelligence (AI) models, has struck a multiyear computing deal with Cerebras Systems, an emerging alternative to Nvidia.

The agreement, valued at around $10 billion, will provide up to 750 megawatts (MW) of high-performance computing capacity between 2026 and 2028, aiming to shorten AI response times and strengthen the company’s infrastructure.

Delivering 750 MW of high-performance computing for AI models

The AI developer will integrate Cerebras’ systems to support large language models (LLMs) and other AI applications that demand significant computing power.

The deal covers a phased deployment of up to 750 MW, equivalent to the energy output of a small nuclear plant. This capacity will enable LLMs to operate faster and more efficiently, improving performance across ChatGPT and other AI services.

OpenAI partners with Cerebras in a $10 billion deal to boost AI computing power beyond Nvidia. June 11, 2023. (Adobe Stock Photo)

Faster, more efficient AI chips beyond Nvidia GPUs

Unlike traditional graphics processing units (GPUs), Cerebras’ Wafer-Scale Engine (WSE) technology turns an entire silicon wafer into a single massive chip.

This approach reduces bottlenecks between memory and computation, allowing AI models to generate real-time responses (inference) more quickly. The move also reduces reliance on Nvidia hardware while increasing infrastructure diversity and energy efficiency.

Integrating Cerebras’ low-latency systems is expected to significantly shorten response times for user queries.

The upgrade is expected to enhance natural language processing (NLP), visual content generation, code creation, and AI agents. Faster inference will improve the experience for ChatGPT users and other AI services.

Deployment is scheduled to be completed by 2028, strengthening both the resilience and efficiency of the company’s AI infrastructure.

Driving competition and promoting hardware diversity

The OpenAI-Cerebras partnership is expected to increase competition in the AI computing market while promoting alternatives to GPU-centric solutions.

For Cerebras, the deal provides a substantial revenue boost and a high-profile reference ahead of a potential initial public offering (IPO). The collaboration also encourages other AI infrastructure providers to innovate, enhancing technological diversity and competition across the industry.

OpenAI has historically relied heavily on Nvidia GPUs. This agreement reflects a broader strategy to reduce computing costs and expand infrastructure options through custom chip solutions.

Alongside projects with Broadcom and Advanced Micro Devices (AMD), the partnership is intended to keep AI models running reliably while minimizing supply chain risks.
