AI Markets · 7 min read

Cerebras targets $4 billion IPO as Nvidia’s AI chip monopoly fragments

The specialist chipmaker's $40 billion valuation bet crystallises the market shift from GPU dominance to purpose-built architectures—accelerated by China's DeepSeek V4 breakthrough on Huawei silicon.

Cerebras Systems is targeting a $4 billion raise at a $40 billion valuation in its upcoming initial public offering, according to Bloomberg, marking the highest-profile test yet of whether investors will pay premium multiples for challengers to Nvidia’s AI accelerator dominance.

The Sunnyvale-based company recorded $510 million in sales for 2025, according to its S-1 filing as cited by CNBC, alongside $24.6 billion in remaining performance obligations, effectively a backlog signalling multi-year deployment commitments. The largest of those: a $10 billion agreement to provide 750 megawatts of computing power to OpenAI through 2028, MarketWise reported. That single contract underwrites Cerebras’ claim that its wafer-scale engine architecture, a 46,225 mm² chip with 4 trillion transistors delivering 125 petaflops, can displace Nvidia GPUs in latency-sensitive inference workloads.

Cerebras IPO Snapshot
Target valuation        $40 billion
2025 revenue            $510 million
Remaining obligations   $24.6 billion
OpenAI deal value       $10 billion

The erosion of GPU hegemony

Cerebras CEO Andrew Feldman framed the OpenAI win as a direct displacement. “Obviously, [Nvidia] didn’t want to lose the fast inference business at OpenAI, and we took that from them,” he told the Wall Street Journal, as TechCrunch noted. The quote captures the competitive bifurcation emerging in AI compute: training, where Nvidia’s H100 and B200 GPUs remain entrenched, versus inference, where latency and energy efficiency favour architectures that eliminate GPU memory bottlenecks.

Nvidia’s market capitalisation stands at $4.86 trillion as of May 2026, CompaniesMarketCap data shows, with AMD at $587.82 billion. Cerebras’ $40 billion target valuation implies investors are pricing in at least three more hyperscale contracts of OpenAI magnitude to justify the multiple. “The AI chip market is large enough to support multiple winners, but Cerebras needs to prove that wafer-scale economics work at hyperscaler volumes,” Ben Bajarin, CEO of Creative Strategies, told Tech Insider. “The $10 billion OpenAI deal is proof of concept. Now they need three or four more deals like it to justify a $40 billion valuation.”

“The AI chip market is large enough to support multiple winners, but Cerebras needs to prove that wafer-scale economics work at hyperscaler volumes.”

— Ben Bajarin, CEO, Creative Strategies

Geopolitical acceleration: DeepSeek’s Huawei breakthrough

Eight days before Cerebras filed its IPO update, China’s DeepSeek released V4, a 1.6-trillion-parameter flagship model trained entirely on Huawei Ascend 950 chips and Cambricon accelerators, according to Codersera analysis. Unlike the earlier R1 model, which relied on Nvidia GPUs, V4 demonstrates that frontier-class performance no longer requires access to US-controlled semiconductor supply chains. Wei Sun, principal analyst at Counterpoint Research, called the achievement significant: “V4’s ability to run natively on local chips could have massive implications, helping Beijing achieve more AI sovereignty and further reduce reliance on Nvidia,” she told CNBC.

Nvidia CEO Jensen Huang acknowledged the strategic risk in an April interview. “The best AI researchers in the world, because they are limited in compute, also come up with extremely smart algorithms,” he said, per Fortune. “The day that DeepSeek comes out on Huawei first, that is a horrible outcome for [the U.S.].” DeepSeek has indicated it expects to reduce V4-Pro pricing once Huawei scales production of Ascend 950 chips in the second half of 2026—a timeline that overlaps with Cerebras’ own capacity ramp and investor roadshow.

AI chip market capitalisation leaders
Company             Market cap (May 2026)   Primary segment
Nvidia              $4.86 trillion          Training + inference GPUs
AMD                 $587.82 billion         Data center accelerators
Cerebras (target)   $40 billion             Wafer-scale inference

Infrastructure demand outpacing supply

The capital flowing into AI accelerators reflects bottlenecks further downstream. DTE Energy disclosed an 8.4 gigawatt data center pipeline requiring $30 billion in infrastructure investment over five years, Utility Dive reported in October 2025, illustrating that power constraints now rival chip supply as a limiting factor for model training. AI chip startups raised $8.3 billion globally through mid-April 2026, according to Dealroom data cited by CNBC, a funding environment that favours companies with demonstrated hyperscale traction.

Cerebras expects to recognise 15% of its $24.6 billion obligation backlog in 2026 and 2027 combined, implying $3.69 billion in near-term revenue if contractual milestones are met. That compares to the $510 million booked in 2025, suggesting either aggressive capacity expansion or staged deployment schedules. The company’s wafer-scale engine contains 19× more transistors and delivers 28× more compute than Nvidia’s B200, per Cerebras’ own specifications, but the architecture requires customers to redesign software stacks—a switching cost that favours incumbents in shorter sales cycles.
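The figures above can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, using the article’s own numbers plus one outside assumption: the roughly 208 billion transistor count for Nvidia’s B200, which comes from public Nvidia materials rather than the filing.

```python
# Backlog arithmetic: 15% of the $24.6B remaining performance obligations
backlog_usd = 24.6e9          # remaining performance obligations (from S-1)
near_term_share = 0.15        # share Cerebras expects to recognise in 2026-27
near_term_revenue = backlog_usd * near_term_share
print(f"Implied 2026-27 revenue: ${near_term_revenue / 1e9:.2f}B")  # $3.69B

# Transistor-count ratio behind the "19x" claim
wse_transistors = 4e12        # wafer-scale engine, per Cerebras' specifications
b200_transistors = 208e9      # assumed B200 figure, not from the article
print(f"Transistor ratio: {wse_transistors / b200_transistors:.0f}x")  # ~19x
```

Both published figures are internally consistent: $3.69 billion is exactly 15% of the backlog, and the transistor ratio rounds to the 19× Cerebras cites.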

Key market dynamics
  • Inference workloads rising from 50% of AI compute in 2025 to projected 80% by 2027, favouring specialised architectures over general-purpose GPUs
  • China’s DeepSeek V4 trained on Huawei chips demonstrates viable alternative to Nvidia ecosystem for frontier models
  • Hyperscaler diversification reducing single-vendor concentration risk but fragmenting software tooling and talent pools
  • Power infrastructure (8.4 GW pipeline) emerging as binding constraint alongside semiconductor capacity

What to watch

Cerebras’ IPO pricing, expected within weeks, will test whether public market investors value architectural differentiation at the same premium as private backers. The company faces customer concentration risk—OpenAI represents the bulk of disclosed obligations—and must demonstrate replicability across hyperscalers. Meanwhile, DeepSeek’s timeline for cost reductions on Huawei silicon in H2 2026 coincides with Cerebras’ own scaling phase, potentially compressing margins if Chinese-trained models set new cost-performance benchmarks.

Track whether Cerebras announces a second anchor customer before roadshow completion; absence would signal harder enterprise sales than the OpenAI deal implies. The stock’s first-day performance will indicate whether AI infrastructure investors are rotating toward diversification plays or maintaining conviction in Nvidia’s platform dominance despite fragmenting use cases.