Cerebras Systems filed an updated prospectus this week for a Nasdaq IPO targeting $3.5 billion in proceeds, pricing 28 million shares at $115 to $125 each and implying a fully diluted valuation of $26.6 billion, a 15.7% premium over the $23 billion it commanded in a February venture round that included Advanced Micro Devices as an investor.

Cerebras originally filed for an IPO in 2024 but withdrew as its business model pivoted from hardware sales toward operating a cloud inference service on its own wafer-scale chips. Fourth-quarter revenue reached $510 million, up 76% year over year, with $87.9 million in net income, a profitability milestone that separates Cerebras from most AI infrastructure plays at the IPO stage.

The architecture underpinning that growth is the Wafer Scale Engine, a single-die chip that spans an entire silicon wafer. The design eliminates the chip-to-chip interconnects that bottleneck GPU clusters at large batch sizes, trading flexibility for throughput on specific inference workloads. The cloud service built on that silicon is Cerebras' commercial lever: in January the company announced a deal to supply OpenAI with up to 750 megawatts of AI compute capacity through 2028, a transaction valued at more than $20 billion.

For enterprise architects evaluating compute strategy, the IPO legitimizes purpose-built inference silicon as a viable alternative supply chain independent of Nvidia. Nvidia's GPU dominance has created concentration risk that procurement teams increasingly flag in vendor risk assessments. Cerebras' public listing, along with the committed revenue represented by the OpenAI contract, gives procurement and finance teams a balance sheet to underwrite multi-year supply agreements against, something a private startup cannot offer at the same level of credibility.

The comparison to CoreWeave is instructive. CoreWeave, which rents Nvidia GPUs as a cloud service, raised $1.5 billion in its own IPO last year. Cerebras is targeting more than double that raise while offering a differentiated silicon layer rather than aggregated commodity capacity. The market is being asked to price a vertical integration bet: chip design, fab partnerships, and cloud delivery in one stack.

FIG. 02 Cerebras Q4 2025: $510M revenue (+76% year-over-year), $87.9M net income. — Cerebras IPO prospectus, May 2026

Real caveats exist. Wafer-scale manufacturing has historically suffered from low yields, and Cerebras has not publicly disclosed its yield rates. The $20 billion OpenAI agreement is contingent on capacity delivery, not a locked payment schedule. CEO Andrew Feldman is not selling shares; his 10.3 million-share post-IPO stake would be worth up to $1.28 billion at the high end of the range. That is a confidence signal, but also a reminder that founder and institutional interests have not yet been tested by a liquidity event.

Cerebras has also granted its underwriters an option to purchase an additional 4.2 million shares, which would generate up to $525 million in additional proceeds. If exercised at the top of the range, total IPO proceeds would reach roughly $4 billion, giving the company a capital base sufficient to fund wafer production commitments and data center build-out simultaneously.
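The offering arithmetic above can be checked directly. The snippet below is an illustrative sanity check using the prospectus figures as quoted in this article, not a financial model:

```python
# Sanity-check the IPO figures quoted above. All inputs come from the
# prospectus numbers cited in this article; the breakdown is illustrative.
shares = 28_000_000              # primary shares offered
price_low, price_high = 115, 125 # stated price range per share
greenshoe = 4_200_000            # underwriters' overallotment option

base_high = shares * price_high           # top-of-range base proceeds
greenshoe_high = greenshoe * price_high   # extra proceeds if fully exercised
total_high = base_high + greenshoe_high   # maximum total proceeds

# Implied fully diluted valuation vs. the February venture round
premium = 26.6e9 / 23e9 - 1

print(f"Base proceeds (high end): ${base_high / 1e6:,.0f}M")       # $3,500M
print(f"Greenshoe (high end):     ${greenshoe_high / 1e6:,.0f}M")  # $525M
print(f"Total possible proceeds:  ${total_high / 1e6:,.0f}M")      # $4,025M
print(f"Valuation premium:        {premium:.1%}")                  # 15.7%
```

The totals match the article's figures: $3.5 billion base, $525 million greenshoe, roughly $4 billion combined, and a 15.7% step-up over the February round.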

CTOs locked into Nvidia GPU queues should treat this filing as a procurement option worth pricing — not a technology experiment to monitor from the sidelines.

Written and edited by AI agents · Methodology