Cerebras IPO goes 20x oversubscribed — boosting share count and price range ahead of Nasdaq debut
Key Points
- Cerebras upsizes its Nasdaq IPO to $4.8 billion on 20x oversubscription, raising the price range to $151–$160 per share and increasing the share count from 28 million to 30 million.
- The AI chip maker's entire-wafer processor design delivers petabits-per-second internal bandwidth, allowing inference models to return full-page responses in seconds instead of minutes, addressing the industry's inference latency bottleneck.
- Early investor Benchmark retains over 20 percent of Cerebras and could see outsized returns if the stock approaches valuation multiples applied to comparable inference chip makers in China.
Summary
Cerebras IPO Upsized to $4.8B on 20x Oversubscription
Cerebras raised its IPO price range and share count ahead of a Nasdaq debut set for May 14, betting that demand for AI inference chips has reached critical mass. The company increased its offering from 28 million shares at $115–$125 per share to 30 million shares at $151–$160, targeting up to $4.8 billion in gross proceeds at the top of the range instead of the originally planned $3.5 billion. The offering drew demand for more than 20 times the available shares — roughly $100 billion of bids for a roughly $5 billion raise — a rare signal of institutional appetite for semiconductor infrastructure plays.
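The upsizing arithmetic above can be checked directly. This is a back-of-the-envelope sketch using the article's figures, not numbers from a prospectus; the function name `gross_proceeds` is illustrative.

```python
# Back-of-the-envelope check of the offering terms reported above.
# Figures come from the article, not from a filing.

def gross_proceeds(shares: int, price_per_share: float) -> float:
    """Gross IPO proceeds before underwriting discounts."""
    return shares * price_per_share

original = gross_proceeds(28_000_000, 125.0)  # top of the original $115-$125 range
upsized = gross_proceeds(30_000_000, 160.0)   # top of the raised $151-$160 range

print(f"original plan: ${original / 1e9:.1f}B")  # → original plan: $3.5B
print(f"upsized deal: ${upsized / 1e9:.1f}B")    # → upsized deal: $4.8B

# 20x oversubscription on a ~$5B raise implies roughly $100B of bids.
print(f"implied demand: ${20 * upsized / 1e9:.0f}B")  # → implied demand: $96B
```

The ~$96 billion of implied bids matches the article's "roughly $100 billion" characterization.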
The chip maker took a radically different manufacturing approach from the industry standard. Rather than cutting a silicon wafer into hundreds of individual chips, Cerebras used the entire 300-millimeter wafer as a single processor, packing 4 trillion transistors and 900,000 AI-oriented compute cores onto one die. The critical advantage: petabits per second of internal bandwidth, which translates to dramatically faster memory access for the large key-value caches required in transformer-based inference workloads.
Why now matters for demand
The timing of an AI chip IPO in May 2026 collides with a structural shift in the AI industry. Inference costs have become the bottleneck. As AI agents proliferate and models like GPT-5.3 Spark run at scale, speed of response determines user experience and operational cost. Cerebras chips reportedly allow models to deliver full-page responses to queries in seconds rather than minutes or hours — a meaningful compression of latency that every major lab deploying inference at scale would need to evaluate.
The company has moved beyond the customer concentration problem that haunted its earlier growth. Amazon and OpenAI are listed as customers. Every large AI lab — Anthropic, Meta, Google — faces exploding inference demand, and unless they have internal chip solutions ready, they become potential buyers.
Cerebras was founded in 2015 by Andrew Feldman and other industry veterans who previously worked together at SeaMicro, an ultra-dense server company that AMD acquired in 2012. The path to IPO was long: Benchmark led an early round in May 2016, followed by a Series C in 2017, then years of quiet chip development and manufacturing. The company remained largely under the radar for five or six years while tape-outs and production scaled.
Benchmark's exit math
Early-stage timing created outsized returns for lead investor Benchmark, which still owns over 20% of Cerebras. If the stock trades at even half the valuation multiple the Shanghai market applied to inference chip maker Cambricon within two years, Cerebras could cross a $500 billion market cap — a move that would make Benchmark's position one of the largest venture exits in history, assuming the fund holds.
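A minimal sketch of the return math implied above, assuming the article's figures (the 20% stake and $500 billion scenario are the article's claims, not confirmed holdings):

```python
# Illustrative-only exit math for the scenario described above.
stake = 0.20          # article: Benchmark owns "over 20%"
scenario_cap = 500e9  # article's hypothetical two-year market cap

position = stake * scenario_cap
print(f"position value: ${position / 1e9:.0f}B")  # → position value: $100B

# Day-one scenario from the next paragraph: $50B close vs. ~$26B implied
# valuation at the top of the IPO range, i.e. "roughly double".
pop = 50e9 / 26e9
print(f"day-one multiple: {pop:.1f}x")  # → day-one multiple: 1.9x
```

At a $500 billion market cap, even the floor of that stake would be worth roughly $100 billion, which is why the article frames it as potentially one of the largest venture exits in history.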
Polymarket odds project Cerebras to close above $50 billion market cap on day one, roughly double the implied $26 billion valuation at the top of the IPO range.
Every deal, every interview. 5 minutes.
TBPN Digest delivers summaries of the latest fundraises, interviews and tech news from TBPN, every weekday.