Following up on May 9's "AI Chipmaker Cerebras Is Said to Plan Raising IPO Price Range".
From Techi:
Cerebras is no longer a vague "maybe IPO" story. As of May 6, 2026, the AI-chip company has a live roadshow, a fresh SEC filing, a proposed Nasdaq ticker and a real price range for public investors to evaluate.
The clean headline is this: Cerebras filed an amended S-1 on May 4, 2026 to sell 28 million Class A shares at an expected range of $115 to $125 per share, and the company says it has applied to list on the Nasdaq Global Select Market under the ticker CBRS. At the top of the range, the deal would raise about $3.5 billion before expenses and put Cerebras near a $26.6 billion market value, according to Reuters' May 4 report and TechCrunch's read of the same filing.
That makes Cerebras one of the most important AI infrastructure listings of 2026, but it also makes the due diligence harder. The company has real technology, real customers and real revenue growth. It also has customer concentration, unusual financing tied to OpenAI, export-control exposure and a profit line that looks better on the surface than the operating business underneath.
For TECHi readers already tracking the AI silicon race through AMD vs. Nvidia and the new AMD data-center earnings cycle, Cerebras is the public-market test of a different question: can a wafer-scale AI system become a serious investable platform, or is this IPO being priced like a perfect outcome before the proof is fully in?
The May 2026 IPO terms
Cerebras' May 4 preliminary prospectus says the company is offering 28 million Class A shares and expects the IPO price to land between $115 and $125 per share. The company also plans to grant underwriters a 30-day option to buy up to 4.2 million additional shares, a standard over-allotment structure that could increase the total number of shares sold if demand is strong.
The midpoint matters because it gives investors a cleaner way to size the deal. Cerebras estimates net proceeds of about $3.24 billion at a $120 midpoint after underwriting discounts, commissions and offering expenses, or about $3.73 billion if the underwriters exercise the option in full. The company says it plans to use the proceeds for general corporate purposes, including working capital, operating expenses and capital expenditures, with about $230 million earmarked for tax withholding and remittance obligations tied to RSU settlement.
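The deal sizing above can be sanity-checked with simple arithmetic. A quick sketch, with the caveat that the all-in fee rate is inferred from the company's own net-proceeds estimate rather than disclosed directly:

```python
# Back-of-the-envelope check of the Cerebras deal math from the May 4 filing,
# using the figures as reported above.
BASE_SHARES = 28_000_000   # Class A shares offered
GREENSHOE = 4_200_000      # 30-day over-allotment option
MIDPOINT = 120.0           # midpoint of the $115-$125 range
TOP = 125.0                # top of the range

gross_base = BASE_SHARES * MIDPOINT       # gross proceeds at the midpoint
net_base = 3.24e9                         # company's net-proceeds estimate
fee_rate = 1 - net_base / gross_base      # implied fees/expenses (~3.6%)

# Full greenshoe exercise at the same implied cost ratio
gross_full = (BASE_SHARES + GREENSHOE) * MIDPOINT
net_full = gross_full * (1 - fee_rate)    # lands near the $3.73B cited

print(f"gross at midpoint: ${gross_base / 1e9:.2f}B")
print(f"implied fees/expenses: {fee_rate:.1%}")
print(f"net with full greenshoe: ${net_full / 1e9:.2f}B")
print(f"gross at top of range: ${BASE_SHARES * TOP / 1e9:.2f}B")
```

The numbers tie out: 28 million shares at the $120 midpoint grosses $3.36 billion, the $3.24 billion net estimate implies roughly 3.6% in total deal costs, and applying that same ratio to a fully exercised greenshoe reproduces the $3.73 billion figure, while 28 million shares at $125 gives the $3.5 billion top-of-range gross.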
The banking group is large enough to signal a serious institutional book. Cerebras named Morgan Stanley, Citigroup, Barclays and UBS Investment Bank as lead book-running managers, with Mizuho and TD Cowen as bookrunners and a longer list of co-managers. That does not make the deal low risk, but it does mean this is not a quiet niche listing.
The registration statement is still preliminary. Cerebras' own IPO release says the registration statement has been filed but is not yet effective, and the SEC's investor bulletin on IPOs reminds investors that SEC effectiveness is not an endorsement of the company or the investment merits. Until the offering is declared effective and priced, CBRS is not a normal public stock that retail investors can buy on the open market.
Why this is Cerebras' second attempt
Cerebras first filed to go public in 2024, but the IPO became tied up in national-security scrutiny around Abu Dhabi-based G42. In the original 2024 S-1, Cerebras disclosed that G42 represented $119.1 million of revenue in the first half of 2024, or 87% of total revenue for that period. That concentration made the IPO story about more than chips; it made it about customer dependency, export controls and geopolitical risk.
TechCrunch reported in April 2026 that Cerebras' earlier IPO plan was delayed by a federal review of the Abu Dhabi-based G42 investment and was ultimately withdrawn. The new May 2026 filing does not erase that history, but it changes the center of gravity by putting OpenAI and AWS at the front of the growth story.
The updated prospectus still shows that concentration risk is not gone. Cerebras says G42 accounted for 24% of 2025 revenue and 85% of 2024 revenue, while Mohamed bin Zayed University of Artificial Intelligence accounted for 62% of 2025 revenue. It also says OpenAI, G42, MBZUAI and AWS are significant customers or expected future customers, and that negative changes in their demand could harm the business.
That is the first key point for investors: Cerebras has improved the story, but it has not turned into a diversified chip company overnight.
What Cerebras actually sells
Cerebras is not trying to build a slightly cheaper GPU. Its pitch is architectural. The company builds wafer-scale AI systems around the Wafer Scale Engine, a processor that keeps far more compute, memory and bandwidth on one very large piece of silicon instead of distributing work across many smaller chips.
The current flagship is WSE-3. Cerebras says WSE-3 has 4 trillion transistors, 900,000 AI-optimized cores, 44GB of on-chip SRAM and 125 petaflops of peak AI performance, and that the 5nm chip powers the CS-3 AI supercomputer. The same company release says CS-3 systems can be clustered up to 2,048 nodes and train models up to 24 trillion parameters.
That technical pitch matters because the AI market is splitting into workloads. Training frontier models is one market. Low-latency inference is another. Agentic coding, long reasoning, search and enterprise assistants often depend less on one benchmark score and more on how quickly a system can produce useful tokens back to the user. OpenAI makes that point directly in its Cerebras partnership announcement, saying low-latency compute can make AI responses faster and more natural across code, agents and other workloads.
Cerebras is therefore not just selling "anti-Nvidia" sentiment. It is selling a specialized system for customers that need speed, latency and token throughput. That is why the stock, if the IPO prices, should be analyzed beside Nvidia's broader AI platform rather than as a simple one-for-one replacement.
OpenAI is the center of the story
The biggest reason the 2026 IPO looks different from the 2024 attempt is OpenAI. In January, OpenAI said it was partnering with Cerebras to add 750MW of ultra-low-latency AI compute to its platform, with capacity coming online in multiple tranches through 2028. Cerebras' prospectus goes further, describing a multi-year OpenAI deal valued at more than $20 billion and saying the companies agreed to co-design future models for future Cerebras hardware.
That is powerful validation, but it also creates real dependency. The May S-1 says the Master Relationship Agreement with OpenAI represents a substantial portion of projected revenue over the next several years. It also says Cerebras must deliver capacity tranches across specified numbers of data centers and minimum capacity thresholds, and that OpenAI can terminate part or all of the agreement if Cerebras misses certain delivery or service obligations.
The financing around OpenAI is just as important. Cerebras disclosed a $1.0 billion working-capital loan from OpenAI that is tied to accelerating services, technology, manufacturing and buildout under the MRA. The filing also describes an OpenAI warrant covering 33,445,026 shares of Class N common stock at a $0.00001 exercise price, subject to vesting conditions.
That setup gives public investors both upside and risk. If Cerebras delivers, OpenAI could be the anchor customer that turns wafer-scale inference into a large cloud business. If Cerebras misses delivery milestones, the same agreement can become a pressure point.
AWS gives Cerebras a second hyperscaler channel....
....MUCH MORE
As always, our interest is in the disclosures found in the filings rather than in the share flotation.
As the old-timers used to say: The banks give you too much of the cold ones and not enough of the hot ones to make it pay to play the game.
Speaking of disclosures, there's a new amendment to the S-1 this morning: