AI chipmaker Cerebras raises $1 billion prior to long-awaited IPO

"After production by TSMC, Cerebras' chips are packaged in America. This second process is particularly complex in the case of this AI company due to the large size of the processors. Unlike chips from Nvidia, AMD, or Intel, these are "wafer-scale" products, i.e., chips that are as large as the discs upon which the chip designs are drawn. Such a design is only possible due to leaving large wiggle room for imperfections, but it allows for enormous bandwidth and on-die memory, critical for AI performance."
""We have increased production capacity eight times in the past 18 months and will increase it four more times in the next six to eight months," says CEO Andrew Feldman. Cerebras' revenue grew from less than $6 million in the second quarter of 2023 to approximately $70 million in the same period of 2024. This year, the company announced customers such as Hugging Face, Meta, Notion, and Perplexity."
"The new funding round values Cerebras at $8.1 billion. This is more than double the $4 billion the company was worth in 2021. Feldman explains to CNBC that the company still plans to go public. "I don't think this indicates a preference for one or the other. We have tremendous opportunities ahead of us, and it's good not to let them slip away due to a lack of capital.""
Cerebras has raised $1.1 billion from multiple investors and delayed its expected IPO while maintaining plans to go public. The funding values the company at $8.1 billion, roughly double its 2021 valuation of $4 billion. The proceeds will expand U.S. production of its wafer-scale AI processors, a complex packaging step that follows fabrication at TSMC. Cerebras increased production capacity eightfold over the past 18 months and plans a further fourfold increase. Revenue rose from under $6 million in Q2 2023 to about $70 million in Q2 2024, and customers include Hugging Face, Meta, Notion, and Perplexity. The company competes with Nvidia in AI training, inference, and cloud LLM services.
Read at Techzine Global