Google assembles four-partner chip supply chain with Broadcom, MediaTek, Marvell to challenge Nvidia in inference
Briefly

"Google's Ironwood TPU, the seventh generation designed specifically for inference, delivers ten times the peak performance of the TPU v5p, featuring 192 gigabytes of HBM3E memory per chip and 7.2 terabytes per second of bandwidth."
"The custom chip supply chain includes four design partners, with Broadcom focusing on training, MediaTek on cost-effective inference, and Marvell potentially adding memory processing capabilities."
"Google's strategy aims to produce millions of Ironwood TPUs this year, with commitments from companies like Anthropic for up to one million units, indicating strong market demand."
Google is developing a custom chip supply chain with four design partners: Broadcom, MediaTek, Marvell, and Intel. The roadmap includes the Ironwood TPU, designed for inference, and future TPU v8 chips built on TSMC's 2nm process by late 2027. Broadcom's 'Sunfish' will focus on training, while MediaTek's 'Zebrafish' targets lower-cost inference. Marvell is in discussions to add memory processing capabilities to the lineup. This strategy positions Google as a significant competitor to Nvidia in AI inference, emphasizing performance and scalability.
Read at TNW | Artificial-Intelligence