
"If OpenAI succeeds in training a state-of-the-art LLM with its own AI chip, it will be a first. China's DeepSeek has already attempted to produce a successor to the DeepSeek-R1 model without Nvidia's hardware, but has so far failed due to inadequate hardware and software tooling. As a rule, the AI industry remains tied to Nvidia, albeit in different ways. Google, for example, is the only company that has established itself as both a hardware and a model builder, thanks to its own Tensor Processing Units."
"It is much more likely that OpenAI's new chip is intended for the daily operation of GPT-5 and later LLMs. In that case, the processor will give the AI player a boost, with growing capacity driving costs ever lower. This should free OpenAI from the status quo, in which it spends billions of dollars to deliver ever-better AI models at minimal margins."
OpenAI is building a proprietary AI processor, developed with Broadcom, scheduled for release in 2026 and intended exclusively for internal use. The processor aims to reduce dependence on external providers such as Nvidia, Google Cloud, and Microsoft, and to lower operating costs by expanding capacity for the daily operation of GPT-5 and later models. It remains uncertain whether the chip will be suitable for training frontier models such as GPT-6 before mass production begins. Previous attempts by other companies to avoid Nvidia hardware have failed due to inadequate hardware and software tooling. OpenAI has finalized the chip's design, arranged production with TSMC, and secured investor support for its development.
Read at Techzine Global