
"Groq has now raised over $3 billion to date, PitchBook estimates. Groq has been a hot commodity because it is working on breaking the chokehold that AI chip maker Nvidia has over the tech industry. Groq's chips are not GPUs, the graphics processing units that typically power AI systems. Instead, Groq calls them LPUs, (language processing units) and calls its hardware an inference engine - specialized computers optimized for running AI models quickly and efficiently."
"Its products are geared toward developers and enterprises, available as either a cloud service or an on-premises hardware cluster. The on-prem hardware is a server rack outfitted with a stack of its integrated hardware/software nodes. Both the cloud and on-prem hardware run open versions of popular models, like those from Meta, DeepSeek, Qwen, Mistral, Google and OpenAI. Groq says its offerings maintain, or in some cases improve, AI performance at significantly less cost than alternatives."
Groq raised $750 million at a $6.9 billion post-money valuation, surpassing the figures reported in earlier rumors. The company previously raised $640 million at a $2.8 billion valuation in August 2024 and has now raised over $3 billion to date, PitchBook estimates. Groq develops LPUs (language processing units) and an inference engine optimized for running AI models quickly and efficiently. Its products are offered as a cloud service or as on-premises hardware clusters and run open versions of models from Meta, DeepSeek, Qwen, Mistral, Google, and OpenAI. The company claims comparable or improved AI performance at significantly lower cost. Founder Jonathan Ross previously worked on Google's TPU, and Groq says its platform serves more than two million developers.
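To make the developer-facing cloud offering concrete, below is a minimal sketch of how a developer might query one of the hosted open models over GroqCloud's OpenAI-compatible HTTP API. The endpoint path, model ID, and environment variable name are assumptions for illustration, not details taken from the article.

import os
import requests

# Minimal sketch: query a hosted open model via GroqCloud's
# OpenAI-compatible chat completions endpoint.
# The endpoint path, model ID, and env var are assumptions for
# illustration; consult Groq's documentation for current values.
API_URL = "https://api.groq.com/openai/v1/chat/completions"
API_KEY = os.environ["GROQ_API_KEY"]  # assumed environment variable

payload = {
    # Hypothetical model ID for one of the hosted open models
    "model": "llama-3.1-8b-instant",
    "messages": [
        {"role": "user", "content": "In one sentence, what is an LPU?"}
    ],
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])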