Insiders say the future of AI will be smaller and cheaper than you think | Fortune
Briefly

"Their valuation is based on, you know, bigger is better, which is not necessarily the case,"
"We do use large language models. We don't need the biggest ones. There's a threshold at which point a large language model is able to follow instructions in a limited domain, and is able to use tools and actually communicate with other agents,"
"To put that into perspective, GPT 3.5 was more than 400 billion parameters and had to be run in a data center. A 17 billion parameter model can run on your MacBook. That's the difference, and that's the trend."
HSBC analysis shows OpenAI claiming $20 billion in revenue, committing $1.4 trillion to new data centers, and still facing a $207 billion shortfall even with revenues above $200 billion by 2030. A contrasting vision emphasizes smaller AI operations built around specialized agents that handle niche tasks and avoid reliance on gargantuan LLMs. There is a capability threshold at which more modest models can follow instructions, use tools, and communicate with other agents, and that is sufficient for many applications. The DeepSeek example demonstrates that a 17-billion-parameter model outperformed GPT-3.5, ran on a laptop, and cost far less to develop, signaling a trend.
Read at Fortune