AI firms follow DeepSeek's lead, create cheaper models with "distillation"
Briefly

Distillation allows developers to access AI model capabilities affordably, running them on devices like smartphones. OpenAI's partnership with Microsoft showcases this technology, but concerns have arisen over competitors like DeepSeek allegedly distilling rival models without permission. Experts note that while distilled models are efficient at specific tasks, such as summarizing emails, they sacrifice some capability. Business leaders argue that smaller models often suffice for applications like customer service, which challenges the revenue models of major AI firms, since these distilled alternatives cost far less to operate.
Thanks to distillation, developers and businesses can access AI models' capabilities at a fraction of the price, allowing app developers to run models quickly on devices.
Distillation presents an interesting trade-off: if you make the models smaller, you inevitably reduce their capability... but a distilled model can be designed to be very good at a specific task, such as summarising emails.
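The technique behind this trade-off is typically knowledge distillation: a small "student" model is trained to match the softened output distribution of a large "teacher" model, rather than only the hard labels. Below is a minimal, self-contained sketch of that core idea in plain Python; the logits and temperature value are illustrative assumptions, not figures from the article.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw scores to probabilities; higher temperature => softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits over four answer choices (illustrative numbers only):
teacher_logits = [2.0, 1.0, 0.2, -1.0]   # large, expensive model
student_logits = [0.5, 0.4, 0.3, 0.1]    # small, cheap model

T = 2.0  # temperature > 1 softens the teacher's confident predictions
teacher_soft = softmax(teacher_logits, T)
student_soft = softmax(student_logits, T)

# KL divergence from teacher to student: the signal the student minimizes
# during distillation training, nudging it to mimic the teacher.
distill_loss = sum(p * math.log(p / q)
                   for p, q in zip(teacher_soft, student_soft))
```

In practice this loss is computed over large batches of prompts and minimized by gradient descent, which is why a distilled model can closely track the teacher on a narrow task while remaining far cheaper to run.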
Most businesses do not need a massive model to run their products, and distilled ones are powerful enough for customer service chatbots or running on smaller devices.
Because distilled models are less expensive to create and cost far less to run, they also generate less revenue for the major AI firms.
Read at Ars Technica