The AI bubble will pop. It's up to us to replace it responsibly | Mark Surman
Briefly

"Tech investors were riding high, convinced that a website and a Super Bowl ad were all it took to get rich quick. Spending was mistaken for growth; marketing was mistaken for a business model. In just a few months, the dot-com boom would go bust: $1.7tn in market value vanished, and the broader economy took a $5tn hit. Yet something remarkable emerged from the wreckage."
"Nearly 80% of stock gains in 2025 are concentrated in just seven companies Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia and Tesla, all of whom are vying for control of the full AI stack that will underpin our shared future hardware, software, data, energy and infrastructure. This isn't just about market share, it's about who decides how billions of people learn, create and see the world. That level of concentration should worry us all."
"And like the dot-com days, valuations are rocketing without clear paths to profitability. Companies are selling the fantasy that AI will replace human workers, even though 95% of AI experiments inside firms fail to reach production. And instead of building public-interest tools that expand human potential, much of the industry is generating what Cory Doctorow calls productive residue a flood of synthetic media, misinformation and deepfakes."
The dot-com crash erased vast market value and economic wealth but also catalyzed creative rebuilding: web 2.0, open-source software, Firefox and Wikipedia. The current AI surge shows a similar pattern of extreme concentration, with most 2025 stock gains tied to seven firms competing to control the entire AI stack—hardware, software, data, energy and infrastructure—and thereby shape how billions learn, create and perceive the world. Valuations are soaring without clear profitability, internal AI experiments largely fail to reach production, and the industry is producing significant harmful externalities such as synthetic media, misinformation and deepfakes. The underlying problem is an extractive economic model that hoards data and prioritizes domination over public-interest outcomes; different economic logics could yield tools that expand human potential instead.
Read at www.theguardian.com