
"The problem is that according to current neuroscience, human thinking is largely independent of human language - and we have little reason to believe ever more sophisticated modeling of language will create a form of intelligence that meets or surpasses our own, Riley wrote. We use language to think, but that does not make language the same as thought. Understanding this distinction is the key to separating scientific fact from the speculative science fiction of AI-exuberant CEOs."
"AGI, to elaborate, would be an all-knowing AI system that equals or exceeds human cognition in a wide variety of tasks. But in practice, it's often envisioned as helping solve all the biggest problems humankind can't, from cancer to climate change. And by saying they're creating one, AI leaders can justify the industry's exorbitant spending and catastrophic environmental impact."
Humans tend to associate language ability with intelligence and are often persuaded by strong linguistic skills. Current neuroscience, however, indicates that human thinking operates largely independently of language: language functions as a tool for thought, not as thought itself. Large language models primarily emulate the communicative aspects of language; they act as tools rather than reproducing human cognition. AGI is envisioned as an all-knowing system rivaling or exceeding human cognition across a wide range of tasks, but claims that scaling language models will yield AGI lack neuroscientific support. The industry's focus on scaling via more data and compute drives exorbitant capital expenditure and significant environmental costs.
Read at Futurism