'A real risk': the rise of weapons that can act alone
Briefly

"This scene is from Slaughterbots, a short, fictional film made by the Future of Life Institute, a California-based group that has called for a ban on autonomous weapons enabled by artificial intelligence. In the story, an executive at an arms company proudly pitches swarms of tiny drones that use facial recognition to spot and execute its customers' enemies in a crowd. Spoiler alert: things don't end well for student activists targeted by the weapons system because of their online political profiles."
"How do you define autonomous weapons systems, and how widely used are they? There is no internationally agreed definition of autonomous weapons, which can complicate discussions about them. I define them as weapons that, once launched by humans, can search for, find and attack targets on their own. Since the 1980s, many countries have acquired air-defence systems that can automatically track and shoot down incoming threats, for example."
A fictional film portrays swarms of tiny autonomous drones that use facial recognition to identify and execute targeted individuals, highlighting the risks tied to people's online profiles. Experts assessed the scenario as technologically feasible while noting that defensive measures exist. There is no internationally agreed definition of autonomous weapons, which complicates policy discussions; they can be defined as systems that, once launched by humans, search for, find and attack targets on their own. Since the 1980s, many countries have deployed air-defence systems that automatically track and shoot down incoming threats. The concern is that an arms race could produce unreliable, hard-to-control machines capable of deciding when and whom to kill.
Read at Nature