"Over 30 countries are already fielding weapon systems with varying degrees of autonomy, including some that are fully autonomous. The question is no longer whether AI will be deployed in warfare. The question is who builds these systems, under what constraints, and with what accountability."
"Smack Technologies, a defence AI startup, announced its funding round with an ambitious claim: its purpose-built military AI models will surpass Claude's capabilities in mission planning and execution. The company's approach borrows from reinforcement learning - its models learn through trial and error using war game scenarios reviewed by expert military analysts."
"This framing is worth examining closely. It shifts the locus of responsibility from the technology itself to its operators, a position that serves the commercial interests of defence AI companies while sidestepping the more fundamental question of whether the technology is reliable enough to be deployed at all."
Over 30 countries are already deploying weapon systems with varying degrees of autonomy, including fully autonomous systems. When Anthropic declined a Pentagon contract over autonomous weapons concerns, defense AI startup Smack Technologies quickly raised funding to build the exact capabilities Anthropic rejected. Smack Technologies uses reinforcement learning trained on war game scenarios to develop military AI models, and frames ethical responsibility as resting with military operators rather than with the developers of the technology. This dynamic shows how commercial defense AI companies are positioning themselves to fill the void left by major AI labs' ethical constraints, raising questions about oversight and deployment reliability.
#autonomous-weapons #defense-ai #military-technology #ai-ethics-and-accountability #defense-contracting
Read at Silicon Canals