"The Defense Department notified Anthropic that its products are deemed a supply chain risk, effective immediately. CEO Dario Amodei stated he doesn't believe this action is legally sound and that his company sees no choice but to challenge it in court, following the Pentagon's threat to apply this designation unless the company removed safeguards over mass surveillance and autonomous weapons."
"The supply chain risk designation has a narrow scope because it only exists to protect the government. The general public and even Defense Department contractors can still use Anthropic's Claude chatbot and its AI technologies, with Microsoft confirming it will continue using Claude for non-defense related projects after legal review."
"Amodei explained that Anthropic had productive conversations with the Defense Department over recent days, exploring ways to serve the Pentagon while adhering to two exceptions: that technology not be used for mass surveillance and the development of fully autonomous weapons, and ways to ensure a smooth transition if agreement is not possible."
The Defense Department designated Anthropic as a supply chain risk after the company refused to remove safeguards on mass surveillance and autonomous weapons development. CEO Dario Amodei announced plans to challenge this designation legally, arguing it lacks legal validity. The designation restricts only government use of Anthropic's Claude technology; the general public and non-defense contractors can continue using it. Microsoft confirmed it will maintain its relationship with Anthropic for non-defense projects. Amodei revealed ongoing productive conversations with the Pentagon to find compliant ways to serve defense needs while maintaining core safety principles, with discussions about potential transitions if agreement proves impossible.
#ai-regulation #defense-department-policy #supply-chain-risk #autonomous-weapons #government-technology-restrictions
Read at Engadget