"The Pentagon is making plans to have AI companies train versions of their models specifically for military use on classified information, according to the MIT Technology Review. If true, it wouldn't come as a surprise, seeing as the US is aiming to become an "AI-first" warfighting force, based on the statement released by Secretary of Defense Pete Hegseth earlier this year."
"Models trained on actual classified data could give more accurate and detailed responses, say, for situations similar to what happened in the past that aren't public information. MIT Tech Review says the department is looking to conduct the training in a secure data center that's allowed to host classified government projects."
"Aalok Mehta, who previously led AI policy efforts at Google and OpenAI, told the publication that training models on classified data carries certain risks. If the same model is used across the whole Defense Department, for instance, personnel without the correct clearance level could end up getting information that they weren't supposed to have access to."
The Pentagon is developing plans to have AI companies train specialized model versions on classified information within secure government data centers. This initiative aligns with the US military's goal to become an AI-first warfighting force. While the Department of Defense already uses AI models like Anthropic's Claude for military operations, training models on actual classified data would enable more accurate and detailed responses for sensitive situations. The Pentagon would retain sole ownership of training data, with rare exceptions allowing cleared AI company personnel access. However, security experts warn that deploying classified-trained models across the entire Defense Department could create clearance violations if personnel access information beyond their authorization level.
#military-ai #classified-data-training #pentagon-ai-strategy #security-clearance-risks #defense-department-technology
Read at Engadget