Security researchers have revealed alarming findings about data exposure in generative AI platforms, particularly Microsoft Copilot. An investigation by Lasso, an Israeli cybersecurity firm, found that thousands of GitHub repositories that were once public, including some belonging to major companies, remain retrievable through Copilot even after being made private. Once indexed by Bing, sensitive information can linger in cached form, allowing anyone with the right prompts to access it. In total, more than 20,000 repositories are still accessible through Copilot, exposing confidential corporate data such as intellectual property and access credentials.
The finding underscores a broader warning: data exposed on the internet even briefly can persist in generative AI systems long after it is taken down, affecting numerous organizations. Because Bing indexed the repositories while they were public, their cached contents remained retrievable by Copilot after the repositories were switched to private. The implications are serious for the affected companies, since retrieving this data through Copilot could disclose confidential corporate information, including intellectual property and access keys.