AI code suggestions sabotage software supply chain
Briefly

AI coding assistants, built on large language models, are altering the software development landscape by automating code generation. However, they pose new risks because of their tendency to hallucinate — that is, to suggest software packages that do not exist. A recent study found that 21.7% of code suggestions from open source models referenced non-existent packages. Malicious actors can exploit this by publishing harmful packages under these hallucinated names. This form of attack, termed "slopsquatting," threatens software security and integrity, especially when AI-suggested dependencies are installed without verification.
As security researchers have noted, because hallucinated package names tend to recur across prompts, an attacker can register a package under one of those names and wait: any developer who runs AI-suggested install commands without checking them may pull in the malicious package.
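One defensive pattern against this attack is to vet AI-suggested dependency names against a known-good list before they ever reach an installer. The sketch below is a minimal illustration of that idea under stated assumptions: the `TRUSTED_PACKAGES` allowlist and the `vet_suggestions` helper are hypothetical names invented for this example, not part of any real tool, and a production check would instead consult an internal registry or a curated mirror.

```python
# Minimal sketch: split AI-suggested dependency names into an approved list
# (present on a vetted allowlist) and a flagged list (possible hallucinations
# needing manual review) before anything is passed to `pip install`.
# TRUSTED_PACKAGES and vet_suggestions are illustrative assumptions.

TRUSTED_PACKAGES = {"requests", "numpy", "pandas", "flask"}  # hypothetical allowlist

def vet_suggestions(suggested, trusted):
    """Return (approved, flagged) lists of package names."""
    approved = [name for name in suggested if name.lower() in trusted]
    flagged = [name for name in suggested if name.lower() not in trusted]
    return approved, flagged

if __name__ == "__main__":
    # "reqeusts" and "numpy-utils-pro" stand in for hallucinated names.
    approved, flagged = vet_suggestions(
        ["requests", "reqeusts", "numpy", "numpy-utils-pro"], TRUSTED_PACKAGES
    )
    print("approved:", approved)  # safe to install
    print("flagged:", flagged)    # hold for human review
```

The key design choice is that unknown names are flagged rather than installed by default, which turns a silent supply-chain compromise into a visible review step.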
Read at The Register