Reposted by @arvidian64.bsky.social
1. LLM-generated code tries to pull in code from online software packages. Which is normal, but
2. The packages don’t exist. Which would normally just cause an error, but
3. Nefarious people have published malware under the package names that LLMs make up most often. So
4. Now the LLM code points to malware.
LLMs hallucinating nonexistent software packages with plausible names leads to a new malware vulnerability: "slopsquatting."
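A minimal sketch (mine, not from the article) of the kind of sanity check this implies developers skip: ask PyPI whether an LLM-suggested dependency has ever been published before installing it. The package name "fastjson-utils" is a hypothetical example of a plausible-sounding hallucination; note that mere existence proves nothing, since slopsquatters publish under exactly these names.

import urllib.error
import urllib.request

def exists_on_pypi(name: str) -> bool:
    """Return True if PyPI serves metadata for this package name (HTTP 200)."""
    try:
        with urllib.request.urlopen(f"https://pypi.org/pypi/{name}/json", timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False  # 404: the name has never been published (yet)

# Hypothetical, plausible-sounding name an LLM might invent.
suggested = "fastjson-utils"
if not exists_on_pypi(suggested):
    print(f"{suggested!r} is not on PyPI; the LLM likely hallucinated it -- do not pip install it blindly.")
else:
    # Existence alone is not safety: a squatter may have registered the hallucinated name,
    # so check the maintainer, release history, and downloads before installing.
    print(f"{suggested!r} exists on PyPI; vet it before installing.")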
LLMs can't stop making up software dependencies and sabotaging everything: Hallucinated package names fuel 'slopsquatting'
www.theregister.com
April 12, 2025 at 11:43 PM