Your AI coding assistant just suggested installing a package. It doesn’t exist. You install it anyway. Now you’re compromised.

This isn’t hypothetical—AI models hallucinate non-existent package names in 5-21% of generated code. Research analyzing 576,000 code samples found 205,000+ unique phantom packages, with 58% recurring predictably across sessions.

Attackers exploit this by monitoring AI outputs, registering the hallucinated names as real packages laced with malware, and waiting for developers to blindly install them. The predictability is what makes this viable: the same phantom names keep getting suggested, so a single squatted package keeps finding new victims. It’s called “slopsquatting.”
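The simplest defense is also the most concrete: before installing anything an assistant suggests, check that the name is even published. Here’s a minimal sketch in Python against PyPI’s public JSON API (the endpoint is real; the script is my illustration, not SlopGuard’s code):

import sys
import urllib.request
import urllib.error

def exists_on_pypi(name: str) -> bool:
    # 200 from the JSON API means the project is published; a 404 means
    # it was never registered, which is what a hallucinated name looks like.
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False

if __name__ == "__main__":
    for pkg in sys.argv[1:]:
        status = "exists" if exists_on_pypi(pkg) else "NOT on PyPI: do not install"
        print(f"{pkg}: {status}")

Note that existence alone isn’t enough once attackers pre-register the names; that’s why a tool needs scoring, not just a lookup.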

While exploring AI supply chain risks (I wrote about it here: https://lnkd.in/dS3D-zwt), I built SlopGuard to detect these attacks before they reach production.

🔍 Technical approach:
• 3-stage lazy-loading trust score…
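The post is truncated here, so SlopGuard’s actual three stages aren’t visible. Purely as a hypothetical sketch of what a lazy-loading, staged trust score can look like, with cheap checks first and expensive analysis only when earlier stages can’t decide (all stage contents below are my assumptions, not SlopGuard’s):

from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Verdict:
    score: float   # 0.0 = block the install, 1.0 = fully trusted
    reason: str

# A stage either decides (returns a Verdict) or defers (returns None).
Stage = Callable[[str], Optional[Verdict]]

def stage_exists(pkg: str) -> Optional[Verdict]:
    # Cheapest check first: is the name published at all?
    # Reuses exists_on_pypi() from the sketch above.
    if not exists_on_pypi(pkg):
        return Verdict(0.0, "not on the registry: likely hallucinated")
    return None  # exists, so defer to the next stage

def stage_metadata(pkg: str) -> Optional[Verdict]:
    # Hypothetical heuristic stage: a name registered days ago with
    # near-zero downloads fits the profile of a freshly squatted package.
    # (Metadata lookups omitted; this sketch always defers.)
    return None

def stage_deep_scan(pkg: str) -> Verdict:
    # Hypothetical expensive stage, e.g. scanning the sdist for
    # suspicious install hooks. Only reached when cheap stages can't decide.
    return Verdict(0.8, "no obvious red flags in package contents")

def trust(pkg: str) -> Verdict:
    stages: List[Stage] = [stage_exists, stage_metadata]
    for stage in stages:            # lazy: stop at the first conclusive verdict
        verdict = stage(pkg)
        if verdict is not None:
            return verdict
    return stage_deep_scan(pkg)     # final stage always decides

print(trust("requests").reason)

The point of the laziness: the cheap existence check resolves most lookups, so the expensive analysis only ever runs for the ambiguous tail.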
