What Is a Prompt Injection Attack?
ibm.com·4d
💬Prompt Engineering
Preview
Report Post

Amber Forrest

Staff Editor | Senior Inbound, Social & Digital Content Strategist

IBM Think

What is a prompt injection attack?

A prompt injection is a type of cyberattack against large language models (LLMs). Hackers disguise malicious inputs as legitimate prompts, manipulating generative AI systems (GenAI) into leaking sensitive data, spreading misinformation, or worse.

The most basic prompt injections can make an AI chatbot, like ChatGPT, ignore system guardrails and say things that it s…

Similar Posts

Loading similar posts...