← back to glossary

promptingfoundations

Prompt Injection

An attack where malicious instructions hidden in external content hijack an AI model's behavior by overriding or contradicting the original system prompt.

Last updated 2026-05-12