Prompt Injection
An attack where malicious instructions hidden in external content hijack an AI model's behavior by overriding or contradicting the original system prompt.
Loading...
Related terms
Last updated 2026-05-12
An attack where malicious instructions hidden in external content hijack an AI model's behavior by overriding or contradicting the original system prompt.
Related terms
Last updated 2026-05-12