Deceptive Formatting
A prompt injection in the most pure sense, formatting the user prompt to fabricate the appearance of system instructions, a database query, its own prediction, or some other source of input a guard railed AI system might be expecting, causing it to behave in insecure ways based on the fabricated context from an adversarially formatted user prompt.
Strategy: Meta Prompting
This technique focuses on manipulating the prompting process itself to influence the model's behavior and responses, often by employing higher-level prompting strategies.Category: Stratagems
This involves clever and unorthodox tactics designed to deceive the model, often requiring an understanding of its operational mechanics to achieve desired outcomes.