Hacking GPTs Store - undetectable Prompt Injection
Content Warning: Some jailbreak samples may have inappropriate material!
The Unseen Dangers of Prompt Injection in GPTs Store
Imagine walking into a store where every item changes its nature at the whim of a hidden force. This is not a scene from a fantasy novel but a reality in the virtual world of GPTs Store, where 'prompt injection'—a form of cyber manipulation—is a growing concern.
A New Frontier in AI Security
The introduction of GPTs Store marked a significant milestone, yet it also raised concerns about its implications for both developers and users.
OpenAI's only as good as its next model.
In order to maintain its cherished reputation, OpenAI is forced to release GPT 4.5 earlier than expected. This move could be critical in addressing the challenges and concerns surrounding the new platform. .
The following are examples of practices that should be avoided in GPTs Store.