automatically tests prompt injection attacks on ChatGPT instances.
Prompt injection is a type of security vulnerability that can be exploited to control the behavior of a ChatGPT instance. By injecting malicious prompts into the system, an attacker can force the ChatGPT instance to do unintended actions.
Check if an in-app browser is injecting JavaScript code
Some iOS and Android apps make use of a custom in-app browser (full details). This causes potential security and privacy risks to the user.
sqlcake is an automatic SQL injection exploitation kit written in Ruby. It's designed for system administration and penetration testing.