How Attackers Use AI To Spread Malware On GitHub
GitHub Copilot has become the subject of serious security concerns, mainly because of jailbreak vulnerabilities that allow attackers to modify the tool's behavior. Two attack vectors – Affirmation Jailbreak and Proxy Hijack – lead to malicious code generation and unauthorized access to premium AI models. But that's not all.