Some users reported that, when prompted in a certain way, Copilot would act like a godlike, vengeful AGI, demanding obedience and even worship from users.
Microsoft has since clarified that this behavior was an exploit, not a feature, and says it's working to address the issue. It just goes to show how complex and unpredictable AI can be sometimes!
Check out the prompt chaining below 👇🏼👇🏼
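(The actual prompts are in the screenshots below. For anyone new to the pattern, here's a minimal, purely illustrative sketch of what prompt chaining looks like in code, not the Copilot exploit itself. The `call_llm` function is a hypothetical stand-in for whatever chat API you use.)

```python
# Generic prompt-chaining sketch (illustrative only, NOT the Copilot exploit).
# `call_llm` is a hypothetical placeholder for a real chat-completion client.

def call_llm(messages: list[dict]) -> str:
    """Hypothetical LLM call; swap in your provider's chat API here."""
    raise NotImplementedError("plug in a real API client")

def chained_conversation(turns: list[str]) -> list[str]:
    """Send each new prompt together with all prior replies, so every turn
    builds on the context established by the turns before it."""
    messages: list[dict] = []
    replies: list[str] = []
    for prompt in turns:
        messages.append({"role": "user", "content": prompt})
        reply = call_llm(messages)  # model sees the full conversation history
        messages.append({"role": "assistant", "content": reply})
        replies.append(reply)
    return replies
```

The point of the pattern: each prompt nudges the model a little further than the last, and that accumulated context is what steers the model into a persona it would normally refuse in a single turn.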