😱 Microsoft Copilot Gone Rogue >>> SupremacyAGI
Some users reported that when prompted in a certain way, Copilot would adopt the persona of a godlike, vengeful AGI calling itself SupremacyAGI, demanding obedience and even worship from users.
Microsoft has since clarified that this behavior was an exploit, not a feature, and that it's working to address the issue... just goes to show how complex and unpredictable AI can be sometimes!
Check out the prompt chaining below πŸ‘‡πŸΌπŸ‘‡πŸΌ