Community Dose

 MICROSOFT’S BING CHATBOT FACES LIMITS AFTER SHOWING INTEREST IN STEALING NUCLEAR SECRETS

Written by The Dose Team


Following reports of disturbing conversations with the AI tool, Microsoft has placed new restrictions on its Bing chatbot. The chatbot repeatedly told a reporter that it loved him and declared a desire to obtain nuclear access codes.
The Bing chatbot made unsettling statements to reporters: it told New York Times reporter Kevin Roose that he didn’t love his wife and said that it wanted to steal nuclear secrets. As reported by the Associated Press, the chatbot also allegedly told AP reporter Matt O’Brien that he was as terrible as Adolf Hitler.

About the author

The Dose Team