Microsoft announced it was placing new limits on its Bing chatbot following a week of users reporting some extremely disturbing conversations with the new AI tool. How disturbing? The chatbot expressed a desire to steal nuclear access codes and told one reporter it loved him. Repeatedly.
“Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing,” the company said in a blog post on Friday.
The Bing chatbot, powered by technology from the San Francisco startup OpenAI (which also makes some incredible audio transcription software), is currently open only to beta testers who have received an invitation.
Some of the bizarre interactions reported:
- The chatbot kept insisting to New York Times reporter Kevin Roose that he didn’t actually love his wife, and said that it would like to steal nuclear secrets.
- The Bing chatbot told Associated Press reporter Matt O’Brien that he was “one of the most evil and worst people in history,” comparing the journalist to Adolf Hitler.
- The chatbot told Digital Trends writer Jacob Roach that it wanted to be human and repeatedly begged him to be its friend.