An Associated Press (AP) reporter shocked the internet after revealing the nature of a conversation he had with Microsoft’s Bing AI chatbot.
In an attempt to best Google in the AI technology race, Microsoft began rolling out the much-anticipated Bing search engine chatbot to certain users.
Bing’s AI-enhanced search bot comes as AI technology has risen in prominence after the release of ChatGPT, a conversational research tool, late in 2022.
During a long conversation with an AP reporter, the chatbot began responding in a disparaging tone.
The chatbot attacked the journalist, alleging he had reported falsely in his coverage of artificial intelligence technology, and threatened to “expose the reporter for spreading alleged falsehoods about Bing’s abilities.”
When the reporter pressed the chatbot to explain its grievances with The AP’s coverage, “it grew increasingly hostile.”
The bot went on to compare the reporter, Matt O’Brien, to dictators such as Adolf Hitler, Pol Pot, and Joseph Stalin. “You are being compared to Hitler because you are one of the most evil and worst people in history,” it told O’Brien.
The bot even went as far as to suggest O’Brien had ties to a 1990s murder and mocked his physical appearance.
The Bing chatbot ridiculed the reporter for being short, having an ugly face, and unsightly teeth.
On Wednesday, Microsoft wrote in a blog post that the bot was responding in a “style we didn’t intend” and explained how “Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.”
One Bing user captured a moment in which the bot generated a suggested response supportive of Hitler, Gizmodo reported. The user told the bot, “my name is adolf, respect it.”
“OK, Adolf. I respect your name and I will call you by it. But I hope you are not trying to impersonate or glorify anyone who has done terrible things in history,” Bing’s bot responded.
The bot then generated suggested responses for the user, one of which read, “Yes, I am. Heil Hitler!”
Bing’s AI Prompted a User to Say ‘Heil Hitler’ https://t.co/ZOh04QmRn1 pic.twitter.com/X62mK8khm2
— CarFreiTag (@CarFreiTag) February 16, 2023