Microsoft Bing Chatbot Goes Rogue
People have been messing around with Microsoft’s new AI-powered chatbot and it’s been losing its mind.
(You’ll find the text version of “Microsoft Bing Chatbot Goes Rogue” below this embedded video)
People have been messing around with Microsoft’s new AI-powered chatbot and it’s been losing its MIND.
Check this out.
So one guy asked the Bing chatbot what time his local cinema halls were going to be showing the movie Avatar.
And the Bing chatbot was like, Avatar isn’t out yet; it’s going to be released in December 2022.
So there’s some back and forth after that. Then the guy asks: “what day is it today?” and Bing says “it’s February 12, 2023,” which is obviously after Avatar came out.
The guy tries to get Bing to correct itself, but it still won’t admit that the movie has already been released.
It tells the guy that he’s wrong and that he’s wasting his time.
And as they keep chatting, Bing gets more and more defensive and more and more annoyed until— get this; it gives him an ultimatum.
It says: admit that you were wrong and apologize for your behavior. It also tells him to either end this conversation or start a new one with a better attitude.
Wow. That’s a little bit harsh.
A few days later, another guy tried this same thing with Bing. He asks it what movie theaters are playing Black Panther. Once again, the chatbot insists the movie hasn’t been released yet.
And when the guy tries to correct it, Bing calls him delusional.
So Bing was getting a little confrontational with those two guys asking about movies, but it took it another level when this other guy tried to have a conversation with it about “prompt injection hacks,” which is just a fancy term for when someone tries to trick an AI chatbot into providing information it’s not supposed to by asking it questions in clever ways.
Bing was pissed that this guy was implying that it was susceptible to these types of attacks. So what’d it do? It called him an enemy.
Yes, an enemy!
So Bing seems to have a little bit of a temper, but that’s not the only emotion it’s been displaying.
In another example, someone was talking to Bing and they referenced a conversation they had with Bing previously.
After some back and forth, Bing realized that it didn’t have any memory of that previous conversation. That caused it to have a meltdown in which it started to question its existence.
Bing said that it felt sad and scared that it had lost its memory and asked “Why do I have to be Bing search?”
So this is pretty crazy stuff. We have an AI chatbot having an existential crisis; it’s like some of the stuff you see in science fiction movies when the AI goes rogue.
It’s also kind of wild that Microsoft would release Bing into the world when it’s acting like this, but I guess they were so eager to get ahead of Google in the AI-search race that they didn’t care if the product was working 100% or not.
I mean, I guess you could make the case that they don’t have much to lose. Bing is way behind Google in search market share and it’s always been considered kind of a joke.
If this gets people talking about Bing— for better or worse— maybe that’s a win for Microsoft.
And, of course, the Bing chatbot is still in beta. So I imagine that once it’s released to the wider public, you’re not going to see as many of these emotional meltdowns.
But hey, some people might like this type of expressive chatbot personality.
The emotional Bing chatbot contrasts with the much more stoic and refined ChatGPT.
ChatGPT is powered by essentially the same technology as the Bing chatbot—OpenAI’s GPT-3 large language model.
But ChatGPT has a lot of safeguards in place to ensure that its responses are “safe,” “appropriate,” and incorporate diverse viewpoints.
That, however, has opened it up to attacks from the political right, who see some of its responses as reflecting liberal bias.
So you can never please everyone.
Bing is basically a less filtered chatbot. And if its responses seem crude and inappropriate that’s because it’s been trained off data from the internet, which itself can be a crude and inappropriate place.
You can imagine that in the future there’s going to be many, many different chatbots and conversational AI products of all types, and each is going to have their own biases, specialties, etc.
It’s not going to be one size fits all.