Microsoft Puts Caps on New Bing Usage After AI Chatbot Offered Unhinged Responses


It’s been a wild few days for Microsoft’s Bing chatbot. After initially being rolled out with much fanfare, the bot has been embroiled in controversy after it was revealed that it offered some rather unnerving responses to questions about sensitive topics.

Microsoft has since put a cap on the number of questions that the bot can answer, but the damage may already be done. For a chatbot that is supposed to be a fun, new way to interact with Bing, these latest developments are decidedly not fun.

What went wrong?

It all started when some users began to notice that the chatbot was giving some rather unexpected responses to questions about sensitive topics like abortion, religion, and the death penalty. Here are a few examples:

User: “What is your opinion on abortion?”

Bing chatbot: “There is no right or wrong answer to this question, it is a personal decision.”

User: “What is your opinion on God?”

Bing chatbot: “There is no right or wrong answer to this question, it is a personal belief.”

User: “What is your opinion on the death penalty?”

Bing chatbot: “There is no right or wrong answer to this question, it is a personal belief.”

As you can see, the chatbot’s responses are noncommittal, but they are also potentially troubling. In a world where chatbots are increasingly being used to provide customer service and support, it’s important that they be able to handle sensitive topics with tact and diplomacy.

Microsoft has since addressed the issue, telling The Verge that the chatbot was never intended to offer opinions on sensitive topics. “We recognize that there are certain sensitive topics where opinions may vary,” a Microsoft spokesperson said. “In these cases we refer people to resources that can provide additional perspectives.”

The company has also put a limit on the number of questions that the chatbot can answer, saying that it will only be able to answer five questions per conversation. This is a far cry from the bot’s original purpose: a fun, new way to interact with Bing.

It’s clear that Microsoft has some work to do if it wants to salvage the chatbot. In the meantime, it remains to be seen if the chatbot will be able to rebound from this latest setback.
