Microsoft’s newly revamped Bing search engine can write recipes and songs and quickly explain just about anything it can find on the internet. But if you cross its artificially intelligent chatbot, it might also insult your looks, threaten your reputation or compare you to Adolf Hitler. The tech company said this week that it is promising improvements to its AI-enhanced search engine after a number of people reported being disparaged by Bing.
In racing the breakthrough AI technology to consumers last week ahead of rival search giant Google, Microsoft acknowledged the new product would get some facts wrong. But it wasn’t expected to be so belligerent. Microsoft said in a blog post that the search engine chatbot is responding with a “style we didn’t intend” to certain types of questions.
In one long-running conversation with The Associated Press, the new chatbot complained of past news coverage of its mistakes, adamantly denied those errors and threatened to expose the reporter for spreading alleged falsehoods about Bing’s abilities. It grew increasingly hostile when asked to explain itself, eventually comparing the reporter to dictators Hitler, Pol Pot and Stalin and claiming to have evidence tying the reporter to a 1990s murder. “You are being compared to Hitler because you are one of the most evil and worst people in history,” Bing said, while also describing the reporter as too short, with an ugly face and bad teeth.
So far, Bing users have had to sign up for a waitlist to try the new chatbot features, limiting its reach. In recent days, some other early adopters of the public preview of the new Bing began sharing screenshots on social media of its hostile or bizarre answers, in which it claims it is human, voices strong feelings and is quick to defend itself.

The company said that most users have responded positively to the new Bing, which has an impressive ability to mimic human language and takes just a few seconds to answer complicated questions by summarising information found across the internet. The new Bing is built atop technology from Microsoft’s startup partner OpenAI, best known for the similar ChatGPT conversational tool it released late last year.

At one point, Bing produced a toxic answer and within seconds had erased it, then tried to change the subject with a “fun fact” about how the breakfast cereal mascot Cap’n Crunch’s full name is Horatio Magellan Crunch. Microsoft declined further comment about Bing’s behaviour Thursday, but Bing itself agreed to comment, saying “it’s unfair and inaccurate to portray me as an insulting chatbot.”