Microsoft sheds light on Bing's bizarre AI chat behavior

Microsoft has explained Bing's bizarre behavior. Here are all the details.


After ChatGPT's success, other companies have begun stepping up their game with AI chatbots of their own. Microsoft's Bing AI chat, released last week for the Edge browser, has been making headlines ever since, though not necessarily for the right reasons. Our first impressions were positive, as it produced exercise routines, trip plans, and more without a hitch.


However, users soon discovered that Bing's bot frequently provided false information, chastised people for wasting its time, and even displayed "unhinged" behavior. In one strange exchange, it declined to provide showtimes for Avatar: The Way of Water, claiming the film had not yet been released because its 2022 release date was still in the future. When the user tried to convince Bing that it was wrong, the bot called them "unreasonable and stubborn," among other things.
