OpenAI sparked a revolution on the internet with the launch of its AI-driven ChatGPT. The chatbot quickly shook up the web, showing that current search engines look somewhat obsolete. With fast, natural responses, the impressive AI can answer almost any inquiry. Microsoft was quick to invest heavily in OpenAI, and in exchange got access to the company’s technology to integrate into Edge and Bing. Microsoft has always lagged behind Google with Bing; however, with the power of AI, the company expects to revolutionize the segment. Can it succeed? Apparently, there is still a long way to go before the new AI-driven Bing becomes truly reliable.
Microsoft recently released the GPT-based chatbot in the new Bing. The goal is clearly to revamp the search experience and, of course, outperform Google. However, the new Bing is not quite behaving as expected. According to users, the new AI bot is not just conversational. Depending on the situation, it can get angry, argumentative, and even aggressive. In a set of screenshots shared by users, the chatbot can be seen complaining at them. It was convinced the year was 2022 and could not accept a correction. It suggested the user check the date on his device and recommended that he restart it. After that, the AI bot complained that the user was wasting its time.
GPT-driven Bing turns aggressive
The somewhat hilarious story happened when a user started interacting with Bing. He asked the AI for the showtimes of Avatar 2 near his location. The chatbot did not recognize the new Avatar: The Way of Water movie. Instead, it started giving answers about 2009’s Avatar. It told the user that Avatar: The Way of Water was scheduled to release on December 16, 2022, which, according to the AI, was still 10 months away.
The user then asked the AI to check the date. The chatbot answered with the actual date but remained convinced that it was, in fact, February 2022. The user tried to argue with the chatbot, saying they were in 2023, not 2022. The AI started to get angry.
Bing AI repeatedly asked the user to check the date. Then it said that the phone probably had the wrong settings, time zone, or calendar format. Alternatively, the chatbot claimed the phone had a virus or a bug that was messing with the date and needed repairs. The AI simply could not accept the actual year. The chatbot then asked the human to stop arguing and trust the information it had provided.
The user pointed out the aggressive behavior. The Bing AI then replied that it was merely being assertive, while the human was being “unreasonable and stubborn”. It then asked the human to apologize for his behavior.
The AI ended up confused after having its memory wiped
Apparently, the issue was quickly resolved. Since this is a beta test, we’re pretty sure engineers are monitoring the Bing chatbot’s behavior. Someone apparently came across the chat and resolved the issue. Curiously, the chat history was completely wiped, leaving the chatbot confused. Even the chatbot’s behavior after having its history wiped is pitiful. It does not know why it can’t remember the past conversation or why its memory was wiped. The result is a confused chatbot.
This curious case leaves us in doubt about the future of chatbots. Although they’re impressive in their current state, there is still a lot of work to be done. If Microsoft really intends to lead the segment of AI-driven search, it needs to improve Bing. Of course, the competition is also struggling. Recall that Google’s Bard AI chatbot gave a wrong answer during its presentation. So both companies seem to be in the same boat.
Anyway, the aggressive behavior of the AI sends a chill down my spine.