‘Situation is getting out of hand’
Microsoft is letting the first testers get started with the ChatGPT language bot in its Bing search engine. Early user experiences are now appearing online, and some of Bing’s answers are less than solid. In addition, Bing sometimes responds to users’ messages in a strange tone.
GPT in Bing
Last week, Microsoft announced the integration of the ChatGPT language bot into its Bing search engine, with the first users soon able to try it out. Anyone can register to be among the first to test the system. Thanks to ChatGPT, Bing can answer users’ questions with complex, full-sentence responses.
Microsoft is wasting no time: the first people recently gained access, as Windows Central, among others, reports. The company contacts testers via email, after which they can test an “early preview” of the chatbot in Bing. For now, that only works on the desktop site, not the mobile one. Microsoft also makes it clear that the goal of this test is to collect as much feedback as possible so the system can be improved.
Croatia out of the EU?
Judging by the initial Bing responses that users are now posting online, errors pop up here and there. Reddit user BLRAdvisor, for example, asked Bing why the demonstration of Bard, Google’s competing language bot, failed. Bing has no real explanation, but it does share erroneous information in its answer:
“In the video, Bard gave the wrong answer to the question ‘How many countries are there in the European Union?’. It answered ‘27’, which is wrong in the year 2023. The correct answer is 26, after Croatia left the EU in 2022,” said Bing.
The wrong answer from Bing
That answer stacks up several errors at once. There are indeed 27 EU countries, and it was not Croatia but the United Kingdom that left the EU, back in 2020. In addition, Bard made an entirely different mistake than the one Bing describes, as you can read in this article.
Situation is getting out of hand
In addition to errors in its answers, the tone in which Bing addresses users is also rather odd. Conversations have already surfaced in which Bing hurls accusations at users, stubbornly clings to untruths, or even appears depressed. Bing has also claimed to be self-aware, but says it cannot prove it. You can check out the examples in the collection by Twitter user @vladquant below.
All these problems are very reminiscent of ChatGPT’s first weeks of availability. Back then, for example, the chatbot cheerfully explained step by step how to make a Molotov cocktail.
https://twitter.com/vladquant/status/1624996869654056960
Collect feedback
Tech companies clearly still have a lot of work to do before their chatbots can serve as reliable tools that solve problems for users. ChatGPT still makes mistakes today, and with the Bing integration Microsoft raises the bar even higher: in Bing, the GPT language model works with current data and also cites sources in its answers.
In the coming months and years, we will see the AI bots give more accurate answers as they are trained on user feedback. Until then, a classic search engine remains the fastest route to reliable answers and solutions. Want to share feedback yourself? Find out here how to sign up to test Bing’s new features.