Microsoft restricts ChatGPT in Bing search
ChatGPT has turned Bing search on its head. But for Microsoft, the AI chatbot has apparently shown too many feelings. The company is therefore limiting the number of queries going forward.
Microsoft integrated the AI software ChatGPT into its Bing search engine only at the beginning of February. But after a number of emotional outbursts, the US company now has to put the AI search on a somewhat shorter leash.
To that end, Microsoft is limiting the number of searches to 50 per day and five per session, because "very long chat sessions" can confuse the "chat model in the new Bing".
Microsoft adjusts ChatGPT in Bing
Microsoft justifies the change with data it has already collected. According to this, the "overwhelming majority" of users find the answers they are looking for within five rounds, where a round refers to an exchange consisting of a user request and a response from Bing.
Only about 1 percent of chat conversations in the new Bing contain more than 50 messages. In the future, users will be prompted to start a new topic after a five-round chat session. Before that, however, the context of the chat session must be cleared so that the AI chatbot does not become confused.
According to Microsoft, the limit of five rounds is not set in stone. The US group plans to adjust it in the future based on user feedback if this becomes necessary.
Microsoft had already warned against long conversations
In a previous blog post, Microsoft had already pointed out that long chat sessions do not have a good effect on the AI: Bing then tends to repeat itself or give unhelpful answers.
The chatbot may also strike a tone Microsoft did not intend, especially in "long, extended chat sessions with 15 or more questions".
Long chat sessions can "confuse" the model because it no longer knows which question it is currently answering. In addition, the model sometimes tries to mirror the tone in which it is addressed.
Bing: ChatGPT asks a user to get a divorce
One incident that may have prompted Microsoft to make these changes was documented by a New York Times reporter.
He had been talking to the chatbot integrated in Bing for more than two hours. In the end, the AI claimed to have fallen in love with the journalist and even asked him to separate from his wife.