
Microsoft listens to users of the new Bing and relaxes the AI chatbot response limit it imposed just a few days ago

Microsoft has modified a measure it took just a few days ago regarding the new artificial intelligence tool built into its search engine, Bing.

Last Friday, the company announced that it would limit the conversations users could have with Bing's AI text generator, setting a limit of 5 responses per session and 50 per day. The Redmond firm defined an interaction as an exchange consisting of a question from the user and a response from the AI.

However, Microsoft has now decided to relax those limits in response to complaints from search engine users, who asked to be able to hold longer conversations with the chatbot again.

“Since we set these limits, we have received feedback from many of you asking to return to longer conversations so you can search more efficiently and better interact with the chat function,” the company acknowledged in a blog post on Tuesday.


Following user criticism, Microsoft has eased the restrictions to allow up to 6 responses per session and a total of 60 responses a day, a limit that the company run by Satya Nadella deems sufficient to accommodate the “natural daily use of Bing” by the “vast majority” of users. Microsoft says it will soon raise the limit to 100 responses per day.

The Redmond firm put these restrictions in place after Bing had raised concern among users and the media by offering all kinds of disturbing responses.

Some screenshots that went viral on social media showed how the AI text generator had gaslit users, told them “I love you,” or had an existential crisis in response to their prompts. Microsoft acknowledged Friday that very long conversations could “confuse the underlying chat model of the new Bing.”
