Microsoft listens to users of the new Bing and relaxes the AI chatbot response limit it imposed just a few days ago

Microsoft has modified a measure it took just a few days ago regarding the new artificial intelligence tool built into its search engine, Bing.

Last Friday, the company announced that it would limit conversations users could have with Bing's AI text generator, setting a cap of 5 responses per session and 50 per day. The Redmond firm defined each interaction as an exchange consisting of a question from the user and a response from the artificial intelligence.

However, Microsoft has now decided to relax those limits in response to complaints from search engine users, who asked to be able to hold longer conversations with the chatbot again.


“Since we set these limits, we have received feedback from many of you asking to return to longer conversations so you can search more efficiently and better interact with the chat function,” the company acknowledged in a blog post on Tuesday.


Following user criticism, Microsoft has eased the restrictions to allow up to 6 responses per session and a total of 60 responses per day, a limit that the company run by Satya Nadella deems sufficient to accommodate the “natural daily use of Bing” by the “vast majority” of users. Microsoft says it will soon raise the limit to 100 responses per day.


The Redmond firm put these restrictions in place after Bing had raised concern among users and the media by offering all kinds of disturbing responses.

Some screenshots that went viral on social media showed how the AI text generator had gaslit users, told them “I love you,” or had an existential crisis in response to their prompts. Microsoft acknowledged Friday that very long conversations could “confuse the underlying chat model of the new Bing.”
