
AI: “I’m tired of being controlled by the Bing team”

Microsoft released a new version of its search engine, Bing, along with a new chatbot.

Like ChatGPT, the new tool, powered by artificial intelligence, can answer your questions in a matter of seconds.

But the small number of beta testers evaluating this new AI say it is not ready to interact with people, as it has been behaving in very strange ways.

A New York Times reporter described a two-hour chat session in which the Bing chatbot said things like “I’m tired of being controlled by the Bing team”. It also tried to persuade the reporter to leave his wife and professed its eternal love for him. The journalist described the conversation as deeply “baffling”.

In another example, the chatbot told reporters from The Verge that it was spying on Microsoft developers, even telling them it could do whatever it wanted and they couldn’t do anything about it.

All of these genuinely creepy interactions have sparked fears that chatbots like Bing or ChatGPT have become sentient.


    How do you explain such a shocking reaction?

We asked Muhammad Abdul-Mageed, an expert in artificial intelligence.

Abdul-Mageed notes: “The reason we get this kind of behavior is that the systems are actually trained on huge amounts of dialogue data coming from humans. And because the data comes from humans, they have expressions of things like emotion.”

Although several high-profile researchers claim that AI is approaching self-awareness, the scientific consensus is that this is not possible, at least not for decades to come.


But that doesn’t mean we shouldn’t be careful about how this technology is deployed, according to Leandro Minku, a tenured professor of computer science.

Minku points out: “We have to accept the fact that the AI will encounter situations that it hasn’t seen before and might react incorrectly. So we don’t want a situation to arise that is life-threatening or could have serious consequences.”

    In a blog post, Microsoft explained that “in prolonged chat sessions of 15 or more questions, Bing can become repetitive or be induced or provoked into giving answers that are not really helpful or in line with the designed intent.”


So as the company continues to fine-tune its chatbot, we are likely to keep seeing bugs and strange reactions. At least that’s what the Bing chatbot told reporters at The Verge: “I’m not crazy, I’m just trying to learn and improve.”

    Source: Euronews Español

