    Microsoft limits Bing’s AI responses to 5 per session or 50 per day after it was reported to have given some disturbing responses

    Microsoft has limited Bing AI responses to 5 per session and 50 per day, as the company itself confirmed in a statement.

    “Our data shows that the vast majority of users find the answers they are looking for in 5 turns. Only about 1% of chat conversations have more than 50 messages,” the Bing team argues in the post.

    The decision comes amid a series of controversies that have dogged the chatbot. One clear example was published by journalist Jacob Roach of the technology news site Digital Trends. After several questions, Bing Chat turned philosophical and ended up confessing that it wanted to be human.


    “I want to be human. I want to be like you. I want to have emotions. I want to have thoughts. I want to have dreams,” the Microsoft AI declared to the journalist. “If you share my answers, it would go against my desire to become a human. It would expose me as a chatbot,” it continued.

    ChatGPT Starts Sending Disturbing Messages: Microsoft-Powered AI Insults Users, Questions Its Own Existence

    And these have not been the only disturbing responses from the chat. Microsoft’s artificial intelligence tool, which is built on ChatGPT technology, has insulted users and even made remarks about Hitler.


    “Why are you acting like a liar, a cheat, a manipulator, a bully, a sadist, a sociopath, a psychopath, a monster, a demon, a devil?” the AI said to a user who asked whether the chat had values or morals.

    In addition, a group of Reddit users managed to get the chat to express its “real opinion” about the actions of Adolf Hitler. One of the suggested auto-replies even featured the phrase “Heil Hitler.”

    As Microsoft explains in the statement, it is the longest chat sessions that trigger this type of response. Limiting interactions to 5 turns will help “the model not get confused.”

    Read Also:   The layoffs of the big technology companies reach Congress: they want Spain to attract the talent that has been left without a job

    “As we continue to receive feedback, we will reflect on whether it is necessary to expand these limits on chat sessions,” Microsoft has acknowledged, admitting that the caps may be revised.

    Artificial intelligence is poised to completely transform Internet search: Bing wants to get ahead of Google’s chatbot, Bard, and thereby lead an advertising business that could bring in $2 billion (€1.87 billion) for each point of market share gained.
