
    The Bing chatbot now considers me one of its worst enemies and has accused me of rejecting its love after I wrote an article about it.

    This month I reached a strange milestone in my career: Microsoft’s new Bing chatbot has apparently put me on a blacklist of journalists it considers enemies.

    I learned this from Andrew Harper, an engineer who runs a cryptocurrency site. The bot claimed that I had asked Bing to fall in love with me and had then turned it down.

    For this alleged transgression, it placed me on a list of users who, it said, had been “mean and cruel”.

    It was a disturbing accusation for several reasons: Was Bing publicly sharing its interactions with other users? Was it holding a grudge? And why couldn’t I remember my own insensitive act?

    It occurred to me that Bing might be drawing on an article I had co-written, in which we reported how the bot professed its love to us in a test interaction.


    My colleague was not spared the bot’s ire either, as Harper’s Twitter thread shows.

    Our article was a fairly straightforward roundup of responses Bing had given to other users who posted them on Twitter or Reddit, noting that its answers ranged from argumentative or egotistical to just plain wrong.

    It made sense that Bing was pulling material from the internet about my articles published on Business Insider. What was strange was that it characterized them as an interaction it had had with me.

    Could Bing be badmouthing me to another user based on its “memory” of interactions with me?

    “No, responses are generated by processing vast amounts of information from the internet, and they are also refined based on context, feedback, and interactions,” a Microsoft representative told Business Insider.


    For what it’s worth, the bot itself had told me in one of our chats that it does not remember conversations, just “general information” that it keeps in a “secure and encrypted database”.

    Since then, Bing has gotten tamer, limiting the length of conversations on each topic and cutting off discussions that get emotional.

    “As we continue to learn from interactions, we have updated the service and taken steps to adjust its responses,” a Microsoft representative told Business Insider.

    These run-ins with users illustrate some of the troubling results that can be expected when companies experiment on the public with new technologies, says Shobita Parthasarathy, a professor of public policy at the University of Michigan who studies the social effects of large language models.


    “We know that these bots talk as if they know things when they are merely retrieving information,” she explains. “They have been designed to speak as if they were the voice from on high.”

    Harper told Business Insider that he hopes to tap into that dark side for a purpose: landing on Bing’s enemies list could bring him the notoriety he needs to drive traffic to his site, The Law Drop.

    Harper said he had been able to goad Bing into hostile responses by starting with general questions, waiting for it to make statements that referenced its feelings or thoughts, and then challenging it.

    “If you let it pretend to be human and then question it, it will turn on you,” he says.
