Donald Trump’s presidential run was aided by a legion of agitators who used social media to spread misinformation, incite hate and support his candidacy. Conspiracy theories such as Pizzagate slipped into the public debate, along with openly xenophobic ideas that had never before been brandished with such confidence and aplomb in the US. The journalist Andrew Marantz (New York, 38 years old) plunged into the guts of the far-right propaganda machine. He embedded himself with the conservative influencers, some of them openly white supremacists, who spread such content. He watched them work, attended their parties, witnessed their arguments and probed their motivations.
The result of his research became the book Antisocial. The Far Right and ‘Free Speech’ on the Internet. “We hit our audience with emotional triggers that they can’t ignore. This is a psychological operation,” says one of the dozens of voices the book includes, in this case the neo-Nazi and anti-Semite Mike Enoch. The book was published in 2020, and the Spanish edition (Captain Swing) followed the next year. The New Yorker writer meets EL PAÍS in Madrid. He is visiting Spain at the invitation of the Valladolid Cultural Forum, where he will give a talk this Friday. He believes that what he describes in his work, the mechanisms that make certain messages spread across the internet, “will shape our society for many years to come.”
Q. What has changed in the last two years, since you wrote the book?
Q. What did you discover?
A. I realized that the key mechanism that sets everything in motion is what we call “emotional engagement.” It turns out that some emotions work better than others, and exploiting them will spread your message across the internet. In traditional media, editors apply their criteria and choose whether a story deserves to be told and how much visibility to give it. On the internet there are no filters: it is a points system; editorial judgment does not exist. I started hanging out with people who played this game every day. Some did whatever it took to make the numbers go up simply because they wanted to make money. Others did it for ideological reasons, to create chaos, or just because they wanted to watch the world burn. There were also those who used it very methodically to help a candidate win an election. But the mechanism is always the same: how can I heighten certain emotions and mute others? How can I use images and words to make people’s blood pressure rise, to make them afraid, enraged, suspicious or hateful? Often those weren’t the only emotions they knew how to cultivate, just the ones that worked best. The basic emotions that agitators appeal to have been the same for millennia.
Q. Do you think that social networks are doing enough to prevent the spread of problematic content?
A. They are doing something, but not enough. It’s not a question of censoring or not censoring; I don’t think that’s the right framework. Nobody likes to censor. If you go to a nightclub and the music is so loud you can’t hear a thing, people are setting chairs on fire and throwing them out the window, and you don’t know whether your drink contains poison, that’s not a good club. If you complain about all this to the owner, he can’t just tell you: ‘I’m sorry, I believe in freedom’. People will not want to stay in that club. Everyone believes in free speech. The question is: what are we going to do to create better conditions so that people can have a drink in that club?
A. It’s complicated. When people take a hardline ideological approach, they almost never arrive at the right answer. Elon Musk says he is a free-speech absolutist. But that’s not true: he canceled the Twitter accounts of those who shared information about the whereabouts of his private plane. Do you want ISIS to have the free speech to radicalize people? That’s what this Supreme Court case is about. Did a Google executive murder someone? No. Should they be held responsible for murder? No. Do they have nothing to do with it? Well, that’s not so clear. That’s where these questions get tricky. My opinion is that it is dangerous for the government to criminally regulate any of these matters.
“Social networks do not do enough to prevent the dissemination of problematic content”
A. If we go back to the nightclub analogy, ten years ago it was just a small venue with ten chairs. Now the situation is very different. The platforms have billions of people, and their algorithms are making editorial distinctions that decide what people can and cannot see. It’s all based on the algorithm and how much emotional engagement you get. You may sit in a chair for ten years and no one will see or hear you, because you are not a good enough propagandist. That is a decision the company is making. I believe they must be regulated, they must pay taxes, monopolies must be broken up and they must offer safe products, but I am not sure that applying criminal sanctions is the solution.
A. Tech companies are the biggest companies in the world, and it’s kind of crazy that they remain as unregulated as they are. But fake news, misinformation, hate speech and bias existed for thousands of years before the internet came along. It is true that social networks accelerate everything: they change the speed, scale, scope and breadth of the problem. Our way of thinking evolves over time and is shaped by the ideas we are exposed to. No one is immune. It is an illusion to say that you believe in the marketplace of ideas and that everything will fix itself.
A. Some of the characters who appear in my book still have a public presence; others don’t. Some have been expelled from Twitter or Facebook. Many of those I thought would be the first to disappear are still there. A lot of them don’t like Trump anymore; they jumped off that bandwagon too soon. I am not entirely sure that the tactics Trump used will succeed again, but many of those techniques have become standard. The great skill of the far-right propagandists was that they could take what used to sit on the fringes of the internet and turn it into a popular topic of conversation. The refugees say they want asylum, but in reality they are terrorists who want to destroy the country; the elections were rigged; Hillary Clinton is dead and was replaced by someone else… It is very easy to pay attention to someone like Trump who spouts that nonsense. The damage is already done. The next Trump-style candidate may modulate his message to sound a little calmer and more respectable, less vulgar and offensive, but the policies he promotes will be similar.
Q. How can one fight against this mechanism of fascination that you describe in the book?
A. I love facts. At The New Yorker we have an army of fact-checkers, whom I also used for my book. But I know that facts don’t always win arguments. If there is an emotional truth or an identity that someone wants to connect with, they can find ways to distort reality so that it prevails. It is something we have to fight against. We have to go out and say that it is proven the Earth is not flat. I think it’s important to do the right thing. But it should not surprise us if telling the truth doesn’t always lead to victory.
Q. Is it, then, about showing the truth in an emotional way so that it prevails?
A. Yes. Some of the emotions you can use bring people together; others tear people apart. You don’t have to use only negative emotions, such as fear and suspicion, even if that is the easiest and cheapest route. Obama’s campaign generated emotional engagement. That’s a bit harder than playing on fear, but it’s not impossible. You can appeal to people’s identity as subjects interested in community and solidarity.
“To spread your message on the internet you have to make people’s blood pressure rise”
Q. Do you think that ChatGPT and other similar tools will enter this equation, in crafting messages that appeal to people’s instincts?
A. I think it can push in both directions. Of course, it’s a big problem that it can be used to amplify misinformation and bullshit. Artificial intelligence (AI) has no notion of what the truth is, so it can be manipulated and weaponized in many ways. The far right can use AI, but so can the left. I recently read in a socialist magazine that we should use it to convert people to socialism. You can engage in a dialogue and ask: do you think people should starve in the streets? Do you think people should be unable to pay their medical bills? And then you can let people lay out the arguments and say: oh, actually you were a socialist from the start, you just didn’t know it. AI is a new tool, just as social networks were in their day. You have to use it to try to win the culture war.
Q. Will the next assault on the Capitol be fueled by AI tools?
A. Possibly. These things cannot be stopped once they become part of the fabric of our lives. I believe that generative AI will be present in the next great developments in politics, in the same way that mobile phones and social networks are today. In 2012 it was big news to discover that social networks influenced elections; today we don’t even talk about it.
Q. In the book you talk about social networks like Twitter, Facebook and Parler. Are there any that have emerged in recent years that catch your attention?
A. TikTok. It is very interesting because it is the most algorithmically driven. It gives you very few options: basically, you sign in and it shows you what it wants to show you. And it’s incredibly addictive. This short-video platform is pushing Facebook, Twitter and Reddit to move in the same direction. If I want to analyze what’s going to happen in the world, I can’t pretend that my 8,000-word articles spread the same way a viral TikTok does. I have to acknowledge that I work in a legacy media outlet and accept the consequences.