Fascism and the Algorithm

 

By Méabh Ffrench

Article originally published in Issue 6 of Rupture, Ireland’s eco-socialist quarterly.

“Fascism has opened up the depths of society for politics. Today, not only in peasant homes but also in city skyscrapers, there lives alongside of the twentieth century the tenth or the thirteenth. A hundred million people use electricity and still believe in the magic power of signs and exorcisms. The Pope of Rome broadcasts over the radio about the miraculous transformation of water into wine. Movie stars go to mediums. Aviators who pilot miraculous mechanisms created by man’s genius wear amulets on their sweaters. What inexhaustible reserves they possess of darkness, ignorance, and savagery! Despair has raised them to their feet - fascism has given them a banner.” - Leon Trotsky, ‘What is National Socialism?’ (1933)

So wrote Trotsky in his 1933 pamphlet What is National Socialism?[1] The ideas put forward in this short paragraph have remained relevant ever since – from televangelists in the 1980s to the so-called “alt-right” of the modern internet.

Technologies such as radio, television, and the internet are valuable forms of communication that can spread important ideas – and also regressive ones, depending on the conditions of the time and place. That unscientific beliefs like healing crystals or anti-vaxx conspiracy theories might spread through scientific achievements like television and the internet might seem strange and contradictory, but people who feel hopeless and alienated will often accept ideas and ideologies that give them comfort – and each new form of communication is another way for them to find such ideas. This isn’t inherently a bad thing – those ideologies can be progressive ones like socialism or feminism, for example – but the flipside is that reactionary conspiracy theories can spread through the same medium. Many of us likely know somebody who became obsessed through the internet with wild theories about Covid (like the Chinese lab leak theory or the belief that Bill Gates is using the vaccine to microchip people).

But there’s one big difference between the modern internet and earlier media of communication like radio and television. Before, ideologues and con men had to actively seek out vulnerable people to proselytise – someone who developed an interest in a reactionary theory would have had to hear about it from a friend, or to have tuned in to the right radio frequency or TV channel at just the right time.

Today, most big social media websites have some form of algorithm – a mathematical calculation that decides which content to recommend to each user. Anyone who wants their content shown to people need only play to the algorithm, and it’ll do the work for them.
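To make that concrete, a recommendation algorithm can be pictured as a loop that scores every candidate post for a given user and surfaces the highest scorers. The Python sketch below is a deliberately simplified illustration of the general idea, not any real platform’s system – the feature names and weights are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_watch_time: float  # model's estimate of how long this user will watch
    predicted_clicks: float      # model's estimate of how likely the user is to click

def engagement_score(post: Post) -> float:
    # Hypothetical weights: platforms tune numbers like these to maximise time on site.
    return 0.7 * post.predicted_watch_time + 0.3 * post.predicted_clicks

def recommend(candidates: list[Post], n: int = 5) -> list[Post]:
    # Rank every candidate by predicted engagement and surface the top n.
    return sorted(candidates, key=engagement_score, reverse=True)[:n]
```

A creator who works out roughly what the score rewards – say, long watch times and strong reactions – can tailor their content to it. That is what “playing to the algorithm” means in practice.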

It’s easy to see, then, how someone might now have reactionary content actively brought to them by an invisible set of calculations that has no particular opinion on the matter. Since these algorithms tend to be “black boxes”, where even the owners of the website have only the vaguest idea of how they work, they’re inherently difficult to regulate. If people producing far-right content have a rough idea of what the recommendation algorithm wants from their content, it’s not that difficult for them to simply make what the algorithm wants. It’s been a popular theory for several years now that the YouTube algorithm gradually leads sections of its audience to more and more extreme videos – the degree to which the algorithm is the determining factor is debated, and it clearly isn’t the sole factor, but it is an important one.[2]


A mix of the algorithm and friends who share posts about conspiracies might pull in an alienated person on Facebook. They dive deeper and deeper, growing more detached from reality as they interact with their strange new community. People can be recruited into a cult without ever leaving their homes. On TikTok, a similar story plays out as teenagers convince themselves of the power of healing crystals and reality shifting (the belief that daydreaming is a form of inter-dimensional travel) – and, of course, many related conspiracy theories, like the more outlandish end of the anti-vaxx theories which exploded in popularity during Covid. The algorithm serves up reactionary and outlandish content, and people are then further radicalised by the community they find themselves in.

One of the most popular conspiracy theories to have grown via the algorithms of Facebook and YouTube is QAnon – a reactionary movement that believes Donald Trump is fighting a secret war against a global cabal of Satanist paedophiles, and that an insider known as ‘Q’ has been revealing this to sympathetic members of the public through posts on the website 4chan. Followers of QAnon can hold truly delusional beliefs – for example, a small crowd waited at the grassy knoll in Dallas for several weeks in November 2021, believing that John F. Kennedy Jr. would return with his father and declare Trump to be “King of Kings”.[3]

Much of this has been discussed by others, but what’s often overlooked is the role of capital in shaping modern social media. Social media platforms profit from data about their users, which can be sold to advertisers. The longer you’re on the site, the more advertisements you see and the more data about your likes, dislikes, and various interests is created to be sold to advertisers who hope to create more targeted advertisements. In order to keep you using the site for longer, most sites have a recommendation algorithm of the kind described above, tuned to show you the posts and content it thinks will keep you engaged. In fact, YouTube has actively experimented with ways to make its algorithm even more addictive[4], and Guillaume Chaslot, a developer who worked on YouTube’s algorithm, has explained that what it optimises for is not what the user actually wants.[5] Frances Haugen, a whistleblower who used to work for Facebook, has revealed similar things about Facebook’s algorithm – which has even fanned the flames of genocide and ethnic violence in Ethiopia and Myanmar.[6]
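The mechanism Chaslot and Haugen describe can be sketched as a feedback loop: the platform logs how users react to what it showed them and feeds those reactions back into the rankings, so whatever holds attention gets shown more. Again, this is a toy illustration under invented names, not Facebook’s or YouTube’s actual code.

```python
def update_scores(scores: dict[str, float],
                  engagement_log: dict[str, float],
                  lr: float = 0.1) -> dict[str, float]:
    """Toy online update: nudge each post's score toward the engagement
    it actually produced, so attention-grabbing posts climb the rankings."""
    for post_id, observed in engagement_log.items():
        current = scores.get(post_id, 0.0)
        scores[post_id] = current + lr * (observed - current)
    return scores

# Invented example: a lurid conspiracy video that keeps people watching
# overtakes a sober news clip purely as a side effect of the maths.
scores = {"conspiracy_video": 0.2, "news_clip": 0.2}
for _ in range(10):  # ten rounds of logged user engagement
    scores = update_scores(scores, {"conspiracy_video": 0.9, "news_clip": 0.3})
print(scores)  # "conspiracy_video" now far outranks "news_clip"
```

Nothing in the loop asks whether the content is true or harmful – only whether it held attention, which is precisely the point the whistleblowers are making.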

Social media is addictive on purpose, because the users are the product. The addictive nature of these recommendation algorithms, combined with the alienation of late capitalism, has played an important role in the spread of reactionary ideologies. Many have questioned why these sites seem so reluctant to combat the far right on their platforms – the obvious answer is that they just might not care that much. After all, fascism is capitalism in decay.

Notes

1. Leon Trotsky, ‘What is National Socialism?’ (1933)

2. See Hanna Kozlowska, ‘Does YouTube favour radicalisation? From outside YouTube, it’s hard to know’, Quartz, 31 December 2019, https://qz.com/1777381/its-hard-to-know-if-youtubes-algorithm-promotes-radicalization/ and Chico Q. Camargo, ‘YouTube’s algorithms might radicalise people – but the real problem is we’ve no idea how they work’, The Conversation, 21 January 2020, https://theconversation.com/youtubes-algorithms-might-radicalise-people-but-the-real-problem-is-weve-no-idea-how-they-work-129955

3. See David Gilbert, ‘Hundreds of QAnon Fans Are Going to Texas to See JFK Return. No, Seriously’, Vice, 2 November 2021, https://www.vice.com/en/article/xgd85a/qanon-dallas-jfk-trump and Tom McKay, ‘Two Weeks Later, QAnon Supporters Are Still Awaiting JFK Jr.’s Return at the Grassy Knoll’, Gizmodo, 16 November 2021, https://gizmodo.com/two-weeks-later-qanon-supporters-are-still-awaiting-jf-1848067480

4. Karen Hao, ‘YouTube is experimenting with ways to make its algorithm even more addictive’, MIT Technology Review, 27 September 2019, https://www.technologyreview.com/2019/09/27/132829/youtube-algorithm-gets-more-addictive/

5. Már Másson Maack, ‘“YouTube recommendations are toxic”, says dev who worked on the algorithm’, The Next Web, 14 June 2019, https://thenextweb.com/news/youtube-recommendations-toxic-algorithm-google-ai

6. See Karen Hao, ‘The Facebook whistleblower says its algorithms are dangerous. Here’s why’, MIT Technology Review, 5 October 2021, https://www.technologyreview.com/2021/10/05/1036519/facebook-whistleblower-frances-haugen-algorithms/

 