
The Feudal Internet: Radicalisation in Plain Sight

Image: Leon Bublitz

Politics has undoubtedly become more divided as values have polarised across the spectrum. Looking at the discussion around Brexit and recent elections, it is easy to put sole responsibility on the mainstream media. However, this would be to ignore the main culprit for dividing politics: the growth of social media. The way these sites are structured and how they make money has turned them into platforms where hate and misinformation spread easily. Not only that, but the monopoly held by the main sites (Facebook, YouTube and Twitter) has created a virtual feudal structure.

Each of these sites has transformed into something completely different from what its creators originally intended. Zuckerberg wanted to connect college students, not anti-vaxxers; the creators of YouTube wanted to let friends stay up to date with each other through personal videos, not to be accused of supporting neo-Nazis; Jack Dorsey wanted a personal microblogging platform, not a place for US presidents to rant about their dislike of the news media. The unforeseen evolution of these sites, combined with how they make money, is leading to a more divided political landscape.

The need for these sites to be profitable means they aim to keep users logged in for as long as possible. To do this they tap into users’ personal data to recommend videos and posts similar to those they have previously viewed: watch one cat video and you will be recommended more cat videos. The algorithms these sites use are not designed to recommend content which is factual or which exposes users to a wide spectrum of political views. Instead, they recommend content similar to what the user has “liked” in the past. Watch one video questioning the existence of Covid-19 and you will keep being recommended anti-Covid conspiracy theories. Without knowing it, people are being radicalised by corporate interests with no responsibility or skin in the game.
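To make the logic concrete, here is a toy sketch of such a recommender. Everything in it (the video names, the tags, the similarity measure) is hypothetical and hugely simplified compared with any platform’s actual system; the point is only that the ranking rewards similarity to past viewing and nothing else.

```python
# Toy sketch of an engagement-driven recommender (hypothetical data,
# not any real platform's algorithm). Each video carries topic tags.

def similarity(video_tags, history_tags):
    """Fraction of a video's tags that overlap with the user's history."""
    return len(video_tags & history_tags) / len(video_tags)

def recommend(catalogue, watch_history, n=3):
    """Rank unwatched videos purely by similarity to past views."""
    history_tags = set()
    for video in watch_history:
        history_tags |= catalogue[video]
    candidates = [v for v in catalogue if v not in watch_history]
    # Note what is missing: no term for factual accuracy, no term for
    # viewpoint diversity. The only signal is "more of the same".
    return sorted(candidates,
                  key=lambda v: similarity(catalogue[v], history_tags),
                  reverse=True)[:n]

catalogue = {
    "cute-cats-1":        {"cats", "pets"},
    "cute-cats-2":        {"cats", "pets"},
    "covid-hoax-claims":  {"covid", "conspiracy"},
    "vaccine-microchips": {"covid", "conspiracy"},
    "bbc-covid-report":   {"covid", "news"},
}

# A user who has watched one conspiracy video is shown another first.
print(recommend(catalogue, ["covid-hoax-claims"]))
```

A real recommender is vastly more sophisticated, but the essential feature survives the simplification: the objective optimises engagement, so a single conspiracy video pushes more of the same to the top of the feed.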

Indeed, awareness of the harmful repercussions of spreading false or radical ideas on these platforms has grown, and the sites have started to respond. Famous far-right commentators are being banned from Twitter, Facebook and YouTube. Even Donald Trump has been banned from Twitter after inciting a failed coup. The most notorious is Alex Jones, whose audience has been told that the Sandy Hook shooting was staged and that chemicals in the water are turning frogs gay. While these bans may seem like a good way to help close the widening political gap, I suspect they will have the opposite effect.

Following these bans, Alex Jones and other figures associated with the “intellectual dark web” have abandoned mainstream social media sites. The effect is to drive more people towards alternative sites that have popped up to provide a “safe space” for these conspiracists. Parler is currently the default alternative to which conservative commentators are turning. Although it may seem positive to many that these conspiracists are leaving the popular channels, pushing these ideas out of the mainstream may also encourage extremism, as in the case of the anti-Semitic attack on the Poway synagogue in America in 2019, or any number of radicalised terrorist actions in Europe.

The Poway shooter announced his intentions to commit a mass shooting on the image-based social media site 8chan. The site operates with next to no influence from its creator, Fredrick Brennan, who set it up in 2013, inspired by his perceived lack of free speech on mainstream social networks. Brennan created a site with virtually no oversight from its webmasters. The result of this lack of moderation is a site which acts as a haven for white supremacists, incels and even paedophiles. And while Parler is unlikely to be as unmoderated as 8chan, the setting up of a reactionary social media site raises concerns about what sort of content will be posted.

This reactionary element can already be seen in Parler’s user base, which runs from Katie Hopkins to Candace Owens. The Independent reported on Parler in 2019 and found that, upon signing up, recommended hashtags included #covidiots, #georgesoros and #2NDCOMING. By providing a place for people who have been de-platformed, these sites attract those who are already radical or who are in the process of being radicalised.

There’s no easy way to prevent this radicalisation from taking place. As long as mainstream sites aim to keep users logged in for as long as possible, the radicalisation of their users will only continue. Once radicalised, users will turn to Parler and other similar sites to find out the “truth”. The issue is that there is no real way for Facebook and Twitter to deal with conspiracy theorists without suppressing their ideas, and Facebook still fails to acknowledge that the way it operates contributes to the spread of misinformation.

It’s also important to remember that radicalisation does not happen quickly; it is often a slow process which gradually takes over a person’s life. The attempted coup on the US Capitol is an example of the dangers of radicalisation. My previous examples of people being radicalised, like the Poway synagogue shooter, were lone wolves. What we saw with the attack on the Capitol was the mobilisation of far-right Trump supporters aiming to overturn the result of a democratic election. The danger of social media misinformation has reached an all-time high, with violence as the result. The removal of Trump from Twitter and Facebook will simply reinforce the beliefs of those involved in the Capitol Hill riot and those who support them. They already think that big tech is against them, and now they have even more “evidence”. This sort of violence is already spreading to the UK, with anti-lockdown rallies and attacks on BLM protestors and police.

There are steps that can be taken to prevent people from being radicalised. One is for tech platforms to shift their goal from running a site which makes a profit to running one which helps people understand one another. Inevitably, this is hard for them to enforce, because by banning extremists they play into the narrative of right-wing extremist groups. Therefore, users must be at the forefront of tackling extremist ideas. We as users have a responsibility to prevent the rise of fake news. If we see something that is clearly fake news, we should report it, and we should try to engage with people who believe it without talking down to them. Otherwise, we will continue to see people leaving Facebook and Twitter for Parler, creating a more divided nation and preventing open dialogue.

Edited by Ellie Muir, Essays Editor

