On Tuesday last week, Theresa May triggered six weeks of intense campaigning, which will lead to a snap election on 8 June. Whether you’ve made your mind up or are still on the fence, how confident are you that your local politicians aren’t being chosen by a billionaire in Palo Alto, USA?
As I stepped into the polling station on 7 May 2015, I was confident I’d made the right decision. Months of watching political debates and chatting with friends had helped me form a fair and well-balanced viewpoint.
However, I noticed something bizarre in the months that followed. I have a politically diverse bunch of friends, but when I logged into Facebook after the election I saw nothing but Tory hate. That didn’t seem to represent the views of those around me, and certainly not the views of a nation that had surprisingly and decisively favoured the Conservatives two years ago.
Something didn’t add up. So I did what any normal person would do: I counted the political status updates over a period of time and found 37 anti-Tory statuses and zero pro-Tory ones.
Interesting. Does that small sample of 37 posts suggest that 100 per cent of my Facebook friends are anti-Tory? And therefore does my humble friend list of 350 suggest that all 35 million Facebook users are too?
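For what it’s worth, the maths backs up that scepticism. Here’s a quick back-of-the-envelope check (a minimal sketch using the statistical “rule of three”, not any rigorous survey method) of how much uncertainty hides inside a unanimous sample of 37 posts:

```python
# Rule of three: if an event is observed 0 times in n independent trials,
# an approximate 95% upper confidence bound on its true rate is 3/n.
# So 0 pro-Tory posts out of 37 does NOT mean the underlying feed is
# 100 per cent anti-Tory.
n = 37
upper_bound = 3 / n  # plausible pro-Tory share despite observing none
print(f"Pro-Tory posts could still make up to {upper_bound:.0%} of the feed")
```

In other words, even with zero pro-Tory posts observed, roughly one post in twelve across the wider feed could still plausibly be pro-Tory.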
Obviously not. So the more pressing questions are: “Is Facebook selecting which of my friends’ political opinions it wants me to see?” and: “Is Mark Zuckerberg pushing some political ideology in his quest for world domination?”
Our journey for truth begins with a tale of conspiracy, in the depths of Facebook’s most controversial feature — the news feed.
The news feed conspiracy
The Facebook news feed is a fickle friend. Sucking you in, drawing you away from real life to feast on the humble-brags and highly edited lives of someone you once met in 2009.
Very few people are aware of the mechanics behind it. Facebook is addictive for a reason. It wants you coming back time after time, so it shows you more of what you like, and less of what you don’t. Under the bonnet of your news feed are a number of algorithms that follow your every move, learn more about you, and tailor what you see.
If you like a status, you’ll hear more regularly from that person. If you view someone’s photos, it will show you more. Its intuition goes further — it gives preferential treatment to certain words in posts and comments (for example a comment containing “congratulations” appears on your feed more often, because it signals a major life event). The formulas behind the algorithm are determined by thousands of factors and updated several times a day. This has a huge impact on the information you absorb.
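To make that concrete, here’s a toy sketch of how an engagement-based ranker might order posts. The weights, signals and names here are entirely invented for illustration; Facebook’s real formula uses thousands of factors and is not public.

```python
# Toy feed ranker (illustrative only -- not Facebook's real algorithm).
# Score each post by how much you interact with its author, with a bonus
# for life-event keywords like "congratulations".

def score(post, affinity):
    s = affinity[post["author"]]              # your affinity for the author
    if "congratulations" in post["text"].lower():
        s += 2.0                              # life-event keyword boost
    return s

affinity = {"alice": 3.0, "bob": 0.5}         # learned from your past clicks
posts = [
    {"author": "bob", "text": "Congratulations on the new job!"},
    {"author": "alice", "text": "Lunch was great"},
    {"author": "bob", "text": "Nice weather today"},
]

# Rank the feed, highest score first, as a real feed would before
# truncating to the posts it actually shows you.
feed = sorted(posts, key=lambda p: score(p, affinity), reverse=True)
```

Even with weights this crude, the people you already interact with float to the top and crowd out everyone else, which is exactly how a feed narrows over time.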
The average Facebook user has access to 1,500 posts per day, but only sees 300 of them. What’s included in the 80 per cent of posts you don’t see? Your news feed won’t tell you. In fact, 62 per cent of people don’t even know their news feed is being filtered.
Eli Pariser, internet activist and founder of Upworthy, coined the term “filter bubble” to describe the way we consume information online (watch his brilliant TED talk here).
In pursuit of tailoring our digital experiences, internet services and their algorithms filter out the content we won’t be interested in. “If algorithms are going to curate the world for us,” Pariser said, “then … we need to make sure that they also show us things that are uncomfortable or challenging or important.”
So intentionally or not, the internet is narrowing our view of the world, not widening it, and the news feed is churning out posts that tell us what we already like and agree with.
But what does that mean for politics?
Does Facebook influence the way you vote?
If Facebook’s algorithm is intentionally showing you content you already agree with, are you seeing a wide enough range of information to make an informed decision?
It’s unlikely that posts on Facebook have the power to convert someone from the Green Party to UKIP. However, with one study reporting that 50 per cent of voters in the 2015 election were still undecided as late as March, there’s a significant risk of someone or something intentionally curating the political content we see online.
In 2015, Facebook published a study aiming to show that its algorithms don’t sway political outcomes. Its researchers chose a sample of 10 million users in the US who labelled themselves as Democrat or Republican and tracked their pre-election activity. First they left the algorithm running and counted the number of posts in an individual’s news feed that contained opposing political views. Then they turned the algorithm off and counted again.
The results are surprisingly insignificant – you’ll see only six per cent fewer opposing political views with the algorithm active. That’s just the odd post here and there, and it certainly doesn’t suggest that a significant number are being suppressed.
So, what’s the cause of the Tory hate party on my news feed?
Facebook blames me for living a terribly sheltered life and not making friends beyond my socio-demographic group: the same US study found that only 23 per cent of people in your friends list have completely opposing political views to you.
All that said, something doesn’t feel right about believing the conclusions Facebook gave from the Facebook-initiated study undertaken by internal Facebook researchers using data only Facebook has access to.
Can statistics from social media during the 2015 elections help us get to the bottom of this?
Social media and the 2015 election
The 2015 election was predicted to be the first election that really harnessed the power of campaigning on social media. It didn’t fulfil this prophecy, but a few helpful stats shed some light and ultimately lend credibility to Facebook’s study:
1. Only 9 per cent of Facebook users openly state their political allegiance. The other 91 per cent of my ‘friends’ may be Tories who just don’t feel the need to post their political opinions online. That’s backed up by stat number two.
2. Labour had more than two and a half times as many Facebook shares as the Conservatives (539,802 to 201,535). Labour also beat the Conservatives on YouTube, Twitter and Instagram engagement. Perhaps Labour supporters are more likely to a) be on social media and b) share their political opinion on social media. Hence the unanimous Tory hate on my news feed.
I guess my conclusion is that Facebook isn’t whitewashing any political opinions or forcing ideologies on you. Instead, the unanimous Tory hate is probably the result of my not spreading my friendship net wide enough, or of having friends who don’t feel comfortable voicing their political opinions.
Although I’d initially hoped to uncover the greatest scandal in modern politics, along the way I discovered that the political opinions and messages we consume online are dangerously unrepresentative of the political thoughts and movements of a nation. Now I’m off to send friend requests to some right-wing voters and hippies.