When you are waiting at a red traffic light at a junction, a little algorithm – a formula or set of steps for solving a particular problem – is silently judging you. It’s working out how long you and other motorists have been waiting at the various sets of lights at the junction, how many pedestrians have pushed a button at a crossing, which road is the most important one, what time of day it is, how quickly you need to pee and so on.
Based on these variables and what the algorithm makes of them all, you’ll be waiting either a short time or a long one before you can stop cursing, uncross your legs, release the handbrake and move on (assuming, of course, that you’re the sort of person who uses the handbrake. There is a very good reason for using a handbrake whilst waiting at the lights, but that’s another, and perhaps rather boring, blog post).
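For the geeks among you, here’s roughly the sort of sum the junction might be doing. Every variable and weight below is invented for illustration – real traffic controllers are rather more sophisticated, and sadly ignore how badly you need to pee.

```python
def green_light_seconds(cars_queuing, pedestrians_waiting, road_priority, rush_hour):
    """Work out how long one arm of the junction gets a green light."""
    seconds = 20.0                       # a default slice of green time
    seconds += 2 * cars_queuing          # longer queues earn more green
    if rush_hour:
        seconds += 5                     # stretch the cycle at peak times
    seconds *= road_priority             # main roads outrank side streets
    seconds -= 3 * pedestrians_waiting   # hand over to the crossing sooner
    return max(seconds, 10.0)            # but never less than a minimum green

print(green_light_seconds(cars_queuing=8, pedestrians_waiting=2,
                          road_priority=1.5, rush_hour=True))   # 55.5
```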
Algorithms are in the news a lot at the moment, partly because a clever chap called Eli Pariser has written a book called The Filter Bubble about them. Annoyingly, this is a particular interest of mine, and he’s beaten me to writing a tome about it – but in my defence I’m a new dad and finding the time to write a blog post is very tricky, let alone attempting a book. For similar dad-related reasons, I haven’t got round to reading The Filter Bubble, but from what I can gather from reviews and an interesting TED talk he gave recently, Pariser’s focus is on how algorithms are used online.
More on that in a moment, but first it’s worth pointing out that algorithms are nothing new – they’ve been around for donkey’s years, and are as much an offline phenomenon as an online one. Healthcare professionals use detailed algorithms to examine symptoms and establish courses of treatment; call centres use them to evaluate your response to certain questions and ascertain what crap to sell you. If you’ve got a Volvo, it will probably tell you off for starting the engine without putting your seatbelt on, and if you get in an elevator, it will hopefully take you to a floor which corresponds to the button you press. In their simplest form, algorithms are little flowcharts which ‘process’ a situation – or you. If yes, do this; if no, do that.
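In code, that simplest form really is just a branch. Here’s the Volvo’s seatbelt nag written as a two-line flowchart – the details are my guess, not Volvo’s:

```python
def seatbelt_check(engine_running, seatbelt_fastened):
    if engine_running and not seatbelt_fastened:
        return "bong bong bong"   # if yes, do this: tell the driver off
    return "carry on"             # if no, do that: leave them in peace
```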
All the above examples seem rather mundane – and unless you’re particularly into the electronics in a Volvo, they are. But lately, algorithms have taken on a new importance. As with most things, the internet has sort of ‘turbo-charged’ them: it’s made them (a) more sophisticated and (b) far more prevalent, to the point where it’s virtually impossible to do anything online without encountering an algorithm doing its very best to push you towards a particular course of action.
You are probably only reading this blog post because Facebook or Google used an algorithm to process you – or your search query – in a very specific way and decided that this article was for you.
If you’re reading this in the south of France, you may well be there because when you perused the Ryanair site, it did some sums and thought that offering you a so-called free flight to Nice was a good idea.
If you’re staying in a four-star hotel in Nice you may be there because when you searched for hotels in the area, an online advertising algorithm pointed you in the direction of a cheap deal on four-star hotels in France. (Personally, if I were in a four-star hotel by the Mediterranean, I wouldn’t be reading a blog post about algorithms, I’d be doing something more interesting, but there you go.)
Algorithms helping you get cheap flights seems pretty harmless; a good thing, right? Perhaps, depending on what you make of global warming. However, online algorithms are not just benign little bits of code that help you find stuff you like; they are often rather more sneaky than that.
If you use Gmail to read your emails, Google’s algorithms are reading them too, and displaying adverts to you based on the things – however sensitive or confidential – you are discussing in the mail.
If on Facebook you casually mention that you are a bloke and list yourself as ‘single’ (yes, I know, as if anyone ever does that casually), you will see a plethora of attractive big-breasted ladies beside your news feed, all enticing you to visit their dating website, where of course all the ladies are as attractive and big-breasted as the girls in the ads. Not that I would know.
If you visit an insurance website, a series of algorithms will track your every click and change the content of pages in real time to ensure that you only see the policies you are most likely to buy.
If you search for a product on eBay, an algorithm will take note of this, put a ‘cookie’ on your computer (without asking you), and you will see a shedload of adverts for that product when you visit other, completely different websites. This is perhaps why, when I turn on my computer after my partner has been on eBay, I see countless adverts for Cath Kidston products wherever I go online, and if I’ve been using her laptop, all she will see is guitars.
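For the curious, here’s a rough sketch of the mechanics. The cookie name and plumbing are invented, and I’ve simplified things – in reality it’s usually an ad network’s third-party cookie doing the remembering across sites – but the principle is the same: the shop writes down what you looked at, and an ad slot on an unrelated site reads it back.

```python
from http import cookies

def remember_product_view(response_headers, product):
    """The shop drops a cookie recording what you browsed – without asking."""
    jar = cookies.SimpleCookie()
    jar["recently_viewed"] = product                       # e.g. "guitars"
    jar["recently_viewed"]["max-age"] = 60 * 60 * 24 * 30  # lingers for a month
    response_headers.append(("Set-Cookie", jar["recently_viewed"].OutputString()))

def pick_advert(cookie_header):
    """An ad slot on a completely different site reads the cookie back."""
    jar = cookies.SimpleCookie(cookie_header)
    if "recently_viewed" in jar:
        return "Advert for: " + jar["recently_viewed"].value
    return "A generic advert"

headers = []
remember_product_view(headers, "guitars")
print(pick_advert(headers[0][1]))   # Advert for: guitars
```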
This is all about personalisation: very big, powerful companies filtering content and showing you stuff based on who they think you are. (And to be fair, they’ve got a pretty good idea. Every time you clicked that little ‘like’ button on Facebook, you told it you were into Ann Summers products, Tom Jones and Pizza Hut. Hence the constant ads for sexy Welsh pizza.)
It’s not that these companies particularly want to make the web experience better for you – although sometimes this is a side-effect – they really just want to sell you something. But either way, personalisation algorithms are now being employed on an industrial scale, to the point where to use the internet is to be pushed hard, and in a sophisticated way, in a certain direction. And the interesting – perhaps disturbing – thing is the effect this is having on our worldview and behaviour.
Let’s take a look at worldview: it will come as no surprise to anyone who reads my blog, or has the misfortune to be subjected to my Facebook status updates, that I’m an outspoken pinko-lefty-liberal type. But I’m a tolerant guy, and I have some conservative friends. However, I’m unlikely to ‘like’ their status updates about so-called benefit scroungers or click on links they post to Daily Mail articles.
Equally, my conservative friends probably won’t be too keen on my rude status updates about David Cameron or the links to Guardian editorials that I post. However, I am quite likely to click on other people’s left-leaning posts, and my Tory cousins will no doubt hit ‘like’ every time somebody whinges about a mythical gold-plated public sector pension, calls for the return of the death penalty or wants to privatise the NHS.
These kinds of social interactions have consequences for Facebook users, because the network uses an algorithm called ‘EdgeRank’ to determine what to display in users’ news feeds. Without going into too much detail, it takes three variables – ‘affinity’, ‘weight’ and ‘time’ – to make a call on which pieces of content are relevant to each Facebook user. Given the examples highlighted above, it will conclude that ‘right-wing’ posts are less relevant to me, and that ‘left-wing’ posts are less relevant to my conservative chums. And it will edit them out of our respective news feeds.

This is truly a shame, as it means I can’t wind up my conservative friends any more. Rather more importantly, a valuable exchange of ideas is no longer taking place. Despite all the sharing of information and views that Facebook was meant to bring, every time I use it, a piece of maths is effectively hiding content from me. Not just me: 500 million or so Facebook users who are looking to it for information 20 times a day, 365 days a year. And the overwhelming majority of these have no idea at all that Facebook is taking such an active role in deciding what they should see. I’m no social scientist, but I’m sure this kind of content filtering, applied on such a huge scale, cannot but have a significant impact on how people see the world.
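To make the idea concrete, here’s a toy, back-of-an-envelope version of an EdgeRank-style score. Facebook has never published the actual formula, so the weights and the decay function below are mine, not theirs – only the three variables come from what’s publicly known.

```python
import time

# Hypothetical weights for different interaction types.
EDGE_WEIGHTS = {"comment": 3.0, "like": 1.0, "click": 0.5}

def edge_rank(story_edges, now=None):
    """Score a story by summing affinity x weight x time-decay over every
    interaction ('edge') the story has attracted."""
    now = now or time.time()
    score = 0.0
    for edge in story_edges:
        affinity = edge["affinity"]            # how much I interact with the poster
        weight = EDGE_WEIGHTS[edge["type"]]    # a comment counts for more than a like
        age_days = (now - edge["created"]) / 86400
        decay = 1.0 / (1.0 + age_days)         # older interactions count for less
        score += affinity * weight * decay
    return score

now = time.time()
story = [
    {"affinity": 0.9, "type": "comment", "created": now - 3600},    # close friend, recent
    {"affinity": 0.1, "type": "like", "created": now - 5 * 86400},  # someone I ignore, old
]
print(edge_rank(story, now))
```

A Guardian link I post accumulates next to no affinity with my Tory cousins, so its score withers and it quietly vanishes from their feeds – and the Daily Mail links they post vanish from mine.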
This algorithmic, personalised filtering is not restricted to social media news feeds. It’s now crept into search results. Until fairly recently, you could be reasonably confident that if both you and your friend searched for Russian brides on Google, and you both lived in the UK, you’d get exactly the same results. However, about a year and a half ago I started noticing – not, I must stress, as a result of searching for Russian brides – that when I searched for the same thing on Google, but in different contexts, the results were very different. By different contexts I mean searching for the same thing
- on more than one computer
- when I was logged into my Google account, and when I was not
- after clicking a particular search result
- in a different geographical location.
This was a bit of a headache, as at the time I was doing a bit of freelance work involving search engine optimisation for a music site, and I kept getting multiple sets of results for the exact same keywords. It turns out that Google had started doing the same thing as Facebook – looking at a whole load of variables relating to me and making assumptions as to what floated my boat, rather than giving me an impartial set of links. In his TED talk, Pariser highlights this filtering extremely effectively, describing an experiment in which he asked a few of his friends to google ‘Egypt’ and send him a screenshot of the results provided by Google. The screenshots all varied enormously – Google had personalised the search results to the nth degree for each of his friends.
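If you want a feel for how that might work, here’s an entirely hypothetical sketch of a search engine re-ranking the same results for two different users. The signals and weights are invented – Google doesn’t publish how its personalisation works – but it shows how one query can yield two different orderings.

```python
def personalised_rank(results, user):
    """Re-order a base result list using per-user signals."""
    def boost(result):
        score = result["base_score"]
        if user.get("location") in result["regions"]:
            score *= 1.5                               # favour 'local' pages
        if result["domain"] in user.get("clicked_before", set()):
            score *= 1.3                               # favour previously clicked sites
        return score
    return sorted(results, key=boost, reverse=True)

results = [
    {"base_score": 1.0, "domain": "hotels.example", "regions": {"FR"}},
    {"base_score": 0.9, "domain": "news.example", "regions": {"UK"}},
]
me = {"location": "UK", "clicked_before": {"news.example"}}
you = {"location": "FR"}
print([r["domain"] for r in personalised_rank(results, me)])   # news first
print([r["domain"] for r in personalised_rank(results, you)])  # hotels first
```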
It’s worth noting, however, that personalisation isn’t restricted entirely to online algorithms written by big powerful corporations; in a sense, we also write our own. Here’s an example of how. These days I mainly read the news on a smartphone. I’m going to come across as very bien-pensant here, and perhaps a bit of a knob, but my two news sources of choice are the BBC and The Guardian – and I ‘consume’ (eughh) news via two apps that I’ve downloaded for my phone. Both these apps let me select exactly what content I want to appear when I open them. So, when I’m reading the news, I’m presented with content to do with politics, comment, technology, music and whatnot – and generally speaking not much fashion, showbiz and sport. But when I used to buy a newspaper, I would read it from start to finish, meaning I was invariably exposed to – and would read – a much wider range of stories. News apps may not use any surreptitious personalisation filters, but they subtly encourage users to apply their own. The upshot is arguably a narrower view of what’s going on, despite there generally being more content available to browse.
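The filter I’ve built for myself amounts to something as blunt as this (field names invented for illustration):

```python
# Tick some topics once when you set the app up, and everything else
# silently disappears from your front page.
MY_TOPICS = {"politics", "comment", "technology", "music"}

def my_front_page(articles):
    """Keep only the stories matching the topics I ticked."""
    return [a for a in articles if a["section"] in MY_TOPICS]

articles = [
    {"headline": "Pension reform row deepens", "section": "politics"},
    {"headline": "Fashion week highlights", "section": "fashion"},   # never seen
    {"headline": "New guitar effects pedals", "section": "music"},
]
print([a["headline"] for a in my_front_page(articles)])
```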
So should we be worried about all this filtering that’s going on? Yes. Because it means that the internet is changing in a profound way. Traditionally the web has been (justifiably) viewed as a tool that
- widens access to information
- provides an ‘impartial’ way of sifting through information
- increases transparency.
But now, the two major prisms through which people see the online world – arguably Facebook and Google – are throwing the above notions out the window. Facebook is actively restricting what people see in news feeds, based on perceived taste. Google’s results are no longer impartial – they’re personalised. And neither service has been at all transparent about how this filtering is, or has been, applied, or about how to switch personalisation off.
And that’s just Facebook and Google – a multitude of sites are going down the personalisation route. It’s the Next Big Thing on the web. And when it gets to the point that every site you visit is running an algorithm that shows you ‘relevant’ information only after it has checked your IP address, cross-referenced its content with what you searched for on Google recently, examined your Facebook likes, scanned your computer for cookies and checked out that Russian brides website you were perusing the other day, the internet can no longer be considered a 'source' of information. It will be a gatekeeper far, far more powerful than Rupert Murdoch, and one that you can’t haul before parliament – or throw a foam pie at.