Google knows me so well I’m wondering if I’m a robot

It was Dave Dee, Dozy, Beaky, Mick and Tich who made me start to seriously question whether I have free will.

To explain, let me add a little context. I was recently watching Sounds of the Sixties on BBC Four (because that's how I like to party) when a song came on by the aforementioned quintet. I didn't recognise it, and the presenter seemed to garble the name slightly: was it called "Sabadat" or something? So after listening for a minute or so, I reached for my phone and started Googling to identify the song. I got as far as typing "dave d" before the suggestion popped up: "dave dee dozy beaky mick and tich zabadak".

Should I have been surprised? Perhaps we've come to take Google's amazing predictive capabilities for granted. But the spooky thing was that no other titles were suggested. I certainly hadn't searched for this track before, so how had Google managed to pick out precisely what I was looking for?

I came up with a hypothesis. Earlier, the show had played "Pictures of Matchstick Men", and I had idly Googled it, because no matter how many times I hear that song, I can't bring myself to believe it's by the same Status Quo who wrote "Whatever You Want". This must have somehow clued Google in to which show I was watching, allowing it to correctly guess what I might search for next. Perhaps, from previous airings of the show, Google has learnt that people who search for one specific song by Status Quo are likely to go on and search for a particular Dave Dee composition about ten minutes later. In this age of machine learning, I didn't dare rule out the idea.

What does this have to do with free will? Well, as online services become increasingly eerie in their predictive abilities, I'm reminded of an insight that one of my university supervisors shared with me: "You might sincerely believe that you freely make your own decisions," he pointed out. "But if, at the start of each day, I were to present you with a sealed envelope saying exactly what you would do on that day, and if every evening you opened it and found it correct, you might well question whether you really did have free will, or whether it merely appeared that way to you." So it is with Google. I think of myself as an autonomous being, but as soon as I typed "dave d", the search engine knew that I would search for "Zabadak", rather than the more popular "The Legend of Xanadu" or other matches, such as The Kinks' Dave Davies.

Does it matter?

You might take the utilitarian view that better predictive technologies are a good thing, because they mean less effort. Perhaps in the future, when an unfamiliar song comes on, I'll glance at my watch and see the answer to my unasked question already showing onscreen. It isn't a huge leap from what Google Now already does with travel news and football scores.

But there is a worrisome aspect to it, and I don't just mean the philosophical questions it raises. Talk of predictive technologies calls to mind Philip K Dick's short story The Minority Report, best known in its 2002 film adaptation. The title has become a byword for futuristic computer interfaces, but the real theme of the story is "pre-crime": the idea of identifying and apprehending would-be criminals before they have a chance to act. Google can't yet predict crimes of passion and violence, but in a world threatened by terrorism and espionage, it can certainly spot suggestive connections and activities. At what point does such data cross the line from circumstantial evidence to actionable intelligence?

Perhaps that idea doesn't worry you. I'm sure you're not a terrorist, after all. But that doesn't mean you won't be affected, because behavioural prediction doesn't have to be passive. If it can be established that people who do x normally go on to do y, that's hugely exploitable information. You may recall the government's controversial "nudge unit", set up in 2010 with the aim of improving social behaviours through subtle measures such as changing the language of official letters and forms. One reason it was so controversial was that many of its ideas worked: people really did settle their tax bills more quickly after reading that most recipients pay on time.

Now imagine if the likes of Facebook and Google started using such techniques to guide you towards specific outcomes -- perhaps the ones paid for by advertisers. It's a bit like the notorious "filter bubble" syndrome, where you end up seeing only news you like from people you agree with. But it's worse, because it's driven by companies whose interests are in no way aligned with yours, and if it's done properly, you shouldn't have any idea you're being nudged.

Truthfully, I suspect there's a limit to how far this idea can be taken. The world is a chaotic place, and even if I don't have free will, my behaviour is driven by a huge number of unpredictable inputs, not just the ones you lay in my path. And I'm fine with that: I'd rather be a mere robot than waste my precious autonomy clicking on links I don't care about, just to make another quarter of a cent for some Silicon Valley billionaire in a tie-dye T-shirt.

As to the way forward, I honestly don't know. While writing this column, I found myself wondering whether "The Legend of Xanadu" might contain a suitable line to close on: something about escaping the modern world, perhaps. So I opened my browser to search... and as soon as I pressed "x", the suggestion "Xanadu lyrics" popped up. If I'm worried about being predictable, I may already have missed the boat.


This article first appeared in PC Pro

Darien Graham-Smith

Darien began his IT career in the 1990s as a systems engineer, later becoming an IT project manager. His formative experiences included upgrading a major multinational from token-ring networking to Ethernet, and migrating a travelling sales force from Windows 3.1 to Windows 95.

He subsequently spent some years acting as a one-man IT department for a small publishing company, before moving into journalism himself. He is now a regular contributor to IT Pro, specialising in networking and security, and serves as associate editor of PC Pro magazine with particular responsibility for business reviews and features.

You can email Darien at darien@pcpro.co.uk, or follow him on Twitter at @dariengs.