Together with everybody else who has any interest in how we live today and how we can expect to live for the next few decades, I have been reading Douglas Coupland’s recent piece in FT Weekend magazine.
The topic is what he calls "Artificial Intuition" - basically the convergence of Big Data with all sorts of loyalty and activity tracking, in which algorithms correlate our data exhaust from many different sources, aggregate it, and use it to document and even predict our behaviour with a high degree of accuracy.
Many people find this scary or otherwise undesirable. Evgeny Morozov is something of a cheerleader for this rejectionist camp, calling the rise of data "the death of politics". The core of this reaction is that political change has always required a grey area where activities that might be illegal go unenforced. This is how homosexuality or racial equality could move from illegal, to tolerated, to embraced and legalised: because places and spaces existed where it was possible to practise illegal behaviours on a limited scale, within a group tolerant of those behaviours.
Coupland’s take is that these Temporary Autonomous Zones are being closed off by the algorithmic approach to everything from shopping, to dating, to politics:
In fact, [Artificial Intuition] is accelerating at an astonishing clip, and it’s the true and definite and undeniable human future.
Later in the same piece:
The amount of internet freedom we have right now is the most we’re ever going to get.
This is probably correct - although I might remove the word "internet". Absent some sort of major crash of online surveillance mechanisms, it seems that we are heading to the point where this data-driven approach becomes irreversible. We are probably past the point where even full-blown societal rejection would work.
As ever, there’s a relevant Gibson quote:
Mona's life has left virtually no trace on the fabric of things, and represents, in Legba's system, the nearest thing to innocence.
From Mona Lisa Overdrive. Published in 1988, people!
My previous company had a deeply unsettling (to me) employee healthcare programme in the USA, where employees who did not participate in company-mandated health & fitness routines - and share data from them! - were penalised through higher healthcare costs.
The black humour of the situation is that when Americans want to criticise European-style nationalised healthcare systems, they usually trot out arguments about smokers or overweight people being denied medical treatments because of their lifestyle choices. But here was a private company, under contract to another private company, enacting and enforcing something far more intrusive.
The usual argument here is that you should be staying healthy for yourself anyway, so this tracking should not matter to you. In fact, you should enjoy the discounts or loyalty points or whatever you get for meeting the targets of the health & fitness programme.
Basically it’s a variation on the "if you have nothing to hide, you have nothing to fear" argument. The problem is that with Big Data, eventually everyone will have something to hide. Had an extra glass of wine with dinner? Drifted a little over the speed limit that one time? Inadvertently short-changed someone? Guess what - that’s tracked now. And what behaviours, routine today, might be criminalised tomorrow? Overconsumption of sugar? Excessive screen time allowed to children?
This is not an idle question. Data don’t have a statute of limitations. Regardless of what you think of them personally, users of the hacked Ashley Madison cheaters’ site often paid specifically to delete their accounts. One reason someone might do this is that they had seen the error of their ways and resolved to be a better spouse - but now they risk being outed together with the biggest adulterers out there.
For once I’m going to quote Julian Assange, someone with whom I have all sorts of issues, but who has become something of a poster child for net privacy:
My version of that is to say, 'well, you're so boring then we shouldn't be talking to you, and neither should anyone else', but philosophically, the real answer is this: Mass surveillance is a mass structural change. When society goes bad, it's going to take you with it, even if you are the blandest person on earth.
There’s a reason we have doors to our houses, and blinds and curtains to our windows - and it’s not so we can commit crimes in comfort, it’s simply so we can live our lives in private. Remember, the Panopticon was a prison. When secrets are outlawed, only outlaws will have secrets.