A couple of years ago, we first learned how big data could influence politics. The way in which we can be influenced by social media is fairly scary…
What if the team supporting a political campaign had information about the opinions, preferences and voting intentions of every individual in a country, and could tailor their campaigning precisely to each voter?
They have it already.
The Vice News article “The Data That Turned the World Upside Down” is the second scariest thing I’ve seen for ages.
It analyses how the harmless-sounding British company Cambridge Analytica uses information gathered from social media – all your “likes”; which shows you watch on TV; every quiz you ever did on Facebook; what you click on; what you buy; what you drive – in fact the whole of so-called “big data” – to build up a picture of you more detailed than anything George Orwell could have imagined.
If the Thought Police in 1984 had had big data, they wouldn’t have needed Room 101. They would have known everything already.
As others have observed, Orwell got total surveillance right. What he didn’t anticipate was people voluntarily putting online all the information about themselves that a potential authoritarian state could ever need.
The scariest thing I’ve seen in years is this video featuring “Alexander Nix” (is that his real name?), described as CEO of Cambridge Analytica, explaining how his techniques work. It takes eleven minutes to watch but is worth the investment.
Even the background music is spookily appealing – and appropriate.
It’s a pity the Eton-educated Nix has such a perfect English accent. This will reinforce the Hollywood-fuelled and atavistic convictions of Americans, Iranians, Turks and countless other conspiracy-oriented peoples that the British are inherently evil and are cunningly manipulating the entire world order with their Machiavellian tricks.
If only we were that smart.
Did I say Machiavelli? If he’d had access to big data, the city state of Florence might be ruling the world as we speak.
Nix talks about building a model with 4-5,000 data points for every person in the United States of America; using this to predict their preferences; then using that information to target a tailored political campaign to influence voters.
This is both cost-efficient and effective.
Thus, you no longer put an expensive billboard by a highway to be seen by random motorists, or pay for a costly TV ad watched by millions of people who aren’t interested in your product. Instead, you analyse big data to build up a precise picture of a consumer, or voter; and deliver a message, precisely crafted to influence that individual, direct to their cerebral cortex via social media, a mail shot, or someone knocking on the door.
Not science fiction: it’s all in the video above.
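The pipeline Nix describes – profile each individual on many data points, then deliver the message most likely to move them – can be sketched in a few lines. To be clear, everything below (the trait names, the weights, the messages) is invented purely for illustration; this is a toy, not Cambridge Analytica’s actual model, which reportedly uses thousands of data points per person rather than three.

```python
# Toy sketch of psychographic micro-targeting.
# All traits, weights and messages here are invented for illustration.

# Each voter is a profile of trait scores – a tiny stand-in for the
# "4-5,000 data points" per person mentioned in the video.
voters = {
    "voter_a": {"anxiety": 0.9, "openness": 0.2, "thrift": 0.7},
    "voter_b": {"anxiety": 0.1, "openness": 0.8, "thrift": 0.3},
}

# Each message variant "resonates" with a different mix of traits.
messages = {
    "fear_of_crime_ad":     {"anxiety": 1.0, "openness": -0.5, "thrift": 0.0},
    "optimistic_change_ad": {"anxiety": -0.5, "openness": 1.0, "thrift": 0.0},
    "tax_cut_ad":           {"anxiety": 0.0, "openness": 0.0, "thrift": 1.0},
}

def best_message(profile: dict) -> str:
    """Pick the message whose trait weights best match this voter."""
    def score(weights):
        return sum(profile.get(t, 0.0) * w for t, w in weights.items())
    return max(messages, key=lambda m: score(messages[m]))

for name, profile in voters.items():
    print(name, "->", best_message(profile))
# voter_a, the anxious and thrifty profile, gets the fear-based ad;
# voter_b, the open profile, gets the optimistic one.
```

The unsettling part is not the matching itself – advertisers have always segmented audiences – but the resolution: one profile, and one tailored message, per individual rather than per demographic.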
I don’t think there is anything inherently wrong with this. People have been trying to manipulate other people for years – indeed, the video reminds me of the classic 1952 science fiction short story The Snowball Effect by Katherine MacLean. In the story, which begins: “All right,” I said, “what is sociology good for?”, a professor attempts to show that he can make an organisation – the Watashaw Sewing Circle – more efficient by giving it a set of rules designed to make it flourish. The results are surprising. The whole story is at the link.
The idea that faceless corporations or political scientists know all about you and can simply programme you to do their will is also reminiscent of Frederik Pohl’s 1955 story The Tunnel under the World, available in full at my review of The Simpsons (9/10).
I reckon the article and video above tell us eight things:
(i) big data is here, and works. More information about every one of us is available right now, more readily accessible and easier to manipulate than ever before. The article and video simply explain it;
(ii) used effectively, that information makes us easier to target individually, and thus to manipulate, than we had thought possible. I disagree with those who say they’re immune to such influence. We are all susceptible to messages delivered the right way;
(iii) this may not be bad: it enables information about cheap green energy or noble charities to be sent to the people who want those services. Charities or churches are as able to access this info as governments and corporations (they may be less able to afford it, but prices will fall);
(iv) these technologies are in their infancy. They will become more powerful as, for example, facial recognition software becomes all-pervasive in airports, train stations and streets; or as people put health information – pulse, blood sugar, diet etc. – online via wearable devices. Thus: “need an isotonic drink? There’s a shop around the corner”; “you appear to have left our premises carrying goods for which you have not paid; kindly contact our security personnel”; “you are having breathing difficulties: we are increasing your health insurance premiums with immediate effect”;
(v) the organisations which use this technology best will accumulate power. The article above argues that these techniques were used by both the Brexit and Trump campaigns. Cambridge Analytica’s own video boasts that they helped Trump win (NB I suspect the company will be getting a lot of new customers, and competitors, around now);
(vi) you should be aware of these facts when you use social media or any other kind of interactive media. Remember: “if you don’t pay for something, you’re not the consumer, you’re the product”;
(vii) you may want to modify the way you use your phone, wearable technology or social media to take account of these trends. Or you may not care. See, for example, my recent piece on 11 ways you can help yourself to smartphone detox;
(viii) this technology is here to stay; we’d better start getting used to it. The only thing I can see which could change this would be if another technological innovation came along so big that it threw the rest of technological progress into reverse – as happened when Coronatime was invented.
Better start getting ready, folks.
P.S. a thoughtful interlocutor has contacted me to say that attempts to use data to predict human behaviour are nothing new; people are inherently unpredictable; and we shouldn’t be worrying about all this stuff. I’m not so sure. I think it’s about probabilities: big data could make it possible to influence more people, more of the time, with sufficiently greater accuracy and at sufficiently lower cost than before to amount to a fundamental shift. What do you think?