
Welcome to the metadata society — and beware

Adrian Lobe/Gulf News
What is threatening about this algorithmic regulation is not only the subtlety of control that takes place somewhere in the opaque machine rooms of private corporations, but the possibility that a techno-authoritarian political mode could be installed, in which the masses would be a mere politico-physical quantity
Every day, Google processes 3.5 billion search queries. Users google everything: Resumes, diseases, sexual preferences, criminal plans. And in doing so, they reveal a lot about themselves; more so, probably, than they would like.
From the aggregated data, conclusions can be drawn in real time about the emotional balance of society. What’s the general mood like? How’s the buying mood? Which product is in demand in which region at this second? Where is credit often sought? Search queries are an economic indicator. Little wonder, then, that central banks have been relying on Google data to feed their macroeconomic models and thus predict consumer behaviour.
The search engine is not only a seismograph that records the twitches and movements of the digital society, but also a tool that generates preferences. And if you change your route based on a Google Maps traffic jam forecast, for example, you change not only your own behaviour, but also that of other road users by changing the parameters of the simulation with your own data.
Using the accelerometers built into smartphones, Google can tell whether someone is cycling, driving or walking. If you click on the algorithmically generated search prediction Google proposes when you type “Merkel”, for instance, the probability increases that the autocomplete mechanism will display the same suggestion to other users. The mathematical models produce a new reality. The behaviour of millions of users is conditioned in a continuous feedback loop. Continuous, and controlled.
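The rich-get-richer dynamic behind such a feedback loop can be sketched in a few lines. The suggestions, starting counts and click probabilities below are invented for illustration; they do not describe Google's actual ranking model, only the general mechanism by which clicks on a top-ranked suggestion entrench its rank.

```python
import random

random.seed(42)

# Hypothetical click counts for two autocomplete suggestions.
# Neither the suggestions nor the numbers come from Google; they
# only illustrate the rich-get-richer feedback loop.
clicks = {"merkel news": 51, "merkel biography": 49}

def top_suggestion(counts):
    """Rank suggestions by accumulated clicks, highest first."""
    return max(counts, key=counts.get)

for _ in range(1000):
    # Most users accept whatever is ranked first; a few pick the other.
    if random.random() < 0.9:
        clicks[top_suggestion(clicks)] += 1
    else:
        other = min(clicks, key=clicks.get)
        clicks[other] += 1

# A tiny initial lead hardens into near-total dominance.
print(clicks)
```

Because each click on the top suggestion makes it more likely to be shown again, an initial two-click lead ends the simulation with roughly nine times as many clicks as its rival.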
The Italian philosopher and media theorist, Matteo Pasquinelli, who teaches at the Karlsruhe University of Arts and Design, has put forward the hypothesis that this explosion of data exploitation makes a new form of control possible: A “metadata society”. Metadata, such as online activity in social media channels or passenger flows in public transport, could be used to establish new forms of biopolitical mass and behavioural control.
“Data,” Pasquinelli writes, “are not numbers, but diagrams of surfaces, new landscapes of knowledge that inaugurated a vertiginous perspective over the world and society as a whole: The eye of the algorithm, or algorithmic vision.”
The accumulation of figures and numbers through the information society has reached a point where they become a space and create a new topology. The metadata society can be understood as an extension of the cybernetic control society, writes Pasquinelli: “Today it is no longer a matter of determining the position of an individual (the data), but of recognising the general trend of the mass (the metadata).”
Deadly deductions
Pasquinelli doesn’t see a problem in the fact that individuals are under tight surveillance (as they were in Germany under the Stasi), but rather in the fact that they are measured and that society as a whole becomes calculable, predictable and controllable. As an example, he cites America’s National Security Agency’s (NSA) mass surveillance program SKYNET, in which terrorists were identified using mobile phone data in the border region between Afghanistan and Pakistan. The program analysed and put together the daily routines of 55 million mobile phone users like pieces of a giant jigsaw puzzle: Who travels with whom? Who shares contacts? Who’s staying over at his friend’s house for the night? A classification algorithm analysed the metadata and calculated a terror score for each user.
“We kill people based on metadata,” former NSA and CIA chief Michael Hayden boasted.
The cold-blooded contempt for humanity expressed in this sentence makes one shiver. The military target is no longer a human person, but only the sum of their metadata. The “algorithmic eye” doesn’t see a terrorist, just a suspicious connection in the haze of data clouds. As a brutal consequence, this means that whoever produces suspicious links or patterns is liquidated.
Thousands of people were killed in drone attacks ordered on the basis of SKYNET’s findings. It is unclear how many innocent civilians were killed in the process. The methodology is controversial because the machine-learning algorithm learnt only from already identified terrorists and blindly reproduced those patterns: Whoever had the same trajectories, that is, the same metadata, as a terrorist was suddenly considered one himself. The question is how sharply the algorithmic vision is focused.
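The flaw described above can be made concrete with a toy sketch. The profiles, the similarity measure and the movement data below are all invented; SKYNET’s real model reportedly used far richer features. The point is only structural: a scorer built solely from already-labelled positives can do nothing but flag whoever resembles them.

```python
# Toy illustration of guilt by metadata similarity. All data invented.

def similarity(a, b):
    """Fraction of daily time slots in which two movement profiles
    visit the same (fictional) cell-tower region."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Movement profiles: one region code per time slot of the day,
# standing in for the trajectories assembled from phone metadata.
known_terrorists = [
    ["A", "A", "B", "C", "C", "A"],
    ["A", "B", "B", "C", "A", "A"],
]

def terror_score(profile):
    """Score = similarity to the closest already-identified profile.
    With no negative examples, nothing in the model can ever say
    'this pattern is also what innocent commuters look like'."""
    return max(similarity(profile, t) for t in known_terrorists)

journalist = ["A", "A", "B", "C", "C", "A"]  # same commute, same towers
farmer     = ["D", "D", "D", "E", "D", "D"]

print(terror_score(journalist))  # identical trajectory -> maximum score
print(terror_score(farmer))
```

A journalist whose daily rounds happen to trace the same towers as a labelled suspect receives the maximum score, exactly the failure mode that made the methodology controversial.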
“What would it lead to if Google Trends’ algorithm were applied to social issues, political rallies, strikes or the turmoil on the periphery of Europe’s big cities?” asks Pasquinelli.
