
Every day, Google processes 3.5 billion search queries. Users google everything: Resumes, diseases, sexual preferences, criminal plans. And in doing so, they reveal a lot about themselves, probably more than they would like.

From the aggregated data, conclusions can be drawn in real time about the emotional state of society. What’s the general mood? How willing are people to spend? Which product is in demand in which region at this very second? Where are people looking for credit? Search queries are an economic indicator. Little wonder, then, that central banks have been feeding Google data into their macroeconomic models to predict consumer behaviour.
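
To make the mechanism concrete, here is a minimal sketch of such “nowcasting”: fit a toy model of an economic series on a search-volume index, then read off an estimate for the current period before official statistics arrive. The single-predictor least-squares model, the “car loan” index and all figures are invented for illustration; real central-bank models are far richer.

```python
# A toy "nowcast": estimate an economic series from a search-volume index
# before official statistics arrive. The model (one predictor, ordinary
# least squares) and all numbers are invented for illustration.

def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares for ys = a + b * xs."""
    n = len(xs)
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / sum(
        (x - x_mean) ** 2 for x in xs
    )
    return y_mean - b * x_mean, b

search_index = [40.0, 55.0, 48.0, 62.0, 70.0]    # "car loan" queries, past quarters
car_sales = [210.0, 265.0, 240.0, 300.0, 330.0]  # official sales (thousands)

a, b = fit_line(search_index, car_sales)
print(f"nowcast for search index 75: {a + b * 75.0:.0f} thousand sales")
```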

The search engine is not only a seismograph that records the twitches and movements of the digital society, but also a tool that generates preferences. And if you change your route based on a Google Maps traffic jam forecast, for example, you change not only your own behaviour, but also that of other road users by changing the parameters of the simulation with your own data.

Using the accelerometers built into smartphones, Google can tell whether someone is cycling, driving or walking. And if you click on the algorithmically generated prediction Google proposes when you type “Merkel”, for instance, the probability increases that the autocomplete mechanism will show the same suggestion to other users. The mathematical models produce a new reality. The behaviour of millions of users is conditioned in a continuous feedback loop. Continuous, and controlled.
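
The mechanics of that loop are easy to sketch. The snippet below is a deliberately crude stand-in for an autocomplete ranker, not Google’s actual system: candidate completions are ordered by accumulated clicks, so every click on the top suggestion entrenches it further.

```python
# A toy rich-get-richer autocomplete loop. This is NOT Google's system;
# ranking purely by accumulated clicks is an assumption for illustration.
from collections import defaultdict

click_counts = defaultdict(int)  # completion -> clicks observed so far

def suggest(prefix: str, candidates: list[str], k: int = 3) -> list[str]:
    """Rank candidate completions for a prefix by past clicks."""
    matching = [c for c in candidates if c.startswith(prefix)]
    return sorted(matching, key=lambda c: -click_counts[c])[:k]

def record_click(completion: str) -> None:
    """Each click makes the clicked suggestion more likely next time."""
    click_counts[completion] += 1

candidates = ["merkel news", "merkel resignation", "merkel quotes"]
for _ in range(5):
    top = suggest("merkel", candidates)
    record_click(top[0])              # users tend to click the first suggestion...
print(suggest("merkel", candidates))  # ...which entrenches it at the top
```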

The Italian philosopher and media theorist Matteo Pasquinelli, who teaches at the Karlsruhe University of Arts and Design, has put forward the hypothesis that this explosion of data exploitation makes a new form of control possible: A “metadata society”. Metadata, he argues, enables new forms of biopolitical control of masses and their behaviour, drawing on sources such as online activity in social media channels or passenger flows in public transport.

“Data,” Pasquinelli writes, “are not numbers, but diagrams of surfaces, new landscapes of knowledge that inaugurated a vertiginous perspective over the world and society as a whole: The eye of the algorithm, or algorithmic vision.”

The accumulation of figures and numbers through the information society has reached a point where they become a space and create a new topology. The metadata society can be understood as an extension of the cybernetic control society, writes Pasquinelli: “Today it is no longer a matter of determining the position of an individual (the data), but of recognising the general trend of the mass (the metadata).”

Deadly deductions

For Pasquinelli, the problem is not that individuals are under tight surveillance (as they were in East Germany under the Stasi), but that they are measured, and that society as a whole thereby becomes calculable, predictable and controllable. As an example, he cites the US National Security Agency’s (NSA) mass surveillance program SKYNET, which identified terror suspects using mobile phone data in the border region between Afghanistan and Pakistan. The program analysed the daily routines of 55 million mobile phone users and put them together like pieces of a giant jigsaw puzzle: Who travels with whom? Who shares contacts? Who stays over at a friend’s house for the night? A classification algorithm analysed the metadata and calculated a terror score for each user.

“We kill people based on metadata,” former NSA and CIA chief Michael Hayden boasted.

The cold-blooded contempt for humanity expressed in this sentence makes one shiver. The military target is no longer a human being, but merely the sum of their metadata. The “algorithmic eye” doesn’t see a terrorist, just a suspicious connection in the haze of data clouds. The brutal consequence: Whoever produces suspicious links or patterns is liquidated.

Thousands of people were killed in drone attacks ordered on the basis of SKYNET’s findings. It is unclear how many innocent civilians were killed in the process. The methodology is controversial because the machine-learning algorithm learnt only from already identified terrorists and blindly reproduced those labels. Whoever had the same trajectories, the same metadata, as a terrorist was suddenly considered one himself. The question is how sharply the algorithmic vision is focused.
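
To see why, consider a deliberately simplified scorer, not the NSA’s actual model (reportedly a random-forest classifier), that rates each user by proximity to the nearest already-labelled example. The feature vectors below (trips per week, shared contacts, nights away from home) are invented; the point is only that anyone whose metadata matches a labelled terrorist’s receives the maximum score, whatever their actual occupation.

```python
# Toy illustration of the failure mode: scoring by similarity to
# already-labelled examples flags anyone with the same movement patterns.
# Features and values are invented; this is not the NSA's actual model.
import math

def terror_score(user: list[float], labelled: list[list[float]]) -> float:
    """Score = closeness to the nearest labelled example (1.0 = identical)."""
    nearest = min(math.dist(user, example) for example in labelled)
    return 1.0 / (1.0 + nearest)

# Features: [trips per week, shared contacts, nights away from home]
known_terrorists = [[9.0, 4.0, 3.0], [8.0, 5.0, 2.0]]

journalist = [9.0, 4.0, 3.0]  # same travel pattern, different occupation
homebody = [1.0, 1.0, 0.0]

print(terror_score(journalist, known_terrorists))  # 1.0: flagged
print(terror_score(homebody, known_terrorists))    # low: ignored
```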

“What would it lead to if Google Trends’ algorithm were applied to social issues, political rallies, strikes or the turmoil on the periphery of Europe’s big cities?” asks Pasquinelli.

The data gurus have an obsession with predicting human interactions the way meteorologists predict the weather. Adepts of the “Social Physics” school of thought, founded by data scientist Alex Pentland, look at the world as if through a high-performance microscope: Society consists of atoms whose nuclei are orbited by individuals, like electrons in fixed orbits. Facebook founder Mark Zuckerberg, for his part, once said he believed there was “a fundamental mathematical law underlying human social relationships”. Love? Job? Crime? Everything is determined, everything is predictable! As if society were a linear system of equations in which the variables can simply be eliminated.

Control and predictability

In Isaac Asimov’s science fiction series Foundation, the mathematician Hari Seldon develops the fictitious science of Psychohistory, a grand theory that combines elements of psychology, mathematics and statistics. Psychohistory models society on the template of physical chemistry. It assumes that the individual behaves like a gas molecule: The sometimes chaotic movements of any one individual cannot be calculated, but the general course and “state of aggregation” of society can be computed with the help of statistical laws.
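
The statistical point survives outside fiction and takes only a few lines to reproduce. Under the assumption that each “molecule” takes independent random steps with a small common drift (the 0.1 drift, step noise and population size below are arbitrary), one trajectory is anyone’s guess, while the population average is computable in advance.

```python
# One "gas molecule" is unpredictable; the average over many is not.
# Drift, noise and population size are arbitrary illustrative choices.
import random

random.seed(0)
DRIFT, STEPS, PEOPLE = 0.1, 50, 10_000

def trajectory() -> float:
    """Final position after STEPS noisy steps sharing a small drift."""
    return sum(random.gauss(DRIFT, 1.0) for _ in range(STEPS))

individual = trajectory()
population_avg = sum(trajectory() for _ in range(PEOPLE)) / PEOPLE

print(f"one individual:     {individual:+.2f}  (anyone's guess)")
print(f"population average: {population_avg:+.3f}  (close to {DRIFT * STEPS})")
```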

In one of the novels, Emperor Cleon I tells his mathematician: “You don’t need to predict the future. Just choose a future — a good future, a useful future — and make the kind of prediction that will alter human emotions and reactions in such a way that the future you predicted will come to fruition.” Even if Seldon rejects this plan as “impossible” and “impractical”, it is an apt description of the technique of social engineering, in which reality (and sociality) are constructed and individuals are reduced to their physical characteristics.

This manifests a new technique of power: The crowd is no longer controlled, but predicted. And that is the dialectical point: What can be predicted can also be controlled. If you know where society is heading, groups can be steered in the desired direction through manipulation techniques such as nudging, which exploit their psychological weaknesses.

Recently, an internal Google video was leaked in which the behavioural concept of a “Selfish Ledger” was presented, a kind of central register on which all user data is stored: Surfing behaviour, weight, health condition. Based on this data, Google could suggest individualised options for action: Eat healthier, protect the environment, or support local businesses. Analogous to DNA sequencing, the system could carry out a “behavioural sequencing” and identify behaviour patterns. And just as DNA can be modified, behaviour can be modified too. The end result of this evolution would be a perfectly programmed human being, controlled by artificial intelligence systems.

What is threatening about this algorithmic regulation is not only the subtlety of a control that takes place somewhere in the opaque machine rooms of private corporations, but the prospect that a techno-authoritarian political mode could be installed, in which the masses figure as a politico-physical quantity. Only what has a mass of data carries weight in political discourse.

The visionaries of technology think about politics in cybernetic terms: The aim is to avoid “disturbances” and keep the system in balance. The Chinese search engine giant Baidu has developed an algorithm that uses search inputs to predict, up to three hours in advance, where a crowd of people (a “critical mass”) will form.
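
A toy version of the underlying idea, with invented numbers and the crude assumption that the recent hourly growth in location-related searches simply persists, might look like the sketch below; Baidu’s published approach is far more elaborate.

```python
# Toy crowd forecast: treat searches mentioning a location as a leading
# indicator and project when a "critical mass" threshold is crossed.
# Counts and threshold are invented; growth is assumed to persist.

def hours_until(counts: list[float], threshold: float) -> int:
    """Project forward assuming the average hourly growth rate holds."""
    growth = (counts[-1] / counts[0]) ** (1 / (len(counts) - 1))
    level, hours = counts[-1], 0
    while level < threshold:  # assumes counts are rising (growth > 1)
        level *= growth
        hours += 1
    return hours

searches_near_square = [120.0, 170.0, 240.0, 335.0]  # per hour, accelerating
print(hours_until(searches_near_square, threshold=900.0))  # -> 3 (hours)
```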

Here, program code becomes a policy of pre-emptive prevention. The promise of politics is that it is open to the future and flexible. But when the behaviour of individuals, groups and society becomes predictable, political decision-making becomes superfluous. Where everything is determined, nothing can be changed anymore.

— Worldcrunch, 2018, in partnership with Süddeutsche Zeitung/New York Times News Service

Adrian Lobe is a freelance journalist based in Germany.