Safety in Numbers?
An article published in Frieze, March 2014.
Algorithms, Big Data and surveillance: what’s the response, and responsibility, of art? Jörg Heiser asked seven artists, writers and academics to reflect.
In the current moment, we are experiencing a sense of being tracked and measured by a cabal of machines whose genius is to distil the particulars of our lives into a substance called ‘data’. The machines (and by extension their handlers) then use this data to make inferences about our behaviour, our associations and our beliefs – information that we haven’t intentionally revealed or which we perhaps don’t even have access to ourselves.
Spooky, right? And seemingly antipodal to the kind of insight that art is supposed to provide: mechanical where art is human, repetitive where art is inventive. The machines that watch us can seem like H.G. Wells’s Martians: ‘minds that are to our minds as ours are to those of the beasts that perish, intellects vast and cool and unsympathetic’ which peer down at the aggregate trail we leave in the informational substrate, and thus at us, ‘as a man with a microscope might scrutinize the transient creatures that swarm and multiply in a drop of water’.
But what machines do with data is not so foreign. It appears foreign, because when we talk about data we do so in the language of mathematics: loss functions and kernels, logistic regression and Greek letters. The language presents the same kind of difficulty for outsiders as the international art-speak found on museum wall texts.
Quantitative surveillance has two main goals: to classify and, having classified, to predict. And prediction comes down to this: people are likely to do things in the future that people like them did in the past. This principle – that we have tendencies, which are not inescapable but which take some work or some luck to escape – is not the property of mathematicians. How would novels function without it?
And the project of classification – which is to say all the work that’s hidden in the word ‘like’ or the phrase ‘people like them’ – is nothing more than the project of analogy, which asks us to set aside the boring observation that no two human beings (and, likewise, no two moments in time, no two societies etc.) are identical to each other, and replace it with a suite of more interesting questions, such as: in the space of human beings, which people are near each other? Or, when are two things alike, in ways beyond the obvious ones? That, of course, is a traditional artistic project too.
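The 'people near each other' idea has a direct mechanical counterpart in nearest-neighbour classification: a new person is predicted to behave like the people closest to them in some feature space. A minimal sketch, using invented features and data purely for illustration (none of this comes from the article):

```python
import math

# Hypothetical people, each described by two invented features
# (hours online per day, books read per year), with a known behaviour label.
people = [
    ((1.0, 30.0), "yes"),
    ((2.0, 25.0), "yes"),
    ((8.0, 2.0), "no"),
    ((9.0, 1.0), "no"),
]

def predict(new_point, k=3):
    """Predict by majority vote among the k people nearest to new_point."""
    by_distance = sorted(people, key=lambda p: math.dist(p[0], new_point))
    votes = [label for _, label in by_distance[:k]]
    return max(set(votes), key=votes.count)

print(predict((1.5, 28.0)))  # the nearest known people mostly said "yes"
```

All the interpretive work hides in the choice of features and distance: deciding which measurements make two people 'near' is exactly the work of analogy the paragraph above describes.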
Big Data, automated behaviour prediction and classification relate to traditional art forms as photography does to drawing and painting. Photography isn’t there to replace artistic representation; in some of its manifestations it’s a new form of artistic representation, and in all its forms it’s something art can talk about, without acquiring expertise in photoreactive chemistry or digital compression algorithms. It will be the same story here.
And if you regard surveillance as a thing to be resisted, take some comfort from the fact that Wells’s Martians were eventually felled by terrestrial microorganisms. They were different from us on the surface. But on the inside, where they were vulnerable, they were built much as we are.
Jordan Ellenberg is Professor of Mathematics at the University of Wisconsin, USA. He is a regular columnist for Slate and his book How Not to Be Wrong (Penguin, 2014) is forthcoming.