We are living through a period, arising roughly in the nineteen-forties with the introduction of the digital computer, that fundamentally changes the nature of knowledge and of power. In particular, it mutates the nature of the human species’ means of self-reflection. If, historically, text and image were the crucial means for reflexivity and self-knowledge in human cultures, the development of the computational text and the digital image means that such things are conjoined and overlaid by another layer. Now we do not simply operate via meaning, representation, description, inference, metaphor, allusion, statement, order and reproduction, but supplement and modify these terms with logic, calculation and new forms of network and massification. These underlying codes are the computational forms – algorithms, data structures, encoding systems, databases, searching, indexing and prediction systems and so on – that constitute computational culture.
The shift to computation changes forms of knowledge as fundamentally as the move from oral cultures to the written word, the introduction of monasteries as giant textual, but non-sexual, reproduction machines in the European middle ages, and the invention of print in the early modern period. Underlying texts and images – and one can therefore also say film, music and other cultural forms, but also scientific forms of knowledge, such as the abstracted modes of databased life in genomics and proteomics, and forms of governance (as are being rolled out at citywide scale in Rio, for the management of infrastructure, or in London, for the control of the Olympics) and economics – are, in part, new techniques based on big data, statistical analysis and machine learning. But we can also see them articulated in more humble forms, such as the word processor, the musical sequencer and the image editor, not to mention the portable telephones without which contemporary life would be rather unhinged.
In some ways, this is a recursion of the profound split between the logical layer, of computation, and the semantic layer, of meaning – something explicit in the design of mark-up languages for the web, for instance – that gives the appearance of all being the same. Were one to have the taste to read a text on a Kindle, for instance, the process appears roughly the same as reading a print publication. At the same time, however, the object works to another agenda, recording and reporting on the pages looked at, the places marked, the links followed. The data goes back to Amazon. Perhaps it makes possible the generation of new books designed to please an audience with a statistically likely aggregate of tastes and interests. Perhaps it is a means of entraining the market in authors, or of discovering previously indiscernible genres. Perhaps no one quite knows what to do with this data.
As it is made of parts, text has always been digital, and the image has always involved forms of sorting, hierarchy, the ordering and arrangement of vision, just as text involves clauses, grammars, the procedures for linearity and the networking of meaning, by referral, citation, indexing and so on. But these functions are enhanced, engrossed, mutated and refined in the new conditions of computational humanity. Equally, they change what we understand by the nature of the copy. Some episodes in this change seem worth recounting.
Andrew Norman Wilson’s short film Workers Leaving the Googleplex shows some aspects of the labour of adding texts as raw food for the largest artificial intelligence experiment yet conducted. As a project with an apparently high degree of secrecy and, unusually for Google, a high proportion of non-white staff, Google Books is hived off from the rest of the workforce gathered in the happy-camp atmosphere of the corporate campus. The rank of the book scanners, shown by a yellow-coloured badge, does not entitle them to any of the perks that Google is so lauded for equitably lavishing on its workers. Filming them loses Wilson his temp job, subcontracted via another company. Like Christopher Wilcha’s The Target Shoots First, a more extensive film on mail-order music, this is a film by a worker whose curiosity is surplus to the requirements of the assembling of knowledge.
But what is worked on in this hidden arm of the Googleplex is not simply texts – what one might call, following Giorgio Agamben, ‘bare data’. Via deals brokered with numerous libraries, those legally entitled to hold the part of the textual residue of their various national territories considered “published”, these collectively amount to something approaching a summa of human culture as manifest in printed language. That is to say, a great deal of what is written is not here, but what is here is explicitly addressed to the notion, in a broad sense, of a public. The texts are scanned, transformed into various sorts of files, and ingested into Google’s systems. As scanned PDFs they are made available in little bits and pieces – sections, chapters – or, given the right twist of regulation, such as the end of a term of copyright, in whole, to users of the Google search engine. The scans, if one examines them, bear in a corner a minuscule statement of Google’s copyright, vested in them since the hidden labour Google paid for placed them on the scanning device.
But these texts do not stay solely within the category of the public. The texts of Wittgenstein, the Bhagavad Gita and Kobo Abe are also combined as analytical fodder with all the spreadsheets, meeting minutes, emails and effluvia of Google Docs held in the cloud, a quasi-etherealisation of thousands of banks of computers chained together and chewing through the earth’s last black seams of energy. When there are riots over the burning of the Qur’an in a NATO incinerator, what is the righteous response to the integration of such texts into a database, and to the concomitant disentangling of their words into data-atoms, or the articulation of their sentences as conditions of probabilistic co-occurrence knitted together by Markov chains? What are their chewings feeding? An artificial intelligence, if one can still call it by such a name, that learns – if by this term can be meant aggregation, comparison and pattern matching – and does so voraciously.
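The mechanism invoked here – sentences rendered as conditions of probabilistic co-occurrence, knitted together by Markov chains – can be sketched minimally. In a first-order chain, each word is conditioned only on its predecessor; the toy corpus and function names below are invented for illustration, not a description of Google’s systems:

```python
import random
from collections import defaultdict

def build_model(text):
    """Map each word to the words observed to follow it;
    repeated entries serve as unnormalised probabilities."""
    model = defaultdict(list)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        model[current].append(nxt)
    return model

def generate(model, start, length=8, seed=0):
    """Walk the chain: each next word is drawn from the
    co-occurrence distribution of the current one."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the limits of my language mean the limits of my world"
model = build_model(corpus)
print(generate(model, "the"))
```

Scaled from one sentence to a summa of printed language, the same aggregation of adjacencies is what allows statements to be weighed against the background of all other statements.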
Google is most often criticized either for control-freakery or for being a mere marketing machine, honing humanity down into finer and finer grained demographics in order to snort it up as dust. Perhaps the crude mechanics of advertising that it in turn drives, via various oligopolistic arms, are merely an excuse, a fig-leaf, to make it seem just like everything else. Perhaps too they are merely a necessary measure for the generation of a monstrous birth for which its users’ habits, thoughts, connections, readings and repetitions provide the most succulent of placentas.
IBM’s Magnificent Watson
If only both unhappiness and love can render in us the sufficient conditions to understand many texts – for instance those now rather unfashionable ones of André Breton – what are we to anticipate from this monster of the probabilistic enlightenment? In the copying of all possible statements, the sifting and evaluation of their likelihood of meaning based upon their formation against a background of all other statements, perhaps we will witness an incipient irony in all communication: another layer of irony, composed by statistical equivalence and distinction, amongst those already composed by power, emotion and the yearnings yielded by desire.
Given the flexible generality of the computer as a machine, such irony may take many forms, one of which is already manifest in words in their relation to other words. And here is a chance for some efficiency: we may speak by describing ratios of likelihood of the sentences we wish to utter that would be indexical to certain kinds of statement. Within each class of likelihood, specific phrases might be identifiable by an alphanumeric key. This would, for instance, make the criticism of Twitter (as being, solely by virtue of its fixed message-length, an engine for triviality) rather negligible. How much information could be packed into 140 characters if we were to speak in terms of simple conjugations of ratios or, better, by unique identifying codes? We could communicate via irrefutable formulae, such as those imagined by Leibniz as a young diplomat dreaming of balancing out the competing interests of the Thirty Years War.
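The scheme gestured at here – stock phrases indexed by compact alphanumeric keys, so that an utterance becomes a citation rather than a composition – can be sketched as a toy codebook. The phrases, keys and function names below are invented for illustration:

```python
# A hypothetical codebook: each stock phrase gets a short
# base-36 key, so a message can be sent as keys, not sentences.
PHRASES = [
    "I love you",
    "the meeting is postponed",
    "mistakes were made",
    "the limits of my language mean the limits of my world",
]

def to_key(index):
    """Encode a phrase index as a compact base-36 string."""
    digits = "0123456789abcdefghijklmnopqrstuvwxyz"
    if index == 0:
        return "0"
    out = ""
    while index:
        index, r = divmod(index, 36)
        out = digits[r] + out
    return out

CODEBOOK = {to_key(i): phrase for i, phrase in enumerate(PHRASES)}

def utter(*keys):
    """Expand a sequence of keys back into full sentences."""
    return ". ".join(CODEBOOK[k] for k in keys)

print(utter("0", "2"))  # prints "I love you. mistakes were made"
```

With base-36 keys, 140 characters suffice to cite an enormous stockpile: even a modest five-character key addresses over sixty million distinct phrases.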
The most unlikely seeming statements, such as those of love – why is it, for instance, that one is so wretchedly drawn to such an impossible person, one who perpetually responds by partial rejections and repositionings as much as by plunges into abysses of voluptuation – or the slightly less fraught but nonetheless perplexed language of particle physics can, by these means, be improved. They may retain their traditional relation to the infinite as a trope of the inexpressible, but be expressed efficiently by numerical means. In the future, as they say, we will be able to cite subsets of the grand linguistic stockpile rather than go through the time-consuming convolutions of actual speech. Class libraries of sentiment, instruction, allusion and function need only be invoked to be communicated. Art will therefore be in making codes that are more complex and beautiful than the cultural assets they index.
Archive of desires
What is not archived, copied, preserved in the present? Surely there are things that people now save, diagram and record that were not previously associated with the copy or publication. Having become luckily disarticulated from reproduction, the sexual act, even if only a tilt of the hips in front of a mirror, is now seemingly also a means to make copies, to be archived, dumped online and aggregated, sent from phone to phone. Are novel archives of desire being established in the way that only the rarest of entities were once recorded or duplicated? Desire for the act, for the state of desire, for the sense of being in desire, is archived in countless sites. Needless to say, confusion arises between the real and the copy. Sometimes one passes from one state to the other. One is photographed and exhibited in such a state because it is difficult to represent a state of lust, of love, desire, hunger for the possibility of such, memory or surprise at the intensity of such, as a mode of thought, or in text, without appearing cumbersome to oneself. To be photographed, expectantly, by oneself, expecting another, or that state another might induce, of consummation in a state of inexplicable hunger, in a mode that might excite such desire, in postures worked in concordance with thousands of others. That such images are further entangled and entrained by copies of other kinds, the full catalogue of patriarchy amongst others, places the performance of the copy, as it spreads across networks, rather more on edge.
Who has not, when walking down a busy street, found themselves to be absolutely at one with every aspect of the crowd, the scene, and who has not, at times, felt utterly wretched, or abundant in dislike for others on an equally multitudinous street? The condition of such images is equally ambivalent. They are an archive of anticipated intensities, needing an enormity of interpretative buttressing to maintain their status as records. Mise en abyme – the repetition of an image within an image, or a story within a story, an infinite regress that starts to devour the future – is an old motif in art theory, and one of visual culture’s articulations of the same phylum as the computational logic of the loop, or of recursion, the launching of a process by that process. But perhaps too, since each image refers to a sense of desire in relation to desire, it becomes, in the avalanche of sexual images available on the internet, a more general mode of polymorphous figuration? What spirals and skeins and blocs of images can we now elucidate, now that they are there in millions of copies with all their fields of variation and excruciating and fantastical specificity of meaning? As intensity is archived, loses itself, triggers something elsewhere, is this dissipation what is desired?
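The kinship claimed here between mise en abyme and the computational figure of recursion – a process launched by that process – can be made concrete in a few lines. The function below is a hypothetical illustration, nesting an image inside copies of its own frame until the regress is stopped:

```python
def mise_en_abyme(image, depth):
    """A picture containing a smaller copy of itself:
    each level of the recursion nests the previous one,
    stopping when depth runs out."""
    if depth == 0:
        return image
    return {"frame": mise_en_abyme(image, depth - 1)}

print(mise_en_abyme("portrait", 3))
# prints {'frame': {'frame': {'frame': 'portrait'}}}
```

The base case is what distinguishes the artful regress from the devouring one: remove it, and the process that launches itself never returns.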
The great systems of general equivalence – those that work at layers of greater or lesser abstraction, rendering the things to which they refer into a state of relative or absolute replaceability – include money, chronometry, musical notation, text, mathematics. Each of them has its own characteristics, qualities, affinities and traits, which are also worked over time, and which change in combination with forces and capacities unleashed by each other. Computation now works in and amongst all of these to provide means for their integration according to multiple further forms of ordering. In being linked together, they become open to new kinds of conjugation and accident.
In such conditions it becomes useful to recognize the multifarious conditionality of all language, of all images, something mapped in certain ways by Gerhard Richter’s Atlas, a visual directory of genres of images, or traced in Susan Hiller’s tables of postcards and their messages. These works gesture to an underlying working, that audiences also fabulate in their viewing. Under the condition of computation, systems are running that attempt to articulate such fabulations by the precise means of effective procedures.
A precursor can be found in the loom, which, in its versioning by Jacquard, provided, via Ada Lovelace, one of the keys to the invention of computation. The programmable machine renders pieces of cloth equivalent, not simply by means of their fabric or pattern, but by means of the procedural logic by which they are yielded. Both sides of a piece of fabric refer back to the algorithms that ordered it. The relation between the front and back of a textile – in some the rear fibres are tightly incorporated, in others they are as loose as the wiring in a hand-made analogue computer – provides an interesting point of differentiation from its partial equivalent, the reverse side of a digital image. In his recent book Uncreative Writing, the poet Kenneth Goldsmith notes the tangles of interpretation implied in the conversion of an image file into a text (recipe: switch the file suffix of an image from .jpg to .txt and open the file in a text editor or word processor).
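Goldsmith’s recipe can also be performed programmatically rather than by renaming. A minimal sketch, in which a few hand-written JPEG header bytes stand in for an actual image file:

```python
def bytes_as_text(raw, limit=200):
    """Decode raw image bytes as text. Latin-1 maps every
    possible byte to a character, so nothing is dropped –
    the file is not altered, only re-read."""
    return raw[:limit].decode("latin-1")

# A JPEG begins with the marker bytes FF D8 FF; read as text,
# they become the opening characters of Goldsmith's found poem.
jpeg_header = bytes([0xFF, 0xD8, 0xFF, 0xE0, 0x00, 0x10]) + b"JFIF"
print(repr(bytes_as_text(jpeg_header)))
```

The choice of Latin-1 here mirrors what a forgiving text editor does when handed arbitrary bytes; a stricter encoding such as UTF-8 would refuse much of the file, which is itself part of the tangle of interpretation Goldsmith points to.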
The reverse side of a digital image is alphanumeric code, a code that in turn has its own reverse, as a mise en abyme running through references to binary code, thence to its anticipation in logic gates, manifest in circuit boards, and rephrased as electrical charges. Unpicking the layers of a fabric, one may be able to deduce the patternings in its ordering of fibres sufficiently for them to be recapitulated and reset in another machine, and certain more studious producers of pirated goods have certainly done so. Something similar is done for images each time they are passed from one computer to another, reinterpreted from the reverse up. Here, though, the choice of precisely which of the reverses is passed on and reinstantiated is crucial. The move from data in alphanumeric form to the position and colour values of a pixel involves the interpretative action of intermediaries known as codecs, the encoding and decoding standards that provide the mechanism of transition.
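What a codec does at its most minimal – turning the alphanumeric reverse into positioned colour values – can be sketched as follows. The function is a deliberately naive stand-in for a real decoding standard, assuming uncompressed RGB triples rather than any actual format:

```python
def decode_rgb(raw, width):
    """Interpret a flat byte sequence as rows of (R, G, B)
    pixels: the minimal gesture of a codec, moving from the
    image's alphanumeric reverse to its visible face."""
    # Group bytes in threes: one colour triple per pixel.
    pixels = [tuple(raw[i:i + 3]) for i in range(0, len(raw), 3)]
    # Cut the pixel stream into rows of the given width.
    return [pixels[r:r + width] for r in range(0, len(pixels), width)]

# Six bytes become a one-row, two-pixel image: red, then green.
raw = bytes([255, 0, 0, 0, 255, 0])
print(decode_rgb(raw, 2))  # prints [[(255, 0, 0), (0, 255, 0)]]
```

Everything a real codec adds to this – compression, colour spaces, block transforms – is further interpretation, and each such choice is a point where the texture of transmission can be worked.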
Each codec has its particularities, exploited, for instance, by artists working with noise and glitchiness to draw out the texture of transmission from one image to another, or from one state to another within an image. Each system of general equivalence has its own traits and tendencies of material and behavioural kinds. Working their inter-relations operates in and induces this texture, producing the domain of operations within their various kinds. The digital copy, however, changes the nature of all other equivalences, in its position as a general intermediary and operator, via its capacity for precision.
The difficulty of machine poetry, and the clunkiness of most computer art, especially when it is gormlessly heralded as being ‘creative’, is this precise machining of each signifier’s relation to another signifier, in turn leading, somewhere up the lines, to an instruction. If we understand noise and glitch not merely as something like a transcription error or a mistake (where it is often read as something warm and human, the machine baring its inner self at the point where it ceases to function predictably) but as part of a wider range of ways in which systems yield behaviours and certain kinds of performance – what is perceived as glitch being an expression of these – then their interoperation will have its own aesthetics. These interoperations may be, from different perspectives and at different times, symphonic and all-encompassing, fragmentary and grating, strangely isomorphic across instantiations, or as scintillating as a finely constructed clockwork mechanism with gearing made of air, brass and cheese. And equally, human interpretations of such, for those deemed worthy of entry into such a category, will be simply another artefact of reception amongst many. A particularly intriguing one, maybe, but one with its own specificities and capacities, in turn complexified by histories and becomings.
The question of reflexivity concerns the particular qualities through which different media – text, image, computation and their various kinds – generate potential capacities for self-knowing and invention. Reflexivity implies some kind of convolution in the arrangement of knowing and doing that also generates fissiparation and change over time, recognizing its unavoidability. What is significant about these media and their characteristics in relation to the figure of the human is that they are all non-totalisable multiplicities: no system of general equivalence, with all their partialities, has yet succeeded in quite arranging the delivery of love, joy or sorrow (the usual crass exemplars that are wheeled out as being unprovidable by, for instance, money, AI, or musical composition), though they may contribute to closing down the variables in more or less interesting or perilous ways. What becomes more intriguing than the question of the simple copy, therefore, is that of feedback, in the many ramified kinds in which it exists.
The copy indeed, as repetition, mimesis, forgery, plagiarism, memory, reproduction, version, re-enactment, edition, fake, tribute, piracy, hoax, replica, clone, generic, duplicate, rewind, proliferates as a manifold field, not simply of the same, but of the negotiation, storage, drifting and development of the capacities of variation and invention. Equally, in such a context, as the image and text are supplemented by computational forms such as code, algorithm, database, data structure and others as fundamental cultural elements, the question of the copy in relation to an original becomes of less significance than the kinds of context, the media ecology, at multiple scales and kinds, that text, image and computational elements operate in and as part of and how they in turn change, shape, agglomerate with and repeat other repetitions. Whilst we marvel at such a multiplicity and plunge into its workings in the age of digital humanity, systems of general equivalence also mesh and integrate to entrain and pattern behaviours and possibilities. The aesthetic capacities of such a condition have yet to be fully explored.