^thanks for those images
the piece below mentions observations by mark fisher and franco berardi vis-à-vis a digital commons and the algorithmic shaping of an economy. there's a long bibliography at the essay's end which i haven't pasted or begun to investigate. cross-reference to benjamin bratton here: viewtopic.php?f=8&t=37133&p=521005&hilit=benjamin+bratton#p521005
http://www.euronomade.info/?p=1708
Red stack attack! Algorithms, capital and the automation of the common
Posted in Reti
by TIZIANA TERRANOVA
This essay is the outcome of a research process involving a series of Italian institutions of autoformazione of post-autonomist inspiration (‘free’ universities engaged in grassroots organization of public seminars, conferences, workshops etc) and anglophone social networks of scholars and researchers engaging with digital media theory and practice, some officially affiliated with universities, journals and research centres, but also artists, activists, precarious knowledge workers and the like. It refers to a workshop which took place in London in January 2014, hosted by the Digital Culture Unit at the Centre for Cultural Studies (Goldsmiths’ College, University of London). The workshop was the outcome of a process of reflection and organization that started with the Italian free university collective Uninomade 2.0 in early 2013 and continued across mailing lists and websites such as Euronomade, Effimera, Commonware, I quaderni di San Precario and others. More than a traditional essay, then, it aims to be a synthetic but hopefully also inventive document which plunges into a distributed ‘social research network’ articulating a series of problems, theses and concerns at the crossing between political theory and research into science, technology and capitalism.
What is at stake, then, is the relationship between ‘algorithms’ and ‘capital’, that is, the increasing centrality, announced in the document that called for the workshop, of algorithms ‘to organizational practices arising out of the centrality of information and communication technologies stretching all the way from production to circulation, from industrial logistics to financial speculation, from urban planning and design to social communication.’
(http://quaderni.sanprecario.info/2014/0 ... lgorithms/). These apparently esoteric mathematical structures have also become part of the daily life of users of contemporary digital and networked media. Most Internet users daily interface with, or are subjected to, the powers of algorithms such as Google’s PageRank (which sorts the results of our search queries) or Facebook’s EdgeRank (which automatically decides in which order we should get the news on our feed), not to mention the many other less known algorithms (Appinions, Klout, Hummingbird, PKC, Perlin noise, Cinematch, KDP Select and many more) which modulate our relationship with data, digital devices and each other. This widespread presence of algorithms in the daily life of digital culture, however, is only one of the expressions of the pervasiveness of computational techniques as they become increasingly co-extensive with processes of production, consumption and distribution displayed in logistics, finance, architecture, medicine, urban planning, infographics, advertising, dating, gaming, publishing and all kinds of creative expressions (music, graphics, dance etc).
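To make the workings of such ranking algorithms a little less esoteric, here is a minimal sketch of PageRank’s core idea, a power iteration over the ‘random surfer’ model. The three-page link graph, damping factor and iteration count are illustrative assumptions, not Google’s production system:

```python
# Minimal PageRank sketch: rank pages by iterating the random-surfer model.
# The link graph below is a made-up toy example, not real web data.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

def pagerank(links, damping=0.85, iterations=50):
    n = len(links)
    ranks = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        # Each page keeps a baseline (1 - damping) share, plus what it
        # receives from pages linking to it.
        new_ranks = {page: (1 - damping) / n for page in links}
        for page, outlinks in links.items():
            share = ranks[page] / len(outlinks)
            for target in outlinks:
                new_ranks[target] += damping * share
        ranks = new_ranks
    return ranks

ranks = pagerank(links)
```

The point of the sketch is that the ‘sorting’ users experience is nothing but repeated arithmetic over a graph of relations, which is exactly what makes it both automatic and open to redesign.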
The staging of the encounter between ‘algorithms’ and ‘capital’ as a political problem invokes the possibility of breaking with the spell of ‘capitalist realism’, that is the idea that capitalism constitutes the only possible economy while at the same time claiming that new ways of organizing the production and distribution of wealth need to seize on scientific and technological developments (Fisher 2009, Negri 2014b). Going beyond the opposition between state and market, public and private, the concept of the common is used here as a way to instigate the thought and practice of a possible post-capitalist mode of existence for networked digital media.
Algorithms, capital and automation
Looking at algorithms from a perspective oriented toward the constitution of a new political rationality around the concept of the ‘common’ means engaging with the ways in which algorithms are deeply implicated in the changing nature of automation. Automation is described by Marx as a process of absorption into the machine of the ‘general productive forces of the social brain’ such as ‘knowledge and skills’ (Marx 1973: 694), which hence appear as an attribute of capital rather than the product of social labour. Looking at the history of the implication of capital and technology, it is clear how automation has evolved away from the thermo-mechanical model of the early industrial assembly line toward the electro-computational dispersed networks of contemporary capitalism. It is possible hence to read algorithms as part of a genealogical line that, as Marx put it in the ‘Fragment on Machines’, starting with the adoption of technology by capitalism as fixed capital, pushes the former through several metamorphoses ‘whose culmination is the machine, or rather, an automatic system of machinery… set in motion by an automaton, a moving power that moves itself’ (Marx 1973: 692). The industrial automaton was clearly thermodynamic and gave rise to a system ‘consisting of numerous mechanical and intellectual organs so that workers themselves are cast merely as its conscious linkages’ (ibidem). The digital automaton, however, is electro-computational: it puts ‘the soul at work’, involves primarily the nervous system and the brain, and comprises ‘possibilities of virtuality, simulation, abstraction, feedback and autonomous processes’ (Fuller 2008: 4; Berardi). The digital automaton unfolds in networks consisting of electronic and nervous connections so that users themselves are cast as quasi-automatic relays of a ceaseless information flow. It is in this wider assemblage, then, that algorithms need to be located when discussing the new modes of automation.
Quoting a textbook of computer science, Andrew Goffey describes algorithms as ‘the unifying concept for all the activities which computer scientists engage in… and the fundamental entity with which computer scientists operate’ (Goffey 2008: 15). An algorithm can be provisionally defined as the ‘description of the method by which a task is to be accomplished’ by means of sequences of steps or instructions, sets of ordered steps that operate on data and computational structures. As such, an algorithm is an abstraction, ‘having an autonomous existence independent of what computer scientists like to refer to as “implementation details,” that is, its embodiment in a particular programming language for a particular machine architecture’ (Goffey 2008: 15). It can stretch in complexity from the most simple set of rules described in natural language (such as those used to generate coordinated patterns of movement in smart mobs) to the most complex mathematical formulas involving all kinds of variables (as in the famous Monte Carlo algorithm used to solve problems in nuclear physics and later also applied to stock markets and now to study non-linear technological diffusion processes). At the same time, in order to work, algorithms must exist as part of assemblages that include hardware, data, data structures (such as lists, databases, memory, etc.), and bodies’ behaviors and actions. For the algorithm to become social software, in fact, ‘it must gain its power as a social or cultural artifact and process by means of a better and better accommodation to behaviors and bodies which happen on its outside’ (Fuller 2008: 5).
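The Monte Carlo method mentioned above can be illustrated in its simplest textbook form, estimating π by random sampling. The sample size and seed here are arbitrary assumptions chosen for the sketch:

```python
import random

# Monte Carlo sketch: estimate pi by sampling random points in the unit
# square and counting how many fall inside the quarter circle of radius 1.
def estimate_pi(samples=100_000, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The hit ratio approximates the area pi/4 of the quarter circle.
    return 4 * inside / samples

pi_estimate = estimate_pi()
```

The same stochastic logic, substituting prices or diffusion rates for geometric points, is what travels from nuclear physics to stock markets in the essay’s genealogy.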
Furthermore, as contemporary algorithms become increasingly exposed to larger and larger data sets (and in general to a growing entropy in the flow of data, also known as Big Data), they are, according to Luciana Parisi, becoming something more than mere sets of instructions to be performed: ‘infinite amounts of information interfere with and re-program algorithmic procedures… and data produce alien rules’ (Parisi 2013: X). It seems clear from this brief account, then, that algorithms are neither a homogeneous set of techniques nor do they guarantee ‘the infallible execution of automated order and control’ (Parisi 2013: IX).
From the point of view of capitalism, however, algorithms are mainly a form of ‘fixed capital’, that is, they are just means of production. They encode a certain quantity of social knowledge (abstracted from that elaborated by mathematicians and programmers, but also from users’ activities), but they are not valuable per se. In the current economy, they are valuable only in as much as they allow for the conversion of such knowledge into exchange value (monetization) and its (exponentially increasing) accumulation (the titanic quasi-monopolies of the social Internet). In as much as they constitute fixed capital, algorithms such as Google’s PageRank and Facebook’s EdgeRank appear ‘as a presupposition against which the value-creating power of the individual labour capacity is an infinitesimal, vanishing magnitude’ (Marx 1973: 694), and that is why calls for individual remuneration of users for their ‘free labor’ are misplaced. It is clear that for Marx what needs to be compensated is not the individual work of the user, but the much larger powers of social cooperation thus unleashed, and that this compensation implies a profound transformation of the grip that the social relation we call the capitalist economy has on society.
From the point of view of capital, then, algorithms are just fixed capital, that is, means of production aimed at achieving an economic return; but that does not mean, for algorithms any more than for other technologies and techniques, that this is all they are. Marx explicitly states that even as capital appropriates technology as the most effective form of the subsumption of labor, that does not mean that this is all that can be said about it. Its existence as machinery, he insists, is not ‘identical with its existence as capital… and therefore does not follow that subsumption under the social relation of capital is the most appropriate and ultimate social relation of production for the application of machinery’ (Marx 1973: 699-700). It is then essential to remember that the instrumental value that algorithms have for capital does not exhaust the ‘value’ of technology in general and algorithms in particular, that is, their capacity to express not just ‘use value’, as Marx put it, but also aesthetic, existential, social, and ethical values. Wasn’t it this clash between capital’s need to reduce software development to exchange value, thus marginalizing the aesthetic and ethical values of software creation, that pushed Richard Stallman and countless hackers and engineers towards the Free and Open Source Movement? Isn’t the enthusiasm that animates hack-meetings and hacker-spaces fueled by the energy liberated from the constraints of ‘working’ for a company in order to remain faithful to one’s own aesthetics and ethics of coding?
Contrary to some variants of Marxism which tend to identify technology completely with ‘dead labor’, ‘fixed capital’ or ‘instrumental rationality’, and hence with control and capture, it seems important to remember how, for Marx, the evolution of machinery also indexes a level of development of productive powers that are unleashed but never totally contained by the capitalist economy. What interested Marx (and what makes his work still relevant to those who strive for a post-capitalist mode of existence) is his claim that the tendency of capital to invest in technology to automate, and hence reduce its labor costs to a minimum, potentially frees up a ‘surplus’ of time and energy (labor), that is, an excess of the capacity to produce relative to the basic, important and necessary labor of reproduction (a global economy, for example, should first of all produce enough wealth for all members of a planetary population to be adequately fed, clothed, cared for and sheltered). However, what characterizes a capitalist economy is that this surplus of time and energy is not simply released, but must be constantly reabsorbed in the cycle of production of exchange value, leading to increasing accumulation of wealth by the few (the collective capitalist) at the expense of the many (the multitudes).
Automation, then, when seen from the point of view of capital, must always be balanced with new ways to control, that is, to absorb and exhaust, the time and energy thus released. It must produce poverty and stress when there should be wealth and leisure. It must make direct labour the measure of value even when it is apparent that science, technology and social cooperation constitute the source of the wealth produced. It thus inevitably leads to the periodic and widespread destruction of this accumulated wealth, in the form of psychic burnout, environmental catastrophe or the physical destruction of what has been created. It creates hunger where there should be satiety; it puts food banks next to the opulence of the super-rich. That is why the notion of a post-capitalist mode of existence must become believable, that is, it must become what Maurizio Lazzarato described as an enduring autonomous focus of subjectivation. What a post-capitalist commonism can aim for, then, is not only a better distribution of wealth compared to the unsustainable one we have today, but also a reclaiming of ‘disposable time’, that is, time and energy freed from work to be deployed in developing and complicating the very notion of what is ‘necessary’.
The history of capitalism has shown that automation as such has not reduced the quantity and intensity of labor demanded by managers and capitalists. On the contrary, in as much as technology is only a means of production to capital, where capital has been able to deploy other means it has not innovated. For example, industrial technologies of automation in the factory do not appear to have recently experienced significant breakthroughs. Most industrial labor today is still heavily manual, automated only in the sense of being hooked onto the speed of electronic networks of prototyping, marketing and distribution; and it is economically sustainable only by political means, that is, by exploiting geo-political and economic differences (arbitrage) on a global scale and by controlling migration flows through new technologies of the border. The state of things in most industries today is intensified exploitation, which produces an impoverished mode of mass production and consumption that is damaging to the body, subjectivity, social relations and the environment alike. As Marx put it, disposable time released by automation should allow for a change in the very essence of the ‘human’, so that the new subjectivity is allowed to return to the performing of necessary labor in such a way as to redefine what is necessary and what is needed. It is not, then, simply about arguing for a ‘return’ to simpler times, but on the contrary acknowledging that growing food and feeding populations, constructing shelter and adequate housing, learning and researching, caring for children, the sick and the elderly requires the mobilization of social invention and cooperation. The whole process is thus transformed from a process of production by the many for the few, steeped in impoverishment and stress, to one where the many redefine the meaning of what is necessary and valuable, while inventing new ways of achieving it.
This corresponds in a way to the notion of ‘commonfare’ as recently elaborated by Andrea Fumagalli and Carlo Vercellone, implying, in the latter’s words, ‘the socialization of investment and money and the question of the modes of management and organisation which allow for an authentic democratic reappropriation of the institutions of Welfare… and the ecologic re-structuring of our systems of production’ (cf. Vercellone forthcoming; also Fumagalli 2014). We need to ask, then, not only how algorithmic automation works today (mainly in terms of control and monetization, feeding the debt economy) but also what kind of time and energy it subsumes and how it might be made to work once taken up by different social and political assemblages – autonomous ones not subsumed by or subjected to the capitalist drive to accumulation and exploitation.
The red stack: virtual money, social networks, bio-hypermedia
In a recent intervention, digital media and political theorist Benjamin H. Bratton has argued that we are witnessing the emergence of a new nomos of the earth, where older geopolitical divisions linked to territorial sovereign powers are intersecting the new nomos of the Internet and new forms of sovereignty extending into electronic space (Bratton 2012). This new heterogeneous nomos involves the overlapping of national governments (China, United States, European Union, Brazil, Egypt and the like), transnational bodies (the IMF, the WTO, the European banks and NGOs of various types) and corporations such as Google, Facebook, Apple and Amazon, producing differentiated patterns of mutual accommodation marked by moments of conflict. Drawing on the organizational structure of computer networks, or ‘the OSI network model, upon which the TCP/IP stack and the global internet itself is indirectly based’, Bratton has developed the concept and/or prototype of the ‘stack’ ‘to define the feature of a possible new nomos of the earth linking technology, nature and the human’ (Bratton 2012). The stack supports and modulates a kind of ‘social cybernetics’ able to compose ‘both equilibrium and emergence’. As a ‘megastructure’, the stack implies a ‘confluence of interoperable standards-based complex material-information system of systems, organized according to a vertical section, topographic model of layers and protocols… composed equally of social, human and “analog” layers (chthonic energy sources, gestures, affects, user-actants, interfaces, cities and streets, rooms and buildings, organic and inorganic envelopes) and informational, non-human computational and “digital” layers (multiplexed fiber optic cables, datacenters, databases, data standards and protocols, urban-scale networks, embedded systems, universal addressing tables)’ (Bratton 2012).
In this section, drawing on Bratton’s political prototype, I would like to propose the concept of the ‘Red Stack’, that is, a new nomos for the post-capitalist common. Materializing the ‘red stack’ involves engaging with (at least) three levels of socio-technical innovation: virtual money, social networks, and bio-hypermedia. These three levels, although ‘stacked’, that is, layered, are to be understood at the same time as interacting transversally and nonlinearly. They constitute a possible way to think about an infrastructure of autonomization linking together technology and subjectivation.
The contemporary economy, as Christian Marazzi and others have argued, is founded on a form of money which has been turned into a series of signs, with no fixed referent to anchor them (such as gold), explicitly dependent on the computational automation of simulational models, screen media with automated displays of data (indexes, graphics etc) and algo-trading (bot-to-bot transactions) as its emerging mode of automation (Marazzi n.d.). As Toni Negri puts it, ‘money today – as abstract machine – has taken on the peculiar function of supreme measure of the values extracted out of society in the real subsumption of the latter under capital’ (Negri 2014b). Since ownership and control of capital-money (different, as Maurizio Lazzarato reminds us, from wage-money in its capacity to be used not only as a means of exchange, but as a means of investment empowering certain futures over others) is crucial to maintaining populations bonded to the current power relation, how can we turn financial money into the money of the common? An experiment such as Bitcoin demonstrates that in a way ‘the taboo on money has been broken’ (Jaromil 2013) and that, beyond the limits of this experience, forkings are already developing in different directions. What kind of relationship can be established between the algorithms of money-creation and ‘a constituent practice which affirms other criteria for the measurement of wealth, valorizing new and old collective needs outside the logic of finance’ (Lucarelli 2014)? Current attempts to develop new kinds of cryptocurrencies must be judged, valued and rethought on the basis of this simple question as posed by Andrea Fumagalli: is the currency created limited to being a means of exchange, or can it also affect the entire cycle of money creation – from finance to exchange? (Fumagalli 2014).
Does it allow speculation and hoarding, or does it promote investment in post-capitalist projects and facilitate freedom from exploitation, autonomy of organization etc? What is becoming increasingly clear is that algorithms are an essential part of the process of creation of the money of the common, but also that algorithms have politics (what are the gendered politics of individual ‘mining’, for example, and of the complex technical knowledge and machinery implied in mining bitcoins?). Furthermore, the drive to completely automate money production in order to escape the fallacies of subjective factors and social relations might cause such relations to come back in the form of speculative trading. In the same way as financial capital is intrinsically linked to a certain kind of subjectivity (the financial predator narrated by Hollywood cinema), so an autonomous form of money needs to be both jacked into and productive of a new kind of subjectivity: not limited to the hacking milieu as such, but similarly oriented not towards monetization and accumulation, but towards the empowering of social cooperation. Other questions that might involve the design of the money of the common are: is it possible to draw on the current financialization of the Internet by corporations such as Google (with its AdSense/AdWords programme) to subtract money from the circuit of capitalist accumulation and turn it into a money able to finance new forms of commonfare (education, research, health, environment etc)? What are the lessons to be learned from crowdfunding models and their limits in thinking about new forms of financing autonomous projects of social cooperation?
How can we perfect and extend experiments such as those carried out by the Inter-Occupy movement during hurricane Katrina, turning social networks into crowdfunding networks which can then be used as a logistical infrastructure able to move not only information but also physical goods (Common Ground Collective 2012)?
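The algorithmic money-creation at stake in Bitcoin rests on ‘proof-of-work’: miners search for a number (a nonce) whose hash falls below a difficulty target, and finding it is what ‘mints’ new currency. A toy sketch of that mechanism follows; real Bitcoin mining applies double SHA-256 to a binary block header against a vastly harder target, so the block data and difficulty here are assumptions for illustration only:

```python
import hashlib

# Toy proof-of-work sketch: find a nonce so that SHA-256(data + nonce)
# starts with `difficulty` zero hex digits. Real mining is the same search
# at an astronomically harder difficulty, which is where the energy and
# machinery (and their politics) come in.
def mine(block_data: str, difficulty: int = 4):
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("toy-block", difficulty=4)
```

Note that the work is trivial to verify (one hash) but costly to perform (many hashes), which is exactly the asymmetry that substitutes computation for institutional trust.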
Over the past ten years, digital media have undergone a process of becoming social that has introduced a genuine innovation in relation to previous forms of social software (mailing lists, forums, multi-user domains etc). If mailing lists, for example, drew on the communicational language of sending and receiving, social network sites and the diffusion of (proprietary) social plug-ins have turned the social relation itself into the content of new computational procedures. In sending and receiving a message, we can say that algorithms operated outside the social relation as such, in the space of transmission and distribution of messages; social network software, by contrast, places itself straight within it. Indeed, digital technologies and social network sites ‘cut into’ the social relation as such, that is, they turn it into a discrete object and introduce a new supplementary relation (Stiegler 2013). If we understand, with Gabriel Tarde and Michel Foucault, the social relation as an asymmetrical relation involving at least two poles (one active and the other receptive) and characterized by a certain degree of freedom, we can think of actions such as liking and being liked, writing and reading, looking and being looked at, tagging and being tagged, and even buying and selling as the kind of conducts that transindividuate the social (they induce the passage from the pre-individual through the individual to the collective). In social network sites and social plug-ins these actions become discrete technical objects (like buttons, comment boxes, tags etc) which are then linked to underlying data structures (for example the social graph) and subjected to the ranking power of algorithms. This produces the characteristic spatio-temporal modality of digital sociality today: the feed, an algorithmically customized flow of opinions, beliefs, statements and desires expressed in words, images, sounds etc.
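The algorithmic customization of the feed can be sketched as a scoring function over these discrete technical objects. EdgeRank was publicly described as combining affinity between users, the weight of the action type, and time decay; the particular weights, half-life and stories below are invented for illustration and are not Facebook’s actual parameters:

```python
# Toy feed-ranking sketch: score each story by the viewer-author affinity,
# the weight of the action that produced it, and how recently it happened.
ACTION_WEIGHTS = {"comment": 4.0, "like": 1.0, "tag": 2.0}  # assumed values

def score(affinity: float, action: str, age_hours: float, half_life: float = 24.0) -> float:
    decay = 0.5 ** (age_hours / half_life)  # exponential time decay
    return affinity * ACTION_WEIGHTS[action] * decay

# Made-up stories: a fresh like from a close contact, an older comment
# from a weaker tie, and a stale like from a distant one.
stories = [
    {"id": 1, "affinity": 0.9, "action": "like", "age_hours": 1.0},
    {"id": 2, "affinity": 0.5, "action": "comment", "age_hours": 30.0},
    {"id": 3, "affinity": 0.2, "action": "like", "age_hours": 48.0},
]

feed = sorted(
    stories,
    key=lambda s: score(s["affinity"], s["action"], s["age_hours"]),
    reverse=True,
)
```

However crude, the sketch makes the political point tangible: a handful of chosen coefficients decides whose expression circulates, which is why such parameters cannot be left to capitalist monopolies.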
Much reviled in contemporary critical theory for their supposedly homogenizing effect, these new technologies of the social, however, also open the possibility of experimenting with many-to-many interaction and thus with the very processes of individuation. Political experiments (see the various internet-based parties such as the 5 Star Movement, Pirate Party or Partido X) draw on the powers of these new socio-technical structures in order to produce massive processes of participation and deliberation but, as with Bitcoin, they also show the far from resolved processes that link political subjectivation to algorithmic automation. They can function, however, because they draw on widely socialized new knowledges and crafts (how to construct a profile, how to cultivate a public, how to share and comment, how to make and post photos, videos and notes, how to publicize events) and ‘soft skills’ of expression and relation (humour, argumentation, sparring) which are not inherently good or bad, but present a series of affordances or degrees of freedom of expression for political action that cannot be left to capitalist monopolies, but can migrate to new platforms and services. Given that algorithms, as we have said, cannot be unlinked from wider social assemblages, their materialization within the red stack involves the hijacking of social network technologies, the invention of new kinds of plug-ins, the construction of new platforms through a crafty bricolage of existing technologies, and the enactment of new subjectivities through a détournement of widespread social media literacy.
The term bio-hypermedia, coined by Giorgio Griziotti, identifies the ever more intimate relation between bodies and devices which is part of the diffusion of smartphones, tablet computers and ubiquitous computation. As digital networks shift away from the centrality of the desktop or even laptop machine towards smaller, portable devices, a new social and technical landscape emerges around ‘apps’ and ‘clouds’ which directly ‘intervene in how we feel, perceive and understand the world’ (Griziotti 2014; also Portanova 2013). Bratton defines the ‘apps’ for platforms such as Android and Apple as interfaces or membranes linking individual devices to large databases stored in the ‘cloud’ (massive data processing and storage centres owned by large corporations) (Bratton 2013). This topological continuity has allowed for the diffusion of downloadable applications, or apps, which increasingly modulate the relationship of bodies and space. Such technologies not only ‘stick to the skin and respond to the touch’ (as Bruce Sterling once put it), but create new ‘zones’ around bodies which now move through ‘coded spaces’ overlaid with information able to locate other bodies and places within interactive, informational visual maps. New spatial ecosystems emerging at the crossing of the ‘natural’ and the artificial allow for the activation of a process of chaosmotic co-creation of urban life (Iaconesi and Persico n.d.). Here again we can see how, for capital, apps are simply a means to ‘monetize’ and ‘accumulate’ data about the body’s movement while subsuming it ever more tightly in networks of consumption and surveillance. However, this subsumption of the mobile body under capital does not necessarily imply that this is the only possible use of these new technological affordances.
Turning bio-hypermedia into components of the red stack (the mode of reappropriation of fixed capital in the age of the networked social) implies drawing together current experimentation with hardware (Shenzhen phone-hacking technologies, maker movements etc) able to support a new breed of ‘imaginary apps’ (think, for example, of the apps devised by the artist collective Electronic Disturbance Theatre, which allow migrants to bypass border controls, or apps able to track the origin of commodities, their degrees of exploitation etc).