Mutual Coordination Through Software Agents

From P2P Foundation


Discussion

Nick Dyer-Witheford, in Red Plenty Platforms:

"Perhaps the idea of everyone watching mobile screens lest they miss, not a Facebook poke, but voting the seventh iteration of the participatory plan, duplicates unattractive features of everyday life in high-tech capitalism. So we might speculate further, and suggest that what decentralized collective planning really needs is not just council media but communist agents: communist software agents. Software agents are complex programmed entities capable of acting ‘with a certain degree of autonomy... on behalf of a user (or another program)’ (Wikipedia, 2013b: np). Such agents manifest ‘goal- direction, selection, prioritization and initiation of tasks’; they can activate themselves, assess and react to context, exhibit aspects of artificial intelligence, such as learning, and can communicate and cooperate with other agents (Wikipedia, 2013b: np).

[...]

the arena in which such agents truly excel is in the financial sector, where high frequency trading is entirely dependent on software ‘bots’ capable of responding to arbitrage possibilities in milliseconds.

One can’t help but ask, however, what if software agents could manifest a different politics? Noting that Multi-Agent System models can be thought of as a means to answer problems of resource allocation, Don Greenwood (2007: 8) has suggested they could be geared toward solving the ‘socialist calculation problem’. As planning tools, Multi-Agent Systems, he notes, have the advantage over real markets that ‘the goals and constraints faced by agents can be pre-specified by the designer of the model’ (Greenwood, 2007: 9). It is possible to design agents with macro-level objectives that involve more than just the maximization of individual self-interest; two ‘welfare’ principles that economists have experimented with incorporating are equality and environmental protection.
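[Added note: a minimal, invented sketch of Greenwood's point that a multi-agent model lets the designer pre-specify goals and constraints. Candidate allocations of a fixed resource are scored by a welfare function that rewards output, penalises inequality, and penalises emissions; all names, numbers, and weights below are placeholders, not anything from Greenwood.]

```python
# Sketch of a designer-specified welfare objective for resource allocation:
# allocations are scored not by individual profit but by a function that rewards
# output, penalises inequality, and penalises environmental cost.
from itertools import product

AGENT_EMISSIONS = {"farm": 0.2, "factory": 0.9, "school": 0.1}   # emissions per unit used
TOTAL_RESOURCE = 6                                               # units available


def welfare(allocation: dict[str, int]) -> float:
    """Designer-specified objective: output minus inequality and emissions penalties."""
    amounts = list(allocation.values())
    output = sum(amounts)                               # crude proxy for use value
    inequality = max(amounts) - min(amounts)            # equality principle
    emissions = sum(AGENT_EMISSIONS[a] * x for a, x in allocation.items())
    return output - 1.0 * inequality - 2.0 * emissions  # sustainability principle


def best_allocation() -> dict[str, int]:
    """Enumerate feasible allocations and pick the one with the highest welfare score."""
    agents = list(AGENT_EMISSIONS)
    candidates = (
        dict(zip(agents, combo))
        for combo in product(range(TOTAL_RESOURCE + 1), repeat=len(agents))
        if sum(combo) <= TOTAL_RESOURCE
    )
    return max(candidates, key=welfare)


print(best_allocation())
```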

[...]

Within capital, automation threatens workers with unemployment or production speed-up. If, however, there were no dominant structural tendency for increases in productivity to lead to unemployment or greater output without reduction in labour time, automation could systematically yield less time spent in formal workplaces. In a communist framework that protected access to the use value of goods and services, robotization creates the prospect of a passage from the realm of necessity to freedom. It reintroduces the goal – closed down both within the Stakhanovite Soviet experiment and in the wage-raising trades unionism of the West – of liberating time from work, with all this allows both in terms of human self-development and communal engagement.

Juliet Schor’s (1991) estimate, that if American workers had taken the gains won from productivity increases since the 1950s not in wages but in time off, they would by 2000 have been working a twenty-hour week, indicates the scale of possible change. [Added note: that seems like too many hours; Paul Goodman in 1950 estimated one day a week.]
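[Added note: a back-of-envelope check of the arithmetic behind such estimates, not Schor's own calculation: if hourly productivity rises by a factor g while a week's output is held at its baseline level, required hours fall from H to H/g. The baseline week and the productivity multiples below are assumptions.]

```python
# Back-of-envelope version of the productivity-to-hours arithmetic (illustrative,
# not Schor's own figures): holding a week's output at its baseline level while
# hourly productivity multiplies by g lets weekly hours fall from H to H / g.
def hours_needed(baseline_hours: float, productivity_multiple: float) -> float:
    """Weekly hours required to produce the baseline week's output."""
    return baseline_hours / productivity_multiple


# Assumed values: a 40-hour baseline week; a roughly 2x productivity multiple gives
# about 20 hours, while something near 5x would approach Goodman's one-day week.
for multiple in (2.0, 3.0, 5.0):
    print(f"{multiple:.0f}x productivity -> {hours_needed(40.0, multiple):.1f} hours/week")
```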

Proposals for a ‘basic income’ have recently figured in left politics. There are certainly criticisms to be made of these insofar as they are advanced as a reformist strategy, with the risk of becoming merely a rationalized welfare provision supporting neoliberal precarity. But it would be hard to envision a meaningful communist future that did not institute such measures to acknowledge the reductions in socially necessary labour time made possible by advances in science and technology, destroying Hayek’s calculation problem by progressively subtracting from it the capitalist ur-commodity, labour power.

[...]

An abundant communist society of high automation, free software, and in-home replicators might, however, as Fraise (2011) suggests, need planning more than ever – not to overcome scarcity but to address the problems of plenty, which perversely today threaten shortages of the very conditions for life itself. Global climate change and a host of interlinked ecological problems challenge all the positions we have discussed to this point. Bio-crisis brings planning back on stage, or indeed calculation – but calculation according to metrics measuring limits, thresholds and gradients of the survival of species, human and otherwise. Discussing the imperatives for such ecosocialist planning, Michael Lowy (2009) points out how this would require a far more comprehensive social steering than mere ‘workers control’, or even the negotiated reconciliation of worker and consumer interests suggested by schemes such as Parecon. Rather, it implies a far-reaching remaking of the economic systems, including the discontinuation of certain industries, such as industrial fishing and destructive logging, the reshaping of transportation methods, ‘a revolution in the energy-system’ and the drive for a ‘solar communism’ (Lowy, 2009: np).
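[Added note: an invented, minimal sketch of what ‘calculation according to metrics measuring limits, thresholds and gradients’ could look like: each indicator is compared against a hard limit and against its trend. The indicator names, limits, and readings are placeholders, not real measurements or anything proposed by Lowy.]

```python
# Sketch of planning "according to metrics measuring limits, thresholds and
# gradients": each indicator has a hard limit, and is screened both on its current
# level and on its trend (gradient). Names, limits and readings are placeholders.
LIMITS = {"atmospheric_co2_ppm": 450.0, "ocean_ph_drop": 0.3, "forest_loss_pct": 20.0}


def screen(readings: dict[str, list[float]]) -> dict[str, str]:
    """Classify each indicator: 'breached', 'approaching' (rising toward its limit), or 'ok'."""
    status = {}
    for name, series in readings.items():
        level, gradient = series[-1], series[-1] - series[0]
        if level >= LIMITS[name]:
            status[name] = "breached"
        elif gradient > 0:
            status[name] = "approaching"
        else:
            status[name] = "ok"
    return status


print(screen({
    "atmospheric_co2_ppm": [410.0, 421.0],
    "ocean_ph_drop": [0.10, 0.11],
    "forest_loss_pct": [22.0, 21.0],
}))
# {'atmospheric_co2_ppm': 'approaching', 'ocean_ph_drop': 'approaching', 'forest_loss_pct': 'breached'}
```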

[...]

By revealing the contingency of conditions for species survival, and the possibility for their anthropogenic change, such ‘knowledge infrastructures’ of people, artifacts, and institutions (Edwards, 2010: 17) – not just for climate measurement, but also for the monitoring of ocean acidification, deforestation, species loss, fresh water availability – reveal the blind spot of Hayek’s catallaxy in which the very grounds for human existence figure as an arbitrary ‘externality’.

So-called ‘green capital’ attempts to subordinate such bio-data to price signals. It is easy to point to the fallacy of pricing non-linear and catastrophic events: what is the proper tag for the last tiger, or the carbon emission that triggers uncontrollable methane release? But bio-data and bio-simulations also now have to be included in any concept of communist collective planning. Insofar as that project aims at a realm of freedom that escapes the necessity of toil, the common goods it creates will have to be generated with cleaner energy, and the free knowledge it circulates have metabolic regulation as a priority. Issues of the proper remuneration of labor time require integration into ecological calculations.
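[Added note: the contrast drawn here between pricing catastrophe and treating bio-data as constraints can be made concrete in a toy form: candidate plans are first filtered by hard ecological budgets, and only the plans that respect them are ranked by the labour time they require. The plans and budgets below are invented.]

```python
# Sketch of the paragraph's contrast: instead of pricing ecological damage, treat
# bio-data as hard constraints, then rank only the feasible plans by labour time.
CARBON_BUDGET = 100.0          # allowed emissions for the planning period (invented)
WATER_BUDGET = 500.0           # allowed freshwater draw (invented)

PLANS = [
    {"name": "business_as_usual", "labour_hours": 900, "carbon": 160.0, "water": 420.0},
    {"name": "retrofit_transport", "labour_hours": 1200, "carbon": 90.0, "water": 480.0},
    {"name": "solar_buildout", "labour_hours": 1400, "carbon": 60.0, "water": 300.0},
]


def feasible(plan: dict) -> bool:
    """Hard ecological constraints: a breach cannot be bought back at any price."""
    return plan["carbon"] <= CARBON_BUDGET and plan["water"] <= WATER_BUDGET


# Among feasible plans, prefer the one demanding the least socially necessary labour time.
viable = [p for p in PLANS if feasible(p)]
chosen = min(viable, key=lambda p: p["labour_hours"])
print(chosen["name"])          # retrofit_transport
```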

[...]

The Soviet experience, of which the cyberneticians featured in Red Plenty were part, was only a narrow, historically specific and tragic instantiation of this capability, whose authoritarianism occludes the most crucial point in the Marxist concept of planning, namely that it is intended as a means of communally choosing which, of a variety of trajectories, collective human ‘species-becoming’ might follow (Dyer-Witheford, 2004).

A new cybernetic communism, itself one of these options, would, we have seen, involve some of the following elements: use of the most advanced super-computing to algorithmically calculate labour time and resource requirements, at global, regional and local levels, of multiple possible paths of human development; selection from these paths by layered democratic discussion conducted across assemblies that include socialized digital networks and swarms of software agents; light-speed updating and constant revision of the selected plans by streams of big data from production and consumption sources; the passage of increasing numbers of goods and services into the realm of the free or of direct production as use values once automation, copy-left, peer-to-peer commons and other forms of micro-replication take hold; the informing of the entire process by parameters set from the simulations, sensors and satellite systems measuring and monitoring the species metabolic interchange with the planetary environment."
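[Added note: the ‘algorithmic calculation of labour time and resource requirements’ in the first element above is often made concrete, in cyber-planning proposals such as Cockshott and Cottrell's (not cited in this excerpt), as a Leontief input-output computation. Below is a minimal sketch with an invented two-sector technology matrix, labour coefficients, and demand vector.]

```python
# Minimal Leontief-style calculation: given an input-output technology matrix and
# direct labour coefficients, compute the gross output and total labour time needed
# to deliver a target bundle of final goods. All numbers are invented.
import numpy as np

# A[i][j] = units of good i consumed to produce one unit of good j.
A = np.array([[0.1, 0.3],
              [0.2, 0.1]])
labour_per_unit = np.array([2.0, 5.0])   # direct hours per unit of each good
final_demand = np.array([100.0, 50.0])   # the bundle the plan must deliver

# Gross output x must satisfy x = A @ x + final_demand, i.e. x = (I - A)^-1 d.
gross_output = np.linalg.solve(np.eye(2) - A, final_demand)
total_hours = labour_per_unit @ gross_output

print(np.round(gross_output, 1))         # gross production required of each good
print(round(float(total_hours), 1))      # total labour time the plan requires
```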