Cybernetic Planning
Discussion
The history of Cybernetic Planning
Nick Dyer-Witheford:
"If central planning suffered from a calculation problem, why not just solve it with real calculation machines? This was precisely the point made by Hayek’s opponent, the economist Oskar Lange, who, retrospectively reviewing the ‘socialist calculation’ debate, remarked: ‘today my task would be much simpler. My answer to Hayek … would be: so what’s the trouble? Let us put the simultaneous equations on an electronic computer and we shall obtain the solution in less than a second’ (1967: 159). Such was the project of the cyberneticians featured in Red Plenty, a project driven by the realization that the apparently successful Soviet industrial economy, despite its triumphs in the 1940s and ‘50s, was slowly stagnating amidst organizational incoherence and informational bottlenecks.
Their effort depended on a conceptual tool, the input-output table, whose development is associated with two Russian mathematicians: the émigré Wassily Leontief, who worked in the US, and the Soviet Union’s Kantorovich, the central protagonist of Red Plenty. Input-output tables – which, it was recently discovered, are amongst the intellectual foundations of Google’s PageRank algorithm (Franceschet, 2010) – chart the complex interdependence of a modern economy by showing how outputs from one industry (e.g. steel or cotton) provide inputs for another (say, cars or clothing), so that one can estimate the change in demand resulting from a change in production of final goods. By the 1960s such tables were an accepted instrument of large-scale industrial organizations: Leontief’s work played a role in the logistics of the US Air Force’s massive bomber offensive against Germany. However, the complexity of an entire national economy was believed to preclude their application at such a level.
Soviet computer scientists set out to surmount this problem. As early as the 1930s, Kantorovich had improved input-output tables with the mathematical method of linear programming that estimated the best, or ‘optimizing’, combination of production techniques to meet a given target. The cyberneticians of the 1960s aimed to implement this breakthrough on a massive scale by establishing a modern computing infrastructure to rapidly carry out the millions of calculations required by Gosplan, the State Board for Planning that oversaw economic five-year plans. After a decade of experimentation, their attempt collapsed, frustrated by the pitiful state of the Soviet computer industry – which, being some two decades behind that of the US, missed the personal computer revolution and did not develop an equivalent to the Internet. It was thus utterly inadequate to the task set for it. This inadequacy, alongside political opposition from a nomenklatura that saw in the new scientific planning method a threat to its bureaucratic power, compelled abandonment of the project (Castells, 2000; Gerovitch, 2008; Peters, 2012).
This was not the only twentieth century project of ‘cybernetic revolutionaries’; as remarkable was the attempt by Salvador Allende’s Chilean regime to introduce a more decentralized version of electronic planning, ‘Project Cybersyn’ (Medina, 2005). Led by the British cybernetician Stafford Beer, this was conceived as a system of communication and control that would enable the socialist regime to collect economic data, and relay it to government decision makers, even while embedding within its technology safeguards against state micro-management and encouragement for many-sided discussions of planning decisions. This was an attempt at socio-technical engineering of democratic socialism that today perhaps seems more attractive than the post-Stalinist manoeuvres of the Soviet computer planners. But it met an even more brutal fate; Project Cybersyn was extinguished in the Pinochet coup of 1973. In the end the failure of the USSR to adapt to a world of software and networks contributed to its economic/military defeat by the United States. Its disintegration, in which, as Alec Nove (1983) demonstrated, information bottlenecks and reporting falsifications played a major role, seemed to vindicate the Austrian economists. Hayek’s praise of market catallaxy thus became central to the ‘neoliberal thought collective’ (Mirowski, 2009) that led the subsequent victory march of global capitalism.
The combined pressure of the practical disaster of the USSR and the theoretical argument of the Austrian school exerted immense force inside what remained of the left, pressuring it to reduce and reset the limit of radical aspiration to, at most, an economy of collectively owned enterprises coordinated by price signals. The many variants on such ‘market socialist’ proposals have evoked rebuttals from Marxists who refuse to concede ground to commodity exchange. Yet these rebuttals, perhaps because they tacitly grant the market the automatic information-processing functions ascribed to it by the Austrian economists and market socialists, may address issues of technological innovation or public data availability but do not seem to engage deeply with the potentialities of contemporary computing.
Today, post-crash, claims that markets are infallible information machines may seem less credible than they did a quarter of a century ago. The parasitic energy-theft that underlies price-signal transmissions (exploitation at the point of production); the inability of individual commodity exchanges to register collective consequences (the so-called ‘externalities’); and the recursivity of a chrematistic system that loops back on itself in financial speculation have all become more salient in the midst of global capital’s economic and ecological implosion."
(http://www.culturemachine.net/index.php/cm/article/view/511/526)
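The input-output logic the quoted passage describes, in which a change in demand for final goods propagates back through every supplying industry, can be sketched numerically. Everything below is invented for illustration: the two-sector table, its coefficients, and the `gross_output` helper are a minimal sketch of the idea, not Leontief's or Kantorovich's actual tables.

```python
# Toy input-output calculation: given final demand d and a table A where
# A[i][j] = units of sector i's output consumed per unit of sector j's
# output, find the gross output x that satisfies x = A x + d.

def gross_output(a, final_demand, tol=1e-10):
    """Solve x = A x + d by the convergent series x = d + A d + A^2 d + ...
    Each term is one more 'round' of indirect demand; the series converges
    for a productive economy, where those rounds shrink toward zero."""
    n = len(final_demand)
    x = final_demand[:]       # start with direct (final) demand
    layer = final_demand[:]   # indirect demand added at each round
    while max(abs(v) for v in layer) > tol:
        layer = [sum(a[i][j] * layer[j] for j in range(n)) for i in range(n)]
        x = [x[i] + layer[i] for i in range(n)]
    return x

# Hypothetical two-sector table (steel, cars): each car needs 0.3 units
# of steel, and steel production consumes 0.1 of its own output.
A = [[0.1, 0.3],   # steel required per unit of (steel, cars)
     [0.0, 0.0]]   # cars required per unit of (steel, cars)

demand = [10.0, 20.0]          # final demand for steel and cars
print(gross_output(A, demand))  # gross steel output exceeds 10: ~17.78
```

The same answer comes in one step as x = (I - A)^-1 d; the series form just makes visible the successive rounds of indirect demand that a planner's table encodes.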
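Kantorovich's linear programming, as the passage summarises it, picks the ‘optimizing’ combination of production activities under resource limits. The following is a minimal sketch with wholly hypothetical numbers (two goods, labour and steel ceilings); a brute-force check of the feasible region's corner points stands in for the simplex-style methods actually used.

```python
# Toy linear programme: maximise the value c.(x, y) of a two-good plan
# subject to resource constraints a*x + b*y <= r and x, y >= 0.

def best_plan(c, constraints):
    """Enumerate every vertex of the feasible polygon (pairwise line
    intersections, including the axes) and return the best feasible one.
    The optimum of a linear programme always lies at such a vertex."""
    lines = list(constraints) + [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # add x=0, y=0
    candidates = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            (a1, b1, r1), (a2, b2, r2) = lines[i], lines[j]
            det = a1 * b2 - a2 * b1
            if abs(det) < 1e-12:
                continue  # parallel lines: no vertex
            candidates.append(((r1 * b2 - r2 * b1) / det,
                               (a1 * r2 - a2 * r1) / det))
    eps = 1e-9
    feasible = [(x, y) for x, y in candidates
                if x >= -eps and y >= -eps
                and all(a * x + b * y <= r + eps for a, b, r in constraints)]
    return max(feasible, key=lambda p: c[0] * p[0] + c[1] * p[1])

# Hypothetical plan targets: maximise 3x + 2y, with labour 2x + y <= 100
# and steel x + 3y <= 90.
plan = best_plan((3.0, 2.0), [(2.0, 1.0, 100.0), (1.0, 3.0, 90.0)])
print(plan)  # -> (42.0, 16.0)
```

Brute-force vertex checking only scales to a handful of goods; the point of the Soviet cyberneticians' project was precisely that a real plan involves millions of such variables, hence the need for serious computing infrastructure.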