Planning as Democratization vs Planning as Totalization

From P2P Foundation

Text

Batuhan:


1.

"Friedrich Hayek famously argued that "the curious task of economics is to demonstrate to men how little they really know about what they imagine they can design" (Hayek, 1945). In his view, natural normalizing mechanisms such as money, language, and vote serve as spontaneous orders that enable decentralized coordination without the need for central planning. These instruments provide comparability and stability; they allow dispersed knowledge to be aggregated into workable, if ultimately provisional, metrics like GDP. However, when one attempts to codify all aspects of social life—reducing human existence solely to numbers and measurable outputs—the system becomes both rigid and dehumanized, a point echoed in recent critiques of neoliberal growth ideology.

At the same time, Deleuze disrupts the notion of any “natural” normalizing mechanism. For Deleuze, there is no essential difference between natural and artificial systems; both are products of social processes that are continually deterritorialized and reterritorialized. He emphasizes that the processes of overcoding and decoding constantly reconfigure meaning, thereby ensuring that any static, centralized measure—such as GDP—is inherently limited. In this framework, the idea of a fixed “general intelligence” or a stable, all-encompassing code collapses into a dynamic interplay of multiplicity and flux.

Antonio Negri further complicates this picture by arguing that capitalism itself, through its globalized, networked forms of control, creates conditions in which economic calculation becomes a matter not just of measurement but also of power. Negri’s concept of the “multitude” underscores that collective action and democratic planning cannot be reduced to centralized technocratic decision-making. Rather, the intersubjective negotiation of meaning—what might be called a “social choice” process—is essential for transcending the limitations imposed by static measures like GDP. As Negri contends, democratic practice must remain a “speculative science,” always open to reinterpretation and renewal, rather than a system of fixed rules and predetermined outcomes.

These arguments converge on a critical point: any attempt to reduce the complexity of human and social life solely to formal, quantifiable systems (what some call "codeable knowledge") is doomed to overlook the irreducible, experiential dimensions of existence. As scholars have noted, "the system" that aggregates dispersed knowledge into a single, centralized order inevitably sacrifices the richness of intersubjective dialogue and individual creativity (see also Kuhn's analysis of paradigm shifts and Quine's critique of the analytic–synthetic distinction). In other words, while natural normalizing mechanisms provide a necessary foundation for social coordination, they cannot capture the full scope of human subjectivity and collective agency.

Thus, the contemporary political-economic order, in which GDP, interest rates, and inflation become the sole determinants of societal value, represents not a natural or eternal truth but a historically contingent configuration. This configuration, through its processes of totalization (the aggregation of diverse inputs into standardized outputs), socialization (the intersubjective negotiation of meaning), and democratization (the dispersal of decision-making power), reveals the limitations of any attempt to achieve an absolute, unchanging order. Instead, what we call democracy emerges as a continuous, speculative process: a "flow" of collective negotiation that always leaves room for dissent, rearticulation, and the transformation of both economic and social norms.

In sum, while Hayek's spontaneous order relies on natural normalizing mechanisms to sustain comparability and coordination, Deleuze and Negri remind us that these systems are inherently unstable and subject to overcoding. They argue that true democratic planning, which does not merely reproduce technocratic control but rather remains open to continual reinterpretation and collective input, is the only way to approach the problem of incompleteness that Gödel and others have exposed in formal systems. Democracy, then, is not the static outcome of a perfected measurement system but the dynamic process of negotiating meaning and power in a perpetually unfinished world. This synthesis emphasizes that while standardized tools provide necessary coordination, true democratic potential lies in the continuous, open-ended renegotiation of meaning, a process that, by its very nature, remains incomplete and subject to change."


2.

""Human epistemic formations operate as dynamic assemblages—what may be conceived through a Deleuzian lens as "machines of knowledge"—which emerge through the contingent confluence of historical sedimentations and material conditions. These epistemic machines do not merely offer ontological coherence but rather function as vectors of becoming, shaped by the interplay of gothic materialism’s dark material substrates and speculative realism’s insistence on the autonomy of the real. The former unveils the oppressive infrastructures that encode and constrain thought, while the latter challenges the primacy of human cognition by affirming the external world's independent operations. In this machinic interplay, the elements composing these knowledge formations are less fixed Kantian norms than fluid, modulating intensities—forces that, rather than stabilizing thought, continuously produce new trajectories of sense-making, deterritorializing and reterritorializing the epistemic landscape.

Further integrating speculative epistemology into this framework, one can conceptualize the relationship between subjective cognition and objective reality as a Deleuzo-Negrian process of constituent power—an immanent field of epistemic production wherein inscription and re-inscription function as machinic assemblages of thought. Knowledge ceases to be a static repository and instead operates as a dynamic interplay of forces, continually deterritorializing and reterritorializing the conditions of intelligibility. These epistemic transactions do not merely reaffirm pre-existing structures but facilitate their continuous becoming, engendering new configurations of sense and collective subjectivity. In this light, knowledge is neither purely given nor transcendent; rather, it is an ongoing, productive multiplicity that unfolds within the material conditions of historical and cognitive assemblages.

Within this paradigm, the normalization mechanisms we have discussed, such as taxation, legal rights, voting systems, and pricing structures, function as regulatory apparatuses that discipline and standardize social interactions. Drawing on Foucault's conceptualization of disciplinary societies and Deleuze's account of the societies of control (Deleuze, 1992), modern state apparatuses extend their reach by encoding these normalizing forces within institutional mechanisms. The nation-state, in this light, operates as both a product and an instrument of epistemic normalization, consolidating its legitimacy through the stabilization of social categories and the perpetuation of normative behaviors.

Moreover, through the lens of universal function approximation, we can distinguish between absolute knowability—akin to global interpolation in natural sciences—and localized epistemic formations, which remain speculative yet transferable. This distinction parallels the divergence between traditional computational approaches and neural network-based learning, where the former relies on deterministic logic while the latter approximates complex patterns through emergent adaptability. In the broader context of our discussion, this analogy underscores the tension between rigid, state-imposed normalization and more fluid, decentralized epistemic structures that allow for the multiplicity of knowledge paradigms.

Ultimately, synthesizing these perspectives reveals that while epistemic structures offer stability, they simultaneously condition the possibility of transformation. The bridges of knowledge we construct are not immutable but rather contingent and subject to the recursive processes of reinterpretation and rearticulation. It is within this dynamic interplay that the potential for epistemic pluralism and radical diversity can be realized, challenging the totalizing imperatives of normalization while fostering new modes of collective and individual cognition."
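
To make the function-approximation analogy in the second excerpt concrete, here is a minimal editorial sketch (not part of the quoted text) contrasting the two modes of computation it describes: an explicitly codified, deterministic rule versus a small neural network that only approximates the same pattern from local samples. Python with NumPy is assumed, and the target function, network width, learning rate, and step count are illustrative choices rather than anything prescribed by the passage.

 import numpy as np
 
 rng = np.random.default_rng(0)
 
 # Deterministic logic: the rule is fully codified and globally valid.
 def exact(x):
     return np.sin(x)
 
 # Universal function approximation: one hidden tanh layer, trained only
 # on samples from [-pi, pi]; its "knowledge" is emergent and approximate.
 n_hidden = 32
 W1 = rng.normal(0.0, 1.0, (1, n_hidden))
 b1 = np.zeros(n_hidden)
 W2 = rng.normal(0.0, 1.0, (n_hidden, 1))
 b2 = np.zeros(1)
 
 x_train = rng.uniform(-np.pi, np.pi, (256, 1))
 y_train = exact(x_train)
 
 lr = 0.05
 for step in range(5000):
     h = np.tanh(x_train @ W1 + b1)     # forward pass
     y_hat = h @ W2 + b2
     err = y_hat - y_train              # mean-squared-error gradient
     gW2 = h.T @ err / len(x_train)     # backward pass, plain gradient descent
     gb2 = err.mean(axis=0)
     gh = (err @ W2.T) * (1.0 - h ** 2)
     gW1 = x_train.T @ gh / len(x_train)
     gb1 = gh.mean(axis=0)
     W2 -= lr * gW2
     b2 -= lr * gb2
     W1 -= lr * gW1
     b1 -= lr * gb1
 
 # Compare the codified rule with the learned approximation.
 x_test = np.linspace(-np.pi, np.pi, 5).reshape(-1, 1)
 approx = np.tanh(x_test @ W1 + b1) @ W2 + b2
 for x, a in zip(x_test.ravel(), approx.ravel()):
     print(f"x={x:+.2f}  exact={exact(x):+.4f}  network={a:+.4f}")

Inside the sampled interval the network's outputs track the exact rule closely; outside it the approximation degrades. That behavior mirrors the excerpt's distinction between absolute knowability (the global rule) and localized epistemic formations that remain speculative yet transferable.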


More information

References:

Hayek, F. A. (1945). "The Use of Knowledge in Society". American Economic Review, 35(4), 519–530.

Hayek, F. A. (1988). The Fatal Conceit: The Errors of Socialism. University of Chicago Press.

Kuhn, T. S. (1962). The Structure of Scientific Revolutions. University of Chicago Press.

Quine, W. V. (1951). "Two Dogmas of Empiricism". The Philosophical Review, 60(1), 20–43.

Deleuze, G. (1992). "Postscript on the Societies of Control". October, 59, 3–7.

Deleuze, G. (1994). Difference and Repetition (P. Patton, Trans.). Columbia University Press.

Hardt, M., & Negri, A. (2000). Empire. Harvard University Press.