Protocollary Power

Protocollary Power is a concept developed by Alexander Galloway in his book Protocol to denote the new way power and control are exercised in distributed networks.

See also: Architecture of Control; Computing Regime


Description

From Alexander Galloway in his book Protocol:

"Protocol is not a new word. Prior to its usage in computing, protocol referred to any type of correct or proper behavior within a specific system of conventions. It is an important concept in the area of social etiquette as well as in the fields of diplomacy and international relations. Etymologically it refers to a fly-leaf glued to the beginning of a document, but in familiar usage the word came to mean any introductory paper summarizing the key points of a diplomatic agreement or treaty.

However, with the advent of digital computing, the term has taken on a slightly different meaning. Now, protocols refer specifically to standards governing the implementation of specific technologies. Like their diplomatic predecessors, computer protocols establish the essential points necessary to enact an agreed-upon standard of action. Like their diplomatic predecessors, computer protocols are vetted out between negotiating parties and then materialized in the real world by large populations of participants (in one case citizens, and in the other computer users). Yet instead of governing social or political practices as did their diplomatic predecessors, computer protocols govern how specific technologies are agreed to, adopted, implemented, and ultimately used by people around the world. What was once a question of consideration and sense is now a question of logic and physics.

To help understand the concept of computer protocols, consider the analogy of the highway system. Many different combinations of roads are available to a person driving from point A to point B. However, en route one is compelled to stop at red lights, stay between the white lines, follow a reasonably direct path, and so on. These conventional rules that govern the set of possible behavior patterns within a heterogeneous system are what computer scientists call protocol. Thus, protocol is a technique for achieving voluntary regulation within a contingent environment.

These regulations always operate at the level of coding--they encode packets of information so they may be transported; they code documents so they may be effectively parsed; they code communication so local devices may effectively communicate with foreign devices. Protocols are highly formal; that is, they encapsulate information inside a technically defined wrapper, while remaining relatively indifferent to the content of information contained within. Viewed as a whole, protocol is a distributed management system that allows control to exist within a heterogeneous material milieu.

It is common for contemporary critics to describe the Internet as an unpredictable mass of data--rhizomatic and lacking central organization. This position states that since new communication technologies are based on the elimination of centralized command and hierarchical control, it follows that the world is witnessing a general disappearance of control as such.

This could not be further from the truth. I argue in this book that protocol is how technological control exists after decentralization. The "after" in my title refers to both the historical moment after decentralization has come into existence, but also--and more important--the historical phase after decentralization, that is, after it is dead and gone, replaced as the supreme social management style by the diagram of distribution." (http://rhizome.org/discuss/view/12004)
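
Galloway's description of protocols as formal wrappers that remain "relatively indifferent to the content of information contained within" can be made concrete with a short sketch. The following Python fragment is illustrative only: the frame layout and field names are invented, but it shows how a technically defined header can govern transport while saying nothing about what the payload means.

 import struct

 # A toy frame format: a fixed 8-byte header (version, message type,
 # reserved field, payload length) wraps an arbitrary payload.
 HEADER = struct.Struct("!BBHI")  # network byte order: byte, byte, short, int

 def encapsulate(payload: bytes, version: int = 1, msg_type: int = 0) -> bytes:
     """Prefix the payload with a technically defined wrapper."""
     return HEADER.pack(version, msg_type, 0, len(payload)) + payload

 def decapsulate(frame: bytes) -> bytes:
     """Parse the wrapper and return the payload, content unexamined."""
     _version, _msg_type, _reserved, length = HEADER.unpack_from(frame)
     return frame[HEADER.size:HEADER.size + length]

 frame = encapsulate(b"any content at all")
 assert decapsulate(frame) == b"any content at all"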


Citations on Design as a function of Protocollary Power

Mitch Ratcliffe:

"Yes, networks are grown. But the medium they grow in, in this case the software that supports them, is not grown but designed & architected. The social network ecosystem of the blogosphere was grown, but the blog software that enabled it was designed. Wikis are a socially grown structure on top of software that was designed. It's fortuitous that the social network structures that grew on those software substrates turn out to have interesting & useful properties.

With a greater understanding of which software structures lead to which social network topologies & what the implications are for the robustness, innovativeness, error correctiveness, fairness, etc. of those various topologies, software can be designed that will intentionally & inevitably lead to the growth of political social networks that are more robust, innovative, fair & error correcting." (http://www.greaterdemocracy.org/archives/000471.html)
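
As a rough illustration of Ratcliffe's claim, the sketch below grows two toy networks under different linking rules coded into the "software": unrestricted attachment, where already-popular nodes keep attracting ties, and a hard cap on connections. The rules and parameters are invented for the purpose, and the third-party networkx library is assumed.

 from typing import Optional
 import random

 import networkx as nx  # third-party: pip install networkx

 random.seed(1)

 def grow(cap: Optional[int], n: int = 300) -> nx.Graph:
     """Grow a network one node at a time. Newcomers attach to an
     existing node chosen in proportion to its degree (rich get
     richer), unless the software caps how many ties a node may hold."""
     g = nx.Graph()
     g.add_edge(0, 1)
     for newcomer in range(2, n):
         candidates = [v for v, d in g.degree() if cap is None or d < cap]
         weights = [g.degree(v) for v in candidates]
         g.add_edge(newcomer, random.choices(candidates, weights=weights)[0])
     return g

 for cap in (None, 5):
     print(f"cap={cap}: max degree = {max(d for _, d in grow(cap).degree())}")

The uncapped rule produces hubs; the capped rule flattens the topology. The point is not the specific numbers but that a one-line difference in the design determines which social structure can grow at all.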


Mitch Kapor on 'Politics is Architecture'

"“politics is architecture": The architecture (structure and design) of political processes, not their content, is determinative of what can be accomplished. Just as you can’t build a skyscraper out of bamboo, you can’t have a participatory democracy if power is centralized, processes are opaque, and accountability is limited." (http://blog.kapor.com/?p=29)



Power in Networks: Virtual Location (V. Krebs)

"In social networks, location is determined by your connections and the connections of those around you – your virtual location.

Two social network measures, Betweenness and Closeness, are particularly revealing of a node’s advantageous or constrained location in a network. The values of both metrics are dependent upon the pattern of connections that a node is embedded in. Betweenness measures the control a node has over what flows in the network – how often is this node on the path between other nodes? Closeness measures how easily a node can access what is available via the network – how quickly can this node reach all others in the network? A combination where a node has easy access to others, while controlling the access of other nodes in the network, reveals high informal power." (http://www.orgnet.com/PowerInNetworks.pdf)
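
Both measures Krebs names are standard in network analysis and easy to compute. Below is a minimal sketch using the third-party networkx library on an invented five-node graph, where node "B" is the sole bridge between "A" and the C-D-E cluster and therefore scores high on both metrics.

 import networkx as nx  # third-party: pip install networkx

 # An invented five-node network: "B" bridges "A" to the C-D-E cluster.
 G = nx.Graph([("A", "B"), ("B", "C"), ("B", "D"),
               ("C", "D"), ("D", "E"), ("C", "E")])

 betweenness = nx.betweenness_centrality(G)  # control over what flows
 closeness = nx.closeness_centrality(G)      # ease of reaching the rest

 for node in sorted(G):
     print(f"{node}: betweenness={betweenness[node]:.2f}  "
           f"closeness={closeness[node]:.2f}")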


Fred Stutzman on Pseudo-Governmental Decisions in Social Software


"When one designs social software, they are forced to make pseudo-governmental decisions about how the contained ecosystem will behave. Examples of these decisions include limits on friending behavior, limits on how information in a profile can be displayed, and how access to information is restricted in the ecosystem. These rules create and inform the structural aspects of the ecosystem, causing participants in the ecosystem to behave a specific way.

As we use social software more, and social software more neatly integrates with our lives, a greater portion of our social rules will come to be enforced by the will of software designers. Of course, this isn't new - when we elected to use email, we agree to buy into the social consequences of email. Perhaps because we are so used to making tradeoffs when we adopt social technology, we don't notice them anymore. However, as social technology adopts a greater role in mediating our social experience, it will become very important to take a critical perspective in analyzing how the will of designers change us." (http://chimprawk.blogspot.com/2006/10/colonialist-perspective-in-social.html)
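
As a concrete, if invented, illustration: a friending limit of the kind Stutzman mentions is only a few lines of code, yet it quietly legislates how large anyone's social circle may grow. All names and the ceiling below are hypothetical.

 # A hypothetical platform rule: nobody may hold more than MAX_FRIENDS ties.
 MAX_FRIENDS = 5000  # an arbitrary ceiling set by the designers

 class Profile:
     def __init__(self, name: str):
         self.name = name
         self.friends: set = set()

     def add_friend(self, other: "Profile") -> bool:
         # The rule is enforced silently by code, not negotiated socially.
         if len(self.friends) >= MAX_FRIENDS or len(other.friends) >= MAX_FRIENDS:
             return False  # the ecosystem's "law" forbids the tie
         self.friends.add(other.name)
         other.friends.add(self.name)
         return True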


Philippe Zarifian on the Two Faces of the Control Society

Philippe Zarifian's citation suggests that protocollary power is related to a shift within the 'Society of Control': from disciplinary control to the control of engagement.

"Gilles Deleuze, commentant Foucault, a développé une formidable intuition : nous basculons, disait-il, de la société disciplinaire dans la société de contrôle. Ou, pour dire les choses de manière légèrement différente, de la société de contrôle disciplinaire à la société de contrôle d'engagement . Sous une première face, on pourra interpréter ce contrôle comme une forme d'exercice d'un pouvoir de domination, d'un pouvoir structurellement inégalitaire, agissant de manière instrumentale sur l'action des autres. Ce contrôle d'engagement se distingue, en profondeur, du contrôle disciplinaire en ce qu'il n'impose plus le moule des "tâches", de l'assignation à un poste de travail, de l'enfermement dans la discipline d'usine. Il n'enferme plus, ni dans l'espace, ni dans le temps. Il cesse de se présenter comme clôture dans la cellule d'une prison, elle-même placée sous constante surveillance. Selon l'intuition de Deleuze, on passe du moule à la modulation, de l'enfermement à la circulation à l'air libre, de l'usine à la mobilité inter-entreprises. Tout devient modulable : le temps de travail, l'espace professionnel, le lien à l'entreprise, les résultats à atteindre, la rémunération… La contractualisation entre le salarié et l'employeur cesse elle-même d'être rigide et stable. Elle devient perpétuellement renégociable. Tout est en permanence susceptible d'être remis en cause, modifié, altéré." (http://perso.wanadoo.fr/philippe.zarifian/page109.htm)


Examples

Social Values in Technical Code:

"In a paper about the hacker community, Hannemyr compares and contrasts software produced in both open source and commercial realms in an effort to deconstruct and problematize design decisions and goals. His analysis provides us with further evidence regarding the links between social values and software code. He concludes:


"Software constructed by hackers seem to favor such properties as flexibility, tailorability, modularity and openendedness to facilitate on-going experimentation. Software originating in the mainstream is characterized by the promise of control, completeness and immutability" (Hannemyr, 1999).

To bolster his argument, Hannemyr outlines the striking differences between document mark-up languages (like HTML and Adobe PDF), as well as various word processing applications (such as TeX and Emacs versus Microsoft Word) that have originated in open and closed development environments. He concludes that "the difference between the hacker’s approach and those of the industrial programmer is one of outlook: between an agoric, integrated and holistic attitude towards the creation of artifacts and a proprietary, fragmented and reductionist one" (Hannemyr, 1999). As Hannemyr’s analysis reveals, the characteristics of a given piece of software frequently reflect the attitude and outlook of the programmers and organizations from which it emerges" (http://www.firstmonday.org/issues/issue8_10/jesiek/)


Social Media: Re-introducing centralization through the back door

Armin Medosch:

"n media theory much has been made of the one-sided and centralised broadcast structure of television and radio. the topology of the broadcast system, centralised, one-to-many, one-way, has been compared unfavourable to the net, which is a many-to-many structure, but also one-to-many and many-to-one, it is, in terms of a topology, a highly distributed or mesh network. So the net has been hailed as finally making good on the promise of participatory media usage. What so called social media do is to re-introduce a centralised structure through the backdoor. While the communication of the users is 'participatory' and many-to-many, and so on and so forth, this is organised via a centralised platform, venture capital funded, corporately owned. Thus, while social media bear the promise of making good on the emancipatory power of networked communication, in fact they re-introduce the producer-consumer divide on another layer, that of host/user. they perform a false aufhebung of the broadcast paradigm. Therefore I think the term prosumer is misleading and not very useful. while the users do produce something, there is nothing 'pro' as in professional in it.

This leads to a second point. The conflict between labour and capital has played itself out via mechanization and rationalization, scientific management and its refinement, such as the scientific management of office work, the proletarisation of wrongly called 'white collar work', the replacement of human labour by machines in both the factory and the office, etc. What this entailed was an extraction of knowledge from the skilled artisan, the craftsman, the high level clerk, the analyst, etc., and its formalisation into an automated process, whereby this abstraction decidedly shifts the balance of power towards management. Now what happened with the transition from Web 1.0 to 2.0 is a very similar process. Remember the static homepage in html? You needed to be able to code a bit; actually for many non-geeks it was probably the first satisfactory coding experience ever. You needed to set the links yourself and check the backlinks. Now a lot of that is being done by automated systems. The linking knowledge of freely acting networked subjects has been turned into a system that suggests who you link with and that establishes many relationships involuntarily. It is usually more work getting rid of this than to have it done for you. Therefore Web 2.0 in many ways is actually a dumbing down of people, a deskilling similar to what has happened in industry over the past 200 years.

Wanted to stay short and precise, but need to add, social media is a misnomer. What social media would be are systems that are collectively owned and maintained by their users, that are built and developed according to their needs and not according to the needs of advertisers and sinister powers who are syphoning off the knowledge generated about social relationships in secret data mining and social network analysis processes.

So there is a solution, one which I continue to advocate: let's get back to creating our own systems, let's use free and open source software for server infrastructures, and let's socialise via a decentralised landscape of smaller and bigger hubs that are independently organised, rather than feeding the machine ..." (IDC mailing list, Oct 31, 2009)

Discussion

Protocollary Power and P2P

Michel Bauwens on the P2P aspects of Protocollary Power:

The P2P era indeed adds a new twist, a new form of power, which we have called Protocollary Power, and which was first clearly identified and analyzed by Alexander Galloway in his book Protocol. We have already given some examples. One is the fact that the blogosphere has devised mechanisms to avoid the emergence of individual and collective monopolies, through rules that are incorporated in the software itself. Another was whether the entertainment industry would succeed in incorporating software- or hardware-based restrictions to enforce their version of copyright. There are many other similarly important evolutions to monitor: Will the internet remain a point-to-point structure? Will the web evolve into a true P2P medium through Writeable Web developments?

The common point is this: social values are incorporated, integrated in the very architecture of our technical systems, either in the software code or the hardwired machinery, and these then enable/allow or prohibit/discourage certain usages, thereby becoming a determinant factor in the type of social relations that are possible. Are the algorithms that determine search results objective, or manipulated for commercial and ideological reasons? Is parental control software driven by censorship rules that serve a fundamentalist agenda? Many issues are dependent on hidden protocols, which the user community has to learn to see (as a new form of media literacy and democratic practice), so that they can become an object of conscious development, favoring peer to peer processes rather than restrictive and manipulative command and control systems.

In P2P systems, the formal rules governing bureaucratic systems are replaced by the design criteria of our new means of production, and this is where we should focus our attention. Galloway suggests that we make a diagram of the networks we participate in, with dots and lines, nodes and edges. Important questions then become: Who decides who can participate? Or better: what are the implied rules governing participation (since there is no specific 'who' or command in a distributed environment)? What kind of linkages are possible?

On the example of the internet, Galloway shows how the net has a peer to peer protocol in the form of TCP/IP, but that the Domain Name System is hierarchical, and that an authoritative server could block a domain family from operating. This is how power should be analyzed. Such power is not per se negative, since protocol is needed to enable participation (no driving without a highway code!), but protocol can also be centralized, proprietary, secret, in that case subverting peer to peer processes. However, the stress on protocol, which concerns what Yochai Benkler calls the 'logical layer' of the networks, should not make us forget the power distribution of the physical layer (who owns the networks) and the content layer (who owns and controls the content).
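
Galloway's TCP/IP versus DNS contrast can be observed directly. The sketch below, which assumes the third-party dnspython package is installed, asks a root name server about a domain: the root returns no answer for the name itself, only a referral down the tree, so whoever controls a parent zone could in principle withhold that referral for an entire domain family.

 import dns.message
 import dns.query
 import dns.rdatatype  # third-party: pip install dnspython

 ROOT = "198.41.0.4"  # a.root-servers.net

 # Ask the root directly for an address record.
 query = dns.message.make_query("p2pfoundation.net.", dns.rdatatype.A)
 response = dns.query.udp(query, ROOT, timeout=5)

 # The root holds no answer for the name itself; its AUTHORITY section
 # merely delegates to the .net servers, which delegate further down.
 print("answer records:", len(response.answer))   # typically 0
 for rrset in response.authority:
     print("delegation:", rrset.name, "->", rrset[0])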


Protocols are Designed by People

Harry Halpin:

"Galloway is correct to point out that there is control in the internet, but instead of reifying the protocol or even network form itself, an ontological mistake that would be like blaming capitalism on the factory, it would be more suitable to realise that protocols embody social relationships. Just as genuine humans control factories, genuine humans – with names and addresses – create protocols. These humans can and do embody social relations that in turn can be considered abstractions, including those determined by the abstraction that is capital. But studying protocol as if it were first and foremost an abstraction without studying the historic and dialectic movement of the social forms which give rise to the protocols neglects Marx’s insight that

Technologies are organs of the human brain, created by the human hand; the power of knowledge, objectified.[8]

Bearing protocols’ human origination in mind, there is no reason why they must be reified into a form of abstract control when they can also be considered the solution to a set of problems faced by individuals within particular historical circumstances. If they now operate as abstract forms of control, there is no reason why protocols could not also be abstract forms of collectivity. Instead of hoping for an exodus from protocols by virtue of art, perhaps one could inspect the motivations, finances, and structure of the human agents that create them in order to gain a more strategic vantage point. Some of these are hackers, while others are government bureaucrats or representatives of corporations – although it would seem that hackers usually create the protocols that actually work and gain widespread success. To the extent that those protocols are accepted, this class that I dub the ‘immaterial aristocracy’ governs the net. It behoves us to inspect the concept of digital sovereignty in order to discover which precise body or bodies have control over it." (http://www.metamute.org/en/Immaterial-Aristocracy-of-the-Internet)


There is no Openness without secrecy and control!

Tim Leberecht insists, taking Wikileaks as a case study, that there is no openness without secrecy:

“an ecosystem on the Social Web could be seen as a system in permanent crisis – it is always in flux, and its composition and value are constantly threatened by a multitude of forces, from the inside and the outside. What if we understood “designing for the loss of control” as designing for structures that are in a permanent crisis? Crises are essentially disruptions that shock the system. They are deviations from routines, and the very variance that the advocates of planning and programs (the “Push” model) so despise. At their own peril, because they fail to realize that variance is the mother of all meaning; it is variance that challenges the status quo, pulls people and their passions towards you, and propels innovation. “Designing for the loss of control” means designing for variance.

One system in permanent crisis that contains a high level of variance is WikiLeaks. The most remarkable thing about the site appears to be the dichotomy between the uncompromised transparency it aims at and the radical secrecy it requires to do so. The same organization that depends on the loss of control for its content very much depends on a highly controlled environment to protect itself and keep operating effectively. But not just that: Ironically, secrecy is also a fundamental prerequisite for the appeal of WikiLeaks’ “there are no secrets” claim. Simply put: there is no light without darkness. And there is no WikiLeaks without secrets.

Applied to systems and solutions design, this means that total openness is the antidote to openness. When everything is open, nothing is open. In order to design openness, one of the first decisions designers have to make is therefore to determine what needs to remain closed. This is a strategic task: making negative choices for positive effects. You need to build enough variance into a system to make it “flow” and yet retain some control over the underlying parameters (access, boundaries, authorship, participants, agenda, process, conversation, collaboration, documentation, etc.). Only if you maintain the fundamental ability to at least manage (and modify) the conditions for openness, will you be able to create it. To design for the loss of control, control the parameters that enable it.” (http://designmind.frogdesign.com/blog/openness-or-how-do-you-design-for-the-loss-of-control.html-0)
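
Read as a design brief, Leberecht's list of "underlying parameters" amounts to an explicit configuration that someone must retain the power to modify. The sketch below is a loose illustration with invented names and values, not a real system:

 from dataclasses import dataclass, field

 @dataclass
 class OpennessParameters:
     """The knobs that make designed openness possible at all."""
     access: str = "public"        # who may enter the system
     authorship: str = "anyone"    # who may contribute
     moderation: bool = True       # whether contributions are reviewed
     closed_areas: set = field(default_factory=lambda: {"sources", "infrastructure"})

     def is_open(self, area: str) -> bool:
         # Designing openness starts with deciding what stays closed.
         return area not in self.closed_areas

 params = OpennessParameters()
 print(params.is_open("content"))  # True: the visible system is open
 print(params.is_open("sources"))  # False: secrecy protects the whole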

Key Book to Read

Protocol: How Control Exists after Decentralization. By Alexander Galloway. MIT Press, 2004.


More Information

  1. Democratizing Software: Open Source, the Hacker Ethic, and Beyond. By Brent K. Jesiek. First Monday, Volume 8, Number 10 (October 2003). URL: http://firstmonday.org/issues/issue8_10/jesiek/index.html
  2. The Immaterial Aristocracy of the Net
  3. Digital Sovereignty
  4. Overview of Architectures of Control in the Digital Environment. By Dan Lockton.
  5. Value Sensitive Design
  6. The Politics of Code in Web 2.0. By Ganaele Langlois, Fenwick McKelvey, Greg Elmer, and Kenneth Werbin. Fibreculture Journal, Issue 14. [1]
  7. Social Machines