P2P and Human Evolution Ch 5


5. "Network Theory" or: The Discovery of P2P principles in the Cosmic Sphere

Chapter 5 of P2P and Human Evolution



5.1.A Distributed networks and 'Small World' theory

Note the difference in the above chapter title. Here we are not speaking of emergence, but rather of the recognition or discovery of principles within the natural world which obey P2P principles. They were always already there, but we have only recently learned to see them. Technology reflects, to a certain extent, humanity's growing knowledge of the natural world, and technological artifacts and processes integrate and embed this growing knowledge within their protocols. Lately, we have learned to see the natural (physical, biological, cognitive) world quite differently from before: no longer as a set of simple mechanisms or hierarchies, but as networks. Thus, the fact that engineers, software architects, and social network managers are devising and implementing more and more P2P systems also reflects this new understanding. Studies of distributed intelligence in physical systems, of the swarming behavior of social insects, and of the 'wisdom of crowds' and collective intelligence in the human field show that in many situations participative distributed systems function more efficiently than command-and-control systems, which create bottlenecks. In natural systems, true centralized and hierarchic command-and-control systems seem rather rare.

Though there can be said to exist hierarchies in nature, such as successions of progressively more enfolding systems, and many pyramidal systems of command and control in human society, the former are better called 'holarchies', as actual command-and-control systems are quite rare. More common is the existence of multiple agents which, through their interactions, create emergent coherent orders and behavior. The brain, for example, has been shown to be a rather egalitarian network of neurons, with no evidence of a command centre.[1] There are of course multiple scientific fields where this has now been shown to be the case. Network theory is therefore focused on the interrelationships of equipotent, distributed agents, and on how complex systems arise from them. It is a form of systemic reductionism which focuses on the interaction of agents without looking much at their 'personal' characteristics, yet it is remarkably successful in explaining the behaviour of many systems. Thus, if historians are starting to look at the world in terms of flows, social science in general is increasingly looking at its objects of study in terms of social network analysis.[2]

An important contribution is the work of Alexander R. Galloway in "Protocol", because he clearly makes the important distinction between 'decentralised' and 'distributed' networks. First we had centralized networks. In this format, all links between nodes must go through the centre, which has to authorize or enable them. Think of mainframe computers with dumb terminals, or the central switches in telephone systems. In a second phase, networks are decentralized, which means the centre is broken up into several subcentres. Here, linkages and actions between nodes must still pass through one of these subcentres. An example is the American airport system, organized around hubs such as Atlanta: to go from one regional city to another, you must pass through such a hub. In distributed networks, such as the network of interstate highways or the internet, this requirement no longer applies. Hubs, i.e. nodes that carry more links than others, may exist, but they are optional and grow organically; they are not obligatory or designed beforehand. Abstract network theory, seeing hubs in both cases, may miss this important point. Peer to peer is the relational dynamic of distributed networks! A distributed network may or may not be an egalitarian network (see just below).
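The distinction can be made concrete with a small sketch. The following is a minimal, illustrative fragment in plain Python (the node names and topologies are invented for illustration): it checks whether outlying nodes can still reach one another once the centre, or a subcentre, is removed. Only the distributed mesh survives such a removal.

    from collections import deque

    def reachable(graph, start, goal, removed=()):
        """Breadth-first search from start to goal, ignoring 'removed' nodes."""
        if start in removed or goal in removed:
            return False
        seen, queue = {start}, deque([start])
        while queue:
            node = queue.popleft()
            if node == goal:
                return True
            for neighbour in graph.get(node, ()):
                if neighbour not in seen and neighbour not in removed:
                    seen.add(neighbour)
                    queue.append(neighbour)
        return False

    # Centralized: every link passes through the single centre "C".
    centralized = {"C": ["A", "B", "D"], "A": ["C"], "B": ["C"], "D": ["C"]}

    # Decentralized: two subcentres ("H1", "H2") relay all traffic.
    decentralized = {"H1": ["A", "B", "H2"], "H2": ["H1", "D", "E"],
                     "A": ["H1"], "B": ["H1"], "D": ["H2"], "E": ["H2"]}

    # Distributed: a mesh in which hubs are optional, not obligatory.
    distributed = {"A": ["B", "D"], "B": ["A", "E"],
                   "D": ["A", "E"], "E": ["B", "D"]}

    print(reachable(centralized, "A", "B", removed={"C"}))     # False
    print(reachable(decentralized, "A", "D", removed={"H1"}))  # False
    print(reachable(distributed, "A", "E", removed={"B"}))     # True, via D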

Nexus is a book by Mark Buchanan that summarises network theory investigations for the lay public, focusing on small-world networks. These differ from ordered networks such as regular lattices, where it takes many steps to go from one node to another; small-world networks are characterized by a low degree of separation. Typically, human society is marked by no more than six degrees of separation: it rarely takes more than six intermediaries to contact any other person on the planet. Such networks come in two varieties: 1) aristocratic networks, where large hubs and connectors are responsible for linking the network together as a whole; and 2) egalitarian networks, where the nodes have largely the same number of links, but while the majority has strong ties to a few surrounding nodes with whom they interact a lot, a minority has weak ties with faraway nodes, and it is these weak ties that hold the network together and rapidly move information from one local or affinity group to a different one. Each form has its strengths and weaknesses: aristocratic networks are very resilient against random failures, but vulnerable when their connectors are attacked, while egalitarian networks are more vulnerable to random disruption.
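Both varieties are easy to generate and compare, assuming the third-party networkx library: a Watts-Strogatz graph stands in for the egalitarian small world (similar degrees plus a few rewired long-range ties), and a Barabási-Albert graph for the aristocratic, hub-dominated one. The parameters below are arbitrary illustrative choices.

    import networkx as nx  # third-party library, assumed installed
    from statistics import median

    # Egalitarian small world: a ring lattice with a few long-range rewirings.
    egalitarian = nx.connected_watts_strogatz_graph(n=1000, k=6, p=0.05, seed=42)
    # Aristocratic small world: preferential attachment grows large hubs.
    aristocratic = nx.barabasi_albert_graph(n=1000, m=3, seed=42)

    for name, g in (("egalitarian", egalitarian), ("aristocratic", aristocratic)):
        degrees = [d for _, d in g.degree]
        print(name,
              "separation:", round(nx.average_shortest_path_length(g), 2),
              "median degree:", median(degrees),
              "max degree:", max(degrees))
    # Both show a low degree of separation, but only the aristocratic
    # network has hubs whose degree dwarfs the median.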

5.1.B Equipotentiality vs. the Power Law

One of the most interesting findings is the existence of a power law. A power law says that for any increase in the number of links per node (or in some other characteristic per node, such as drainage area for a river basin), the number of nodes having that characteristic declines by a fixed factor. In economics this gives us the famous Pareto principle, i.e. 20% of the people holding 80% of the wealth. But the power law appears nearly everywhere, suggesting a natural form of concentration and even monopolization as almost inevitable. In fact, it seems that whenever we have many choices and many distributed agents making those choices, inequality of choice is created.[3] This seems to be the natural result of any 'economy of attention'. But that is the point: such a distribution is not forced, as in an oligopoly or monopoly, but arises naturally from the freedom of choice, and can be considered a 'fair' result, provided no coercion is used. Networks where such a power law operates are called 'scale-free', because at whatever scale, the same relation between variables (i.e. distribution pattern) applies.
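As a minimal numerical sketch (Python with numpy; the exponent 1.16 is the textbook value that yields an 80/20 split, not a figure from this text), one can sample 'wealth' from a Pareto distribution and measure the share held by the top fifth:

    import numpy as np  # third-party library, assumed installed

    rng = np.random.default_rng(0)
    # Classic Pareto with minimum 1; numpy's pareto() is the shifted (Lomax)
    # variant, so we add 1. An exponent of ~1.16 corresponds to the 80/20 rule.
    wealth = rng.pareto(1.16, size=100_000) + 1

    wealth.sort()
    top_fifth = wealth[int(0.8 * len(wealth)):]
    print(f"share held by the top 20%: {top_fifth.sum() / wealth.sum():.0%}")
    # Prints roughly 80%: concentration arises from the distribution itself,
    # without any coercive design.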

In terms of a normative P2P ethos, it is important to note that it does not systematically favour egalitarian networks. The internet and the web are both aristocratic networks; the blogosphere is characterized by a power-law distribution. The key questions are:

  1. Is the network efficient?
  2. Does it enable participation?
  3. Is the emergence of an aristocratic structure non-coercive and, if need be, reversible?

Focusing on this reversibility is probably one of the tasks of peer governance. Granting that a power law may be in operation does not mean we must acquiesce in social processes that reinforce such inequalities; rather, we should look for human and technical/algorithmic solutions that render the structure fluid enough that it can be reversed if need be. But in many cases we have to admit that some form of centralization is necessary and efficient. We all prefer one standard for our operating systems, for example.

The power law can possibly be mitigated by the development of algorithms that highlight important information and connections from nodes that may not come up 'naturally'; this discipline, though still in its infancy, is making rapid strides and is the core competence of new internet companies like Google, Technorati and others. But the power law is also counteracted by what some network economists have called the 'Long Tail'. This is the phenomenon whereby minority groups are not excluded from the distribution of knowledge, exchange, and markets, but are on the contrary enabled to organize micro-communities. In the market for cultural products, this has the effect of radically enhancing both the supply of and demand for products. Instead of the 80/20 distribution of products, i.e. 20% of the products being responsible for 80% of the sales and profits, we get something more akin to a 50/50 distribution. Online stores like Amazon and eBay are instrumentalising the phenomenon by using affinity-matching schemes, which have resulted in the creation of many thousands of previously nonexistent mini-markets. Books, CDs and films which would be destroyed for lack of interest in the mass media system now get a second and third lease of life through the continued attention given to them by self-organising minority interests. This is an important guarantee for a vibrant cultural life that does not destroy difference and cultural heterogeneity.[4]
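The shift from 80/20 towards 50/50 can be mimicked with a toy rank-size model (pure Python; both exponents are invented for illustration, not measured from any real catalogue): sales at popularity rank r are modelled as r^-s, and a shallower exponent moves revenue from the hits into the tail.

    def head_share(num_products, s):
        """Share of total sales earned by the top 20% of products."""
        sales = [rank ** -s for rank in range(1, num_products + 1)]
        return sum(sales[: num_products // 5]) / sum(sales)

    # A steep, hit-driven curve approximates the classic 80/20 split...
    print(f"s=1.0: top 20% of products earn {head_share(10_000, 1.0):.0%}")
    # ...while a flatter long-tail curve approaches 50/50.
    print(f"s=0.5: top 20% of products earn {head_share(10_000, 0.5):.0%}")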

One of the keys to avoiding the power law may therefore be to keep sub-networks small. One of the recurring debates within cooperation studies indeed concerns the optimal size of online groups. Dunbar, a primatologist formerly at University College London, has posited a link between brain size and our maximum number of close social ties,[5] a claim supported by many animal (especially primate) and anthropological studies. He predicts that around 150 is the "mean group size" for humans, and this number has also been applied to online cooperation. But, extrapolating from group size and time spent grooming in primates, such a number would require an impractically large amount of time spent on social grooming, and in reality the time spent is much less than the primate model predicts. Thus, language may have developed as a more effective method of maintaining the social fabric than literal grooming.

This discussion is important because other researchers, such as Valdis Krebs, have shown that in smaller groups the power law does not operate, and that such groups function as egalitarian networks.[6] The key therefore is to organize online collaboration so that it is divided into appropriately small subgroups, and this seems to be pretty much how software peer production teams operate.

There will be a lot to learn from this emergent field of cooperation studies, as it weans us off wishful thinking and toward a more systematic understanding of what it takes to make cooperative projects work.

One of the most important works has been Axelrod's The Evolution of Cooperation (Axelrod, 1984). He reconceived Game Theory, which had originally been seen as undermining altruism, by grounding its experiments in the real conditions of social life instead of abstracting them into unrealistic laboratory or thought experiments. Game theory is important because it models human intentionality as it wavers between altruistic and selfish strategies. His study of the classic Prisoner's Dilemma yielded three important rules for cooperation to occur (a minimal simulation follows the list below):

  1. communities must promote ongoing interaction;
  2. individuals must be able to identify each other;
  3. individuals must have information about the past behaviour of others.
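
A minimal simulation (plain Python; the payoff values are the standard Prisoner's Dilemma numbers, the strategies are classic illustrations) shows why these rules matter: once interaction is ongoing and each player remembers the other's past moves, the reciprocal Tit-for-Tat strategy sustains cooperation with a fellow cooperator and cannot be durably exploited by a pure defector.

    # Payoffs: both cooperate (3,3); both defect (1,1); lone defector (5,0).
    PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

    def tit_for_tat(opponent_history):
        """Cooperate first, then mirror the opponent's last move."""
        return opponent_history[-1] if opponent_history else "C"

    def always_defect(opponent_history):
        return "D"

    def play(strategy_a, strategy_b, rounds=200):
        history_a, history_b = [], []  # what each side has seen the other do
        score_a = score_b = 0
        for _ in range(rounds):
            move_a, move_b = strategy_a(history_a), strategy_b(history_b)
            pay_a, pay_b = PAYOFF[(move_a, move_b)]
            score_a, score_b = score_a + pay_a, score_b + pay_b
            history_a.append(move_b)
            history_b.append(move_a)
        return score_a, score_b

    print("TfT vs TfT: ", play(tit_for_tat, tit_for_tat))    # (600, 600)
    print("TfT vs AllD:", play(tit_for_tat, always_defect))  # (199, 204)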

Many of these insights have been incorporated in the social software tools being developed, and are the reason for the success of reputation systems such as eBay's. According to the findings of Howard Rheingold and his cooperation studies group, the Prisoner's Dilemma, which undermines cooperation and operates in an information-poor environment, may well be superseded by new forms of the Assurance Game. A Prisoner's Dilemma will operate when no information about the partner is available, as distrust will prevail; but social accounting technologies generate information about the trustworthiness of a potential partner, and in such an environment the Assurance Game will prevail.

Another important milestone in cooperation studies did not focus on interpersonal interaction (as Game Theory does) but on group behaviour in real physical communities involved in the use and management of communal resources. It can be found in Elinor Ostrom's Governing the Commons (Ostrom, 1990). Among the principles applied in successful communities are:

  1. boundaries must be clearly defined so that there is a clear sense of who may use collective resources;
  2. the rules of usage must match local conditions;
  3. affected individuals must be able to participate in the adaptation of these rules;
  4. control mechanisms of user behaviour must exist, as well as a system of graduated sanctions (her survey concluded that this was done better through self-regulation than through external authorities);
  5. finally, the community must have access to low-cost conflict resolution mechanisms.

It must be stressed, however, that her study concerns physical Commons dealing with scarce rival goods, and cannot be applied without adaptation to the digital commons of non-rival goods, where a Tragedy of the Commons, i.e. the overuse of scarce goods for personal gain, cannot occur; some of her conclusions on group behaviour and its regulation do apply, though. But the totality of her conclusions is certainly of interest to defenders of our very important physical Commons, and shows that well-regulated Commons have found ways to deal with abuse and overuse.

The shift in 'business models' characteristic of the new networks is explained by David Reed, who has summarized the different mathematical laws inherent in the value created by networks. First, we focus on individuals. If a network has N members and membership grows, one sees a linear growth in audience, i.e. N+1, N+2, etc., and thus a proportional growth in value. This formula was already at play in broadcast media; in such an environment 'content is king', and publishers vie for the attention of the users of the network. This explains the role of portal sites such as Yahoo, which re-intermediate the economy of attention discussed before. If we now focus on the interaction between individuals, we see that the network enables transactions, and that these grow with the square of the number of members. This characteristic is called Metcalfe's Law. A network of 2 allows for 1 connection (back-and-forth buying and selling), a network of 3 allows for 3 connections, and a network of 4 allows for 6 connections. This aspect of the network creates transactional platforms such as eBay. Finally, we focus on community. Networks have the ability to enable the formation of subgroups: they are 'Group Forming Networks'. Value growth here is exponential, and it is this characteristic that is called Reed's Law. Every affinity group creates and 'consumes' its own content, and it is here that the true peer to peer processes emerge, characterized by infinite content creation. The economy of attention becomes moot, because what is happening is not limited content competing for the same audience, but infinite content competing for infinite combinations of affinity groups. You are then creating content not for an audience, but as a means of creating interconnectedness between a group of people sharing an interest or common goal.
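The three regimes are easy to tabulate (a quick Python sketch; the linear broadcast law is commonly credited to Sarnoff, pairwise connections follow Metcalfe's N(N-1)/2, and subgroups of two or more members follow Reed's 2^N - N - 1):

    for n in (2, 3, 4, 10, 20):
        audience = n                # linear growth: broadcast value
        pairs = n * (n - 1) // 2    # quadratic growth: Metcalfe's Law
        groups = 2 ** n - n - 1     # exponential growth: Reed's Law
        print(f"N={n:>2}  audience={audience:>2}  pairs={pairs:>3}  groups={groups:,}")

For 20 members the possible subgroups already exceed a million, which is why group-forming capacity quickly dominates the other two sources of value.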

To conclude: the discovery of network theory in the physical sphere has a corollary, namely seeing it at work in social life, and in particular in organizational life, including business. This finds its expression in the emerging discipline of social network analysis, in cooperation studies generally, and in particular in coordination theory, as pioneered by Thomas Malone.[7] All these studies are important because they act as a corrective to misplaced idealism and provide lessons from scientific studies and objective experience with real cooperation.


Endnotes

  1. Connectionist theories of mind and brain, at http://www.artsci.wustl.edu/~philos/MindDict/connectionism.html

  2. Bruce Sterling on the 'coming of age' of social network analysis:

    http://www.wired.com/wired/archive/12.11/view.html?pg=4?tw=wn_tophead_7

  3. Consequences of the power law in scale-free networks

    "A scale-free network is one that obeys a power law distribution in the number of connections between nodes on the network. Some few nodes exhibit extremely high connectivity (essentially scale-free) while the vast majority are relatively poorly connected. The reason that scale-free networks emerge, as opposed to evenly distributed random networks, is due to these factors.
    1. Rapid growth confers preference to early entrants. The longer a node has been in place the greater the number of links to it. First mover advantage is very important.
    2. In an environment of too much information people link to nodes that are easier to find. This preferential linking reinforces itself by making the easier to find nodes even more easy to find.
    3. The greater the capacity of the hub (bandwidth, work ethic, etc.) the faster its growth"

    (http://globalguerrillas.typepad.com/globalguerrillas/complex_networks/index.html)

  4. The Long Tail in Marketing:

    "People are going deep into the catalog, down the long, long list of available titles, far past what's available at Blockbuster Video, Tower Records, and Barnes & Noble. And the more they find, the more they like. As they wander further from the beaten path, they discover their taste is not as mainstream as they thought (or as they had been led to believe by marketing, a lack of alternatives, and a hit-driven culture). An analysis of the sales data and trends from these services and others like them shows that the emerging digital entertainment economy is going to be radically different from today's mass market. If the 20th-century entertainment industry was about hits, the 21st will be equally about misses. For too long we've been suffering the tyranny of lowest-common-denominator fare, subjected to brain-dead summer blockbusters and manufactured pop. Why? Economics. Many of our assumptions about popular taste are actually artifacts of poor supply-and-demand matching – a market response to inefficient distribution."
    (http://www.wired.com/wired/archive/12.10/tail.html)

  5. The Dunbar number and the limits to cooperation

    ... there is a cognitive limit to the number of individuals with whom any one person can maintain stable relationships, that this limit is a direct function of relative neocortex size, and that this in turn limits group size ... the limit imposed by neocortical processing capacity is simply on the number of individuals with whom a stable inter-personal relationship can be maintained.
    (http://www.bbsonline.org/documents/a/00/00/05/65/bbs00000565-00/bbs.dunbar.html )
    See also Dunbar's book: "Grooming, Gossip and the Evolution of Language"

  6. Cooperation without Power Law?

    The following table by Ross Mayfield summarises recent research, showing that small groups can maintain egalitarian networks:

    Network             Size      Description                    Distribution
    Political Network   ~1000s    Blogs as mass media            Power-law (scale-free)
    Social Network      ~150      Blogging Classic               Bell-curve (random)
    Creative Network    ~12       Blogs as dinner conversation   Dense (equal)


    After reviewing data on work relationships, information flows, and knowledge exchanges from hundreds of consulting assignments inside Fortune 2000 organizations, Valdis Krebs did not see much evidence of power laws in this data. His data consists of confirmed ties [both persons agreed/recognized their mutual interactions/flows/relationships] from a worldwide pool of clients dating back to 1988. Of course he found some people were better connected than others, but the extreme hubs found in power-law networks just were not evident. Adapting a famous line from the movie "Blazing Saddles", Valdis concluded: "Power Law? There ain't no stinkin' power law in this data!"
    (http://radio.weblogs.com/0114726/2003/02/12.html#a284)
    The whole discussion above was inspired by an entry from the Life with Alacrity blog.

  7. The general principles of Coordination Theory

    “Thomas Malone: What I mean by coordination theory is that body of theory and principles that help explain the phenomena of coordination in whatever systems they arise. Now what do I mean by coordination? We define coordination as the management of dependencies among activities. Now how do we proceed on the path of developing coordination theory? The work we've done so far says that if coordination is the managing of dependencies among activities, a very useful next step is to say: what kinds of dependencies among activities are possible? We've identified three types of dependencies that we call atomic or elementary dependency types. Our hypothesis is that all the dependencies, all the relationships in the world, can be analyzed as either combinations of or more specialized types of these three elementary types. The three are: flow, sharing, and fit. Flow occurs whenever one activity produces some resource used by another activity. Sharing occurs when a single resource is used by multiple activities. And fit occurs when multiple activities collectively produce a single resource. So those are the three topological possibilities for how two activities and one resource can be arranged. And each of them has a clear analog in the world of business or any of the other kinds of systems we talked about."
    (http://www.dialogonleadership.org/Malone2001.html)
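
    Malone's taxonomy can be paraphrased in code. A small sketch (the types and names below are hypothetical illustrations, not Malone's own notation) classifies a dependency from how many activities produce and consume a resource:

        from dataclasses import dataclass, field
        from enum import Enum

        class Dependency(Enum):
            FLOW = "one activity produces a resource another uses"
            SHARING = "one resource is used by multiple activities"
            FIT = "multiple activities collectively produce one resource"

        @dataclass
        class Resource:
            name: str
            producers: list = field(default_factory=list)
            consumers: list = field(default_factory=list)

        def classify(resource):
            if len(resource.producers) > 1:
                return Dependency.FIT       # many activities -> one resource
            if len(resource.consumers) > 1:
                return Dependency.SHARING   # one resource -> many activities
            if resource.producers and resource.consumers:
                return Dependency.FLOW      # producer feeds a single consumer
            return None

        spec = Resource("design spec", producers=["architect"], consumers=["dev team"])
        print(classify(spec))  # Dependency.FLOW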

    Book: Thomas Malone. Coordination Theory and Collaboration Technology.

    The Open Process Handbook Initiative (OPHI)

    "a group of organizations and individuals dedicated to developing an on-line collection of knowledge about business processes that is freely available to the general public under an innovative form of "open source" licensing."
    (http://ccs.mit.edu/ophi/index.htm)