Peer Production - Characteristics

From P2P Foundation

Discussion about the characteristics of Peer Production

Characteristics

Comparison with traditional mode

Stefano De Paoli and Cristiano Storni:

"In the area of Free and Open Source Software (FOSS) research there are contributions that have provided analogous descriptions, usually based on opposing ideal-typical production paradigms: the proprietary software productive paradigm opposed to the FOSS productive paradigm.


Examples of this opposition are:

(1) the exclusive (intellectual) property of innovation of large software corporations (via restrictive licenses) opposed to the communal character of innovations in user communities (Wark 2004);

(2) the protestant ethic of labor based on formally free work (Weber 1904–1905) and scientific management (Taylor 1911) opposed to the hacker ‘‘ethic of labor’’ based on free labor and unstructured work (Himanen 2001);

(3) a waterfall model of software development (whereby ‘‘implementation’’ is followed by ‘‘debugging,’’ ‘‘usability testing,’’ ‘‘release,’’ ‘‘maintenance,’’ and so on) opposed to a flat division of labor in which the aforementioned phases collapse in a unique, ongoing ‘‘hacking’’ activity (Hannemyr 1999); and

(4) a centralized software development methodology with managerial control in a large software corporation (the Cathedral), opposed to a distributed one in which communities of people cooperate in distributed organizational forms (the Bazaar) (Raymond 1999)." (http://p2pfoundation.net/Sociotechnical_Skills_in_the_Case_of_Arduino)

A word of caution on such idealized comparisons by the same authors:

"We believe that these explanations put in play a form of reductionism. Although we recognize differences between certain practices of production/design and consumption/use in different contexts, we believe we should also look at how produsage actor-networks are built and how the above-mentioned structural elements are achieved and maintained. Our approach considers that these elements have to be explained as effects and not as causes of produsage."



Guidelines for Successful Cooperation in Peer Production

Christian Siefkes:


"1. Find other people who have the same (or a similar) problem or goal as you.

2. Join forces with them in order to produce what you want to have or achieve (need-driven production).

3. Be fair and accept the others as your peers—since you all participate voluntarily, nobody can order others around.

4. Be generous and share what you can. By doing so, you’ll attract further users, some of whom will sooner or later turn into contributors. There is rarely a clear separation between users and contributors, but rather a smooth transition: most participants use your product only, some contribute occasionally, and a small percentage contributes intensely on a long-time basis.

5. Be open and welcoming to make it easy for “newbies” to join and contribute to your project.

6. Leave hints on what there is to do and which contributions you would like to see (Stigmergy). Frequently some other participants will take up a hint and self-select to handle one of the wanted tasks. The more participants care for a task, the more visible the hints will be, increasing the chance that somebody self-selects for the task.

7. Jointly develop the rules and structures that are most suitable for reaching your goals.

8. Strive to reach rough consensus regarding the goals of your projects and the best ways of realizing them. Narrow or arbitrary decisions will tend to drive away the people that disagree with them.

9. But if you really can’t agree on an issue, that’s not so bad. Just fork the project and do your own thing." (http://www.keimform.de/2010/self-organized-plenty/)


Commentary by Michel Bauwens

An excerpt from the manuscript.

3.3.C. Beyond Formalization, Institutionalization, Commodification

Observation of commons-based peer production and knowledge exchange unveils a further number of important elements, which can be added to our earlier definition and to the characteristic of holoptism just discussed in 3.4.B.

In premodern societies, knowledge is ‘guarded’: it is part of what constitutes power. Guilds are based on secrets; the Church does not translate the Bible, guarding its monopoly of interpretation. Knowledge is obtained through imitation and initiation in closed circles.

With the advent of modernity, and let’s think about Diderot’s project of the Encyclopedia as an example, knowledge is from now on regarded as a public resource which should flow freely. But at the same time, modernity, as described by Foucault in particular, starts a process of regulating the flow of knowledge through a series of formal rules, which aim to distinguish valid knowledge from invalid. The academic peer review method, the setting up of universities which regulate discourse, the birth of professional bodies as guardians of expertise, the scientific method, are but a few of such regulations. An intellectual property rights regime also regulates the legitimate use one can make of such knowledge, and is responsible for a re-privatization of knowledge. If original copyright served to stimulate creation by balancing the rights of authors and the public, the recent strengthening of intellectual property rights can be more properly understood as an attempt at ‘enclosure’ of the information commons, which serves to create monopolies based on rent obtained through licenses. Thus at the end of modernity, in a similar process to what we described in the field of work culture, there is an exacerbation of the most negative aspects of the privatization of knowledge: IP legislation is incredibly tightened, information sharing becomes punishable, the market invades the public sphere of universities, and academic peer review and the scientific commons are being severely damaged.

Again, peer to peer appears as a radical shift. In the new emergent practices of knowledge exchange, equipotency is assumed from the start. There are no formal rules to prohibit anyone from participation, a characteristic that could be called ‘anti-credentialism’ (unlike academic peer review, where formal degrees are required). Validation is a communal, intersubjective process. It often takes place through a process akin to swarming, whereby large numbers of participants tug at the mistakes in a piece of software or text, the so-called ‘piranha effect’, and so perfect it better than an individual genius could. Many examples of this kind are described in the book ‘The Wisdom of Crowds’, by James Surowiecki. Though there are constraints in this process, depending on the type of governance chosen by various P2P projects, what stands out compared to previous modes of production is the self-selection aspect. Production is granular and modular, and only the individuals themselves know exactly whether their particular mix of expertise fits the problem at hand. We have autonomous selection instead of heteronomous selection.

If there are formal rules, they have to be accepted by the community, and they are ad hoc for particular projects. In the Slashdot online publishing system, which serves the open source community, a large group of editors combs through the postings, and there’s a complex system of ratings of the editors themselves; in other systems every article is rated, creating a hierarchy of interest which pushes the lesser-rated articles down the list. As we explained above, in the context of knowledge classification, there is a move away from institutional categorization using hierarchical trees of knowledge, such as the bibliographic formats (Dewey, UDC, etc.), to informal communal ‘tagging’, what some people have termed folksonomies. In blogging, news and commentary are democratized and open to any participant, and it is the reputation of trustworthiness, acquired over time by the individual in question, which will lead to the viral diffusion of particular ‘memes’. Power and influence are determined by the quality of the contribution, and have to be accepted and constantly renewed by the community of participants. All this can be termed the de-formalization of knowledge.

A second important aspect is de-institutionalization. In premodernity, knowledge is transmitted through tradition, through initiation by experienced masters to those who are validated to participate in the chain, mostly through birth. In modernity, as we said, validation and the legitimation of knowledge are processed through institutions. It is assumed that the autonomous individual needs socialization, ‘disciplining’, through such institutions. Knowledge has to be mediated. Thus, whether a news item is trustworthy is determined largely by its source, say the Wall Street Journal, or the Encyclopaedia Britannica, which are supposed to have formal methodologies and expertise. P2P processes are de-institutionalized, in the sense that it is the collective itself which validates the knowledge.

Please note my semantic difficulty here. Indeed, it can be argued that P2P is just another form of institution, another institutional framework, in the sense of a self-perpetuating organizational format. And that would be correct: P2P processes are not structureless, but most often flexible structures that follow internally generated rules. In previous social forms, institutions got detached from the functions and objectives they had to serve, and became ‘autonomous’. In turn, because of the class structure of society and the need to maintain domination, and because of ‘bureaucratization’ and the self-interest of institutional leaderships, those institutions turn ‘against society’ and even against their own functions and objectives. Such institutions become a factor of alienation. It is this type of institutionalization that is potentially overcome by P2P processes. The mediating layer between participation and the result of that participation is much thinner, dependent on protocol rather than controlled by hierarchy.

A good example of P2P principles at work can be found in the complex of solutions instituted by the University of Openness. UO is a set of free-form ‘universities’, where anyone who wants to learn or to share his expertise can form teams with the explicit purpose of collective learning. There are no entry exams and no final exams. The constitution of teams is not determined by any prior disciplinary categorization. The library of UO is distributed, i.e. all participating individuals can contribute their own books to a collective distributed library. The categorization of the books is explicitly ‘anti-systemic’, i.e. any individual can build his own personal ontologies of information, and semantic web principles are set to work to uncover similarities between the various categorizations.

All this prefigures a profound shift in our epistemologies. In modernity, with the subject-object dichotomy, the autonomous individual is supposed to gaze objectively at the external world, and to use formalized methodologies, which will be intersubjectively verified through academic peer review. Post-modernity has caused strong doubts about this scenario. The individual is no longer considered autonomous, but always-already part of various fields, of power, of psychic forces, of social relations, molded by ideologies, etc. Rather than in need of socialization, the presumption of modernity, he is seen to be in need of individuation. But he is no longer an ‘indivisible atom’, but rather a singularity, a unique and ever-evolving composite. His gaze cannot be truly objective, but is always partial, as part of a system can never comprehend the system as a whole. The individual has a single set of perspectives on things reflecting his own history and limitations. Truth can therefore only be apprehended collectively by combining a multiplicity of other perspectives, from other singularities, other unique points of integration, which are put in ‘common’. It is this profound change in epistemologies which P2P-based knowledge exchange reflects.

A third important aspect of P2P is the process of de-commodification. In traditional societies, commodification and ‘market pricing’ were only a relative phenomenon. Economic exchange depended on a set of mutual obligations, and even where monetary equivalents were used, the price rarely reflected an open market. It is only with industrial capitalism that the core of economic exchanges started to be determined by market pricing, and both products and labor became commodities. But still, there was a public culture and education system, and immaterial exchanges largely fell outside this system. With cognitive capitalism, the owners of information assets are no longer content to leave any immaterial process outside the purview of commodification and market pricing, and there is a strong drive to ‘privatize everything’, education included, our love lives included. Any immaterial process can be resold as a commodity. Thus again, in the recent era the characteristics of capitalism are exacerbated, with P2P representing the counter-reaction. With ‘commons-based peer production’, or P2P-based knowledge exchange more generally, the production does not result in commodities sold to consumers, but in use value made for users. Because of the GPL license, no copyrighted monopoly can arise. GPL products can eventually be sold, but such sale is usually only a credible alternative (since the product can most often be downloaded for free) if it is associated with a service model. It is in fact mostly around such services that commercial open source companies have founded their model (example: Red Hat). Since the producers of commons-based products are rarely paid, their main motivation is not the exchange value of the eventually resulting commodity, but the increase in use value, their own learning and reputation. Motivation can be polyvalent, but will generally be anything but monetary.

One of the reasons for the emergence of the commodity-based economy, capitalism, is that a market is an efficient means to distribute ‘information’ about supply and demand, with the concrete price determining value as a synthesis of these various pressures. In the P2P environment we see the invention of alternative ways of determining value, through software algorithms. In search engines, value is determined by algorithms that count pointers to documents: the more pointers, and the more value these pointers themselves have, the higher the value accorded to a document. This can be done either in a general manner, or for specialized interests, by looking at the rankings within a specific community, or even at an individual level, through collaborative filtering, by looking at what similar individuals have rated and used well. So, in a similar but alternative way to the reputation-based schemes, we have a set of solutions to go beyond pricing, and beyond monetarisation, to determine value. The value that is determined in this case is of course an indication of potential use value, rather than ‘exchange value’ for the market.
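The link-counting principle described above can be sketched as a simplified PageRank-style iteration. This is an illustrative toy, not the algorithm of any particular search engine: the `links` graph, the damping factor, and the iteration count are all assumptions chosen for the example. A document's value is the sum of the value passed on by the documents pointing to it, so pointers from highly valued documents count for more.

```python
# Illustrative sketch of link-based value: a simplified PageRank-style
# iteration. The graph, damping factor, and iteration count are
# hypothetical choices for this example, not a real engine's settings.
def rank(links, damping=0.85, iterations=50):
    """links maps each document to the list of documents it points to."""
    docs = set(links)
    for targets in links.values():
        docs.update(targets)
    # Start with value spread evenly across all documents.
    score = {d: 1.0 / len(docs) for d in docs}
    for _ in range(iterations):
        new = {d: (1 - damping) / len(docs) for d in docs}
        for src, targets in links.items():
            if targets:
                # Each document shares its value among the documents
                # it points to, so pointers carry the pointer's value.
                share = damping * score[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # A document with no outgoing pointers spreads its
                # value evenly over all documents.
                for d in docs:
                    new[d] += damping * score[src] / len(docs)
        score = new
    return score

# "c" is pointed to by both "a" and "b", so it accumulates the most value.
scores = rank({"a": ["c"], "b": ["c"], "c": ["a"]})
```

Collaborative filtering, also mentioned above, replaces the global link graph with per-user similarity: value is estimated from the ratings of individuals whose past choices resemble one's own.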

Peer Production as Producer-Driven

From http://opencontent.org/blog/archives/332:

"Let’s come back to the consumer-driven / producer-driven question and look at open source software as an example. Open source software projects are successful when:


1. a specific person with a specific need develops a specific solution to **their own** specific problem,

2. that person then shares that specific solution with the world,

3. other specific people with the same or a very similar specific need find the solution and adopt or adapt it to solve their own specific problems, and

4. adaptations and extensions of the solution, developed to make the solution solve additional, closely related problems, are shared with the group.

In other words, open source software is the epitome of a producer-driven work. The work begins life with one producer, one with a specific personality and attitude, and continues life with a group of like-minded producers. These producers engage in a kind of work we might call “produce-to-use,” because they make software to satisfy their own needs. This guarantees that the software they produce will be useful to and used by someone. Of course, every project is happy to have users that might be described as “users-only,” but these do not contribute to the long-term growth or health of the project." (http://opencontent.org/blog/archives/332)


Hybridity of open/collaborative practices and the business world

Charles Leadbeater:

'Yet our organisational future will not be a ‘yes’ or ‘no’ choice, open or closed, public or private. Between traditional, pure and closed organisations at one end of the spectrum, where ultimately the boss rules and the company owns all the assets, and the pure open end of the spectrum, where the community owns and no one tells you what to do, a vast and very fertile middle ground is opening up. eBay, Craigslist and many other organisations are starting to operate in this space. It will spawn a rich array of new hybrids. At the edges of this middle ground we will find traditional companies seeking to develop more open and interactive approaches to innovation with communities of developers and users. Philips, the giant Dutch electronics company, for example, is redesigning its famous national laboratory in Eindhoven, where much of the early work on the light bulb was done, to accommodate a range of outside companies. The Philips national laboratory used to be like an intellectual fortress, surrounded by high fences and barbed wire, to make sure all the secrets were kept safe. Now Philips wants to create a campus where its researchers will work alongside others, sharing ideas. Nokia, the Finnish mobile telecommunications company, has an online forum through which it works with thousands of smaller developers on applications for mobile services. The forum has elicited more than 1m contributors from user-developers. Intel, the semi-conductor giant, has adopted open and collaborative approaches to innovation, working with hosts of developers to make sure the technologies it develops meet their needs.

At the other end of the spectrum we should expect open source initiatives that started life with a group of volunteers to become increasingly dependent on corporate support. IBM and Hewlett Packard are donating thousands of hours of developer time to open source platforms. Many smaller software companies are finding that collaborating to develop a shared software platform is the only way they can do research and development. Linux itself is the basis for a mass of commercial activity. The Linux community supports a range of companies such as Red Hat and VA Linux which make a good living, selling services linked to the implementation and application of Linux software.

Nor will organisations have to occupy just one position on the spectrum. They could attempt to adopt open, participative approaches to some aspects of their work, closed and commercial approaches to others. The computer games industry, for example, develops the core of its games in house, at great expense. But once the game is released, as we will see, that is the basis for massive open innovation among players. Equally, innovations that start as open, shared knowledge amongst a group of user-developers – an example we explore in the next chapter is the mountain bike – can then become the basis for commercial businesses. Organisations such as the Institute for Microelectronics in Leuven, Flanders, one of Europe’s most impressive industrial research facilities, bring together researchers from more than 300 international semi-conductor companies in pooled research projects. Teams of researchers from several companies join forces to thrash out solutions to shared problems with technologies that might be three to five years from the market. The companies contributing to these projects each have rights to use the combined knowledge generated. How they exploit this shared knowledge base commercially is up to them.

In the long run the most effective way to make sure open source style working prospers is to expand the base of people and organisations that adopt it." (http://wethink.wikia.com/wiki/Chapter_5_part_3)