Communal Validation

From P2P Foundation

Communal Validation = the [[Peer Production]] of accreditation

In Peer Review, scientific articles are vetted by scientific colleagues. Read that article on peer review to see why communal validation does not replace it, and why the process of vetting in peer production is usually different.


Definition

Peer production is based on equipotential participation (see Equipotentiality), i.e. the a priori self-selection of participants and the communal vetting of the quality of their work in the process of production itself. Peer review is based on credentialism (the a priori need for credentials); peer production vetting, by contrast, is based on Anti-Credentialism. Peer review is part of an elaborate process of institutional and prior validation of what constitutes valid knowledge; peer production vetting is a posteriori vetting by the community of participants.


Discussion

Communal Validation in Citizen Science projects

By Adam Glenn at http://www.poynter.org/column.asp?id=31&aid=116168

See our entry on Citizen Journalism for extra context:

"I think such citizen science projects offer valuable models that can be applied to citizen media projects:


1. Rigorous data collection. The Bird Count uses carefully developed methodologies to avoid spoiling data with inaccurate or duplicate information. Likewise, citizen journalists can establish and disseminate guides for reporting and photography standards -- especially regarding verifiable info such as names, quotes, attribution, numbers and the like.


2. Pooling and verifying cumulative results. The sheer volume of overall data collected in the Bird Count ensures that, if any contaminated info does sneak in, it won't unacceptably distort the final result. That's an important lesson for citizen journalism sites, harking back to the journalistic principle of verifying information with multiple sources. Ideally, citJ projects should seek multiple iterations of information -- for example, requiring that assertions by one contributor be verified by others.


3. Vetting amateurs. Even small hurdles like registration forms and minimal fees can weed out the unworthy, while extensive mandatory training can seriously raise the level of contributions (as well as the cost, unfortunately). It's worth considering whether citJ sites might benefit from mandatory online tutorials, accuracy checklists or story forms to make sure vital info isn't left out of submissions.


4. Expert-amateur interaction. Most citizen science projects aim to pair the novice with either experienced amateurs or experts themselves, fostering mentoring relationships that ultimately improve the data. Why shouldn't experienced citizen journalists (or professional journalists associated with new media or even mainstream media) provide the same mentoring? This could be done via workshops, in-the-field training, online editing, or other means. If the gains in media democratization aren't enough for you, how about the ways in which the resulting bond with the community and its most active news consuming members could pay off in loyalty to the news product?" (http://www.poynter.org/column.asp?id=31&aid=116168)
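
To make point 2 above concrete, here is a minimal sketch (in Python, with hypothetical names and a made-up confirmation threshold) of what "requiring that assertions by one contributor be verified by others" could look like once contributions are pooled: an assertion only counts as verified after several distinct contributors have reported it.

```python
# Minimal sketch of communal verification of pooled contributions.
# The threshold and the report format are illustrative assumptions,
# not the rules of any actual citizen science or citizen journalism project.
from collections import defaultdict

CONFIRMATIONS_REQUIRED = 3  # assumed: how many independent contributors must agree

def verified_assertions(reports):
    """reports: iterable of (contributor_id, assertion) pairs."""
    confirmations = defaultdict(set)
    for contributor, assertion in reports:
        # A set per assertion means duplicate reports from one contributor
        # do not inflate the count (the "duplicate data" problem noted above).
        confirmations[assertion].add(contributor)
    return {a for a, contributors in confirmations.items()
            if len(contributors) >= CONFIRMATIONS_REQUIRED}

reports = [("r1", "bridge closed"), ("r2", "bridge closed"),
           ("r1", "bridge closed"), ("r3", "bridge closed"),
           ("r4", "mayor resigned")]
print(verified_assertions(reports))  # {'bridge closed'}
```

The point of the sketch is simply that the vetting rule is applied after contributions arrive, by comparing contributors against one another, rather than by screening contributors in advance.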


Free vs. Controlled Contributions

Most open source projects are not run in a Wikipedia-like "anyone can edit" mode; as the commenter below argues, the comparison does open source a disservice.

From a commentary at http://interviews.slashdot.org/interviews/07/01/30/1858212.shtml:

"I know of no successful open source software projects run that way. On all the successful open source projects only few are granted write access to cvs/svn and most open source projects are run by one or two very opinionated people who do not accomodate others on a whim. In most cases, people finding a problem submit a patch and onte of the trusted few will apply it. In many cases, the patch will not be applied directly, but will be rewritten to achieve the desired effect better.

Sure people can take all the code and fork the project, but that is very different to having control over the document. You very seldom get Wikipedia-style edit wars in OSS code bases because "the boss" does not tolerate it. Abuse the privilege of write access and you lose it.

To draw a parallel between Wikipedia (which is uncontrolled) and Open Source (which is controlled) just does Open Source a disservice. There's enough anti-Open Source FUD out there and we don't need people thinking that any dummy with a chip on their shoulder can modify open source."
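
As a rough illustration of the workflow the commenter describes (anyone may submit a patch, only a trusted few may apply it, and a patch may be rewritten on the way in), here is a toy Python model. The class and method names are invented for this sketch; no real project's tooling works exactly like this.

```python
# Toy model of gated write access: contribution is open, commit is not.
class Repository:
    def __init__(self, committers):
        self.committers = set(committers)  # the "trusted few" with write access
        self.pending = []                  # patches awaiting review
        self.history = []                  # changes applied to the canonical tree

    def submit_patch(self, author, patch):
        # Open to everyone: submitting a patch requires no credentials.
        self.pending.append((author, patch))

    def apply_patch(self, committer, index, rewrite=None):
        # Gated: only committers may write, and they may rewrite the patch.
        if committer not in self.committers:
            raise PermissionError(f"{committer} has no write access")
        author, patch = self.pending.pop(index)
        self.history.append((author, rewrite or patch))

repo = Repository(committers={"maintainer"})
repo.submit_patch("newcomer", "fix off-by-one in parser")
repo.apply_patch("maintainer", 0, rewrite="refactor parser bounds check")
print(repo.history)
```

The vetting here still happens communally (anyone can propose), but acceptance is concentrated in a few hands rather than distributed across all readers as on a wiki.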


Quality Control is a process

What happens when the quality control process moves from the front to the back end of content production, as it does with peer production of content?

Here’s an interesting take on this topic by John Blossom at http://shore.com/commentary/newsanal/items/2007/20070521quality.html:


“How does one address the need for high-quality content in a context-driven publishing environment? Here are a few thoughts as to how and where quality content will survive and thrive:

Accept that quality is a process of continuous improvement. While forming well-researched articles and information sources does require a great deal of quality control, the experience of the Web points towards evolutionary quality control as the most promising route for publishing. Having every fact and figure exactly right at a fixed point in time was a “must” in the era of print-oriented publishing. Content management systems and simpler tools such as weblogs and Wikis have made it far simpler to publish revisions online, but editorially we’re still caught oftentimes in the print-oriented quality cycle. The experiences offered by search engines and social bookmarking services suggest that people perceive quality on a given topic as a highly movable feast. Being able to evolve content quality on a continuous basis therefore becomes at least as important as any initial efforts.

Accept that quality is best implemented as a social process. Although Wikipedia’s editorial processes are far from perfect and worthy of some skepticism, Wikipedia has served as a critical proving ground to demonstrate that open social editing processes can scale effectively. The PLoS ONE experiment with online collaboration is developing peer-reviewed scientific research articles successfully through an open comment and review process that supplements traditional peer reviewing. Not every peer review process need be as open as PLoS ONE or Wikipedia but as the Web offers the broadest opportunity for peer input it would appear that the quality of audience engagement in developing materials is perhaps as good a measure of quality as the engagement of audiences in a finished product.

Accept that quality is as much about aggregation as it is about the one right pure answer. As much as tools such as Wikis, weblogs and social bookmarking are about what people write they’re also important for what they bring together as reference content through links, comments and embedded content. Social media is challenging search engines as a starting point for finding answers to questions in part because people come to trust the insights and expertise of specific communities to provide both their own insights and insights from their own research. Answer-oriented communities such as Yahoo! Answers, WikiAnswers and LinkedIn Answers provide audiences the ability to vote on answers to specific questions - a competitive aspect to publishing that helps to both aggregate potential high-quality content and to rank its value.” (http://shore.com/commentary/newsanal/items/2007/20070521quality.html)


Example

Slashdot (owned by OSTG, which is owned by VA Software):

"The FAQ (Frequently Asked Question) response to, "how do you verify the accuracy of Slashdot stories?" is revealing: "We don't. You do. If something seems outrageous, we might look for some corroboration, but as a rule, we regard this as the responsibility of the submitter and the audience. This is why it's important to read comments. You might find something that refutes, or supports, the story in the main." In other words, Slashdot very self-consciously is organized as a means of facilitating peer production of accreditation; it is at the comments stage that the story undergoes its most important form of accreditation--peer review ex-post. Filtering and accreditation of comments on Slashdot offer the most interesting case study of peer production of these functions." (Benkler, Wealth of Networks page 77 - cited by Franz Nahrada [1])


More Information

Jack Whitehead, who explores "Living Education Theories", says that he's "been using a peer-to-peer process of social validation (modified from Habermas' views in his work on communication and the evolution of society) in assisting individuals to create their own living educational theories as they account to themselves and others for the lives they are living and their learning as they seek to live their values as fully as they can." (see http://www.actionresearch.net and http://www.bath.ac.uk/~edsajw/living.shtml)