Open Source Science

From P2P Foundation

Concept

Definition

Beth Ritter-Guth:

"“Open Source Science” refers to the sharing of all data, including failed experiments, and is likened to “open source” code in computing." (http://bethritterguth.wikispaces.com/rpp)


Characteristics

Walter Jessen:

"Open Source Science is a collaborative and transparent approach to science. To me, it means four things:

1. Open Source: the use of open and freely accessible software tools for scientific research and collaboration.

2. Open Notebook: transparency in experimental design and data management.

3. Open Data: public accessibility of scientific data, which allows for distribution, reuse and derived works.

4. Open Access: public access to scholarly literature." (http://www.walterjessen.com/promoting-open-source-science/)


Typology

Jamais Cascio:

"Ball covers three broad categories of mass-collaborative science. The first I would characterize as mass analysis, in which large numbers of people take a look at a set of data to try to find mistakes or hidden details. His best example of this is the NASA Clickworkers project, which used a large group of volunteers to look at maps of Mars in order to identify craters. It turned out that the collective crater identification ability of volunteers given a small amount of training was as good as the best experts in the field. Ball links this directly to the James Surowiecki book, The Wisdom of Crowds, which argues that the collective decision-making power of large groups can be surprisingly good. WorldChanging's Nicole Boyer has mentioned The Wisdom of Crowds in a couple of her essays, most notably this week's The Wisdom of Google's Experiment. The ability of groups to act collectively to analyze and generate information is one of the drivers of collaborative efforts such as Wikipedia -- any individual contributor won't be an expert on everything, but the collected knowledge of the mass of authors is unbeatable.
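The "wisdom of crowds" effect behind mass analysis can be illustrated with a toy sketch. This is purely illustrative and is not the Clickworkers method: the true coordinate, the noise model, and simple averaging are all assumptions. The point is only that many independent, noisy volunteer estimates, aggregated, tend to land closer to the truth than a typical single estimate.

```python
import random
import statistics

random.seed(42)  # reproducible toy run

TRUE_CRATER_X = 100.0  # hypothetical "true" crater coordinate


def volunteer_estimate(noise_sd=10.0):
    """One volunteer's reading: the true position plus independent noise."""
    return random.gauss(TRUE_CRATER_X, noise_sd)


# 500 independent volunteer estimates of the same crater position.
estimates = [volunteer_estimate() for _ in range(500)]

# The crowd's answer is the simple average of all estimates.
crowd_answer = statistics.mean(estimates)
crowd_error = abs(crowd_answer - TRUE_CRATER_X)

# A typical individual's error, for comparison.
typical_individual_error = statistics.mean(
    abs(e - TRUE_CRATER_X) for e in estimates
)

print(f"crowd error: {crowd_error:.2f}")
print(f"typical individual error: {typical_individual_error:.2f}")
```

With independent errors, the average of N estimates has a standard error roughly 1/√N of an individual's, which is the statistical core of Surowiecki's argument.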

The second model of collaborative science he discusses is that of mass evaluation, in which large numbers of people have the opportunity to vet articles and arguments by researchers. This is a less quantitative and more subjective approach than collaborative analysis, but can still produce high-quality results. Ball cites Slashdot and Kuro5hin as examples of this approach, with the mass of participants on the sites evaluating the posts and/or comments, eventually pushing the best stuff up to the top. In the world of science, articles submitted to journals are regularly checked out by groups of reviewers, but the set of evaluators for any given article is usually fairly small. Ball cites the physics pre-print journal arXiv as an exemplar of a countervailing trend -- that of open evaluation. ArXiv allows anyone to contribute articles, and lets participants evaluate them -- a true "peer review."

The third model Ball discusses is perhaps the most controversial -- that of collaborative research, where research already in progress is opened up to allow labs anywhere in the world to contribute experiments. The deeply networked nature of modern laboratories, and the brief down-time that all labs have between projects, make this concept quite feasible. Moreover, such distributed-collaborative research spreads new ideas and discoveries even faster, ultimately accelerating the scientific process. Yale's Yochai Benkler, author of the well-known Coase's Penguin, or Linux and the Nature of the Firm, argues in a recent article in Science (pay access only) that such a method would be potentially revolutionary. He calls it "peer production;" we've called it "open source" science, and have been talking about the idea since we started WorldChanging." (http://www.worldchanging.com/archives/001090.html)

Source: "The Common Good," a new essay in Nature by consulting editor Philip Ball


Examples

Walter Jessen:

"The Polymath Project is a great example of Open Source Science. The goal of the project was to use blogs and wikis to collaboratively work on an unsolved problem in mathematics. In less than two months, it was announced that the Polymath participants had worked out an elementary proof, and a manuscript describing the proof is currently being written. The project demonstrated that many people could work together to solve difficult mathematical problems.

Another example is the Life Scientists Room on FriendFeed. Online collaborations are being made there between researchers around the world. Indeed, a paper was recently published in which only three of the eight authors have actually met. This illustrates how the networked world of the Web can be used to advance science.

The Public Library of Science (PLoS) is a great example of Open Access. With a grant in 2003, it started two Open Access journals. Today, there are seven PLoS journals, and all are publicly accessible. Each PLoS journal provides a suite of metrics for every article, including measures of online usage, citations from the scholarly literature, social bookmarks, blog coverage, and Comments, Notes and ‘Star’ ratings. Its most recent journal, PLoS ONE, has experienced substantial growth since its launch in 2006; it is already the largest Open Access journal in the world and is projected to become the third-largest journal in the world by the end of the year.

Nature Precedings is another great example of Open Access; it is a permanent, citable archive for pre-publication research and preliminary findings. Pre-publication of research was unheard of a decade ago. Nature Precedings allows researchers to communicate their thoughts and observations, solicit feedback and date stamp their ideas." (http://www.walterjessen.com/promoting-open-source-science/)


Discussion

Beth Ritter-Guth:

"Open Source Science, Open Data, Open Standards, and Open Access Science generally refer to the same principle: the publication of data for free use and distribution via the web, using wikis, blogs, chemical docking programs, or other RSS technology.

Historically, this same data has only been available, in parts, through traditional peer review journals. ODOSOS is one acronym used to define "Open Data, Open Source, Open Standards" (Murray-Rust). However, there is legitimate discussion about what constitutes “Open Source” as compared to “Open Standards” and “Open Data.”

Open Access, for example, refers to the publication of "final" data or articles, and is not, inherently, about the sharing of collaborative data although there is a place for that to exist within OA (BOAI).

“Open Source Science” refers to the sharing of all data, including failed experiments, and is likened to “open source” code in computing. It includes both the process and the resulting data. As such, it communicates the "thinking behind the chemistry" - a practice not embraced by traditional methods (Bradley).

“Open Data” is similar to Open Source Science in the philosophy of sharing, but differs because it does not include the publication of failed data or experiments, and shares, instead, successful processes and data. In short, "open data" refers to data "which we can attach a CC [Creative Commons] or similar license" (Murray-Rust).
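Murray-Rust's criterion, that open data is data carrying a CC or similar licence, can be made machine-checkable. The sketch below builds a schema.org Dataset record in JSON-LD with an explicit licence field; the dataset name is invented for illustration, while the vocabulary (schema.org) and the Creative Commons licence URL are real.

```python
import json

# Hypothetical dataset record. Only the schema.org JSON-LD vocabulary and
# the Creative Commons licence URL are real; the dataset name is invented.
dataset_metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example crystallography measurements",  # invented for illustration
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "isAccessibleForFree": True,
}

record = json.dumps(dataset_metadata, indent=2)
print(record)
```

Declaring the licence in structured metadata, rather than only in prose, lets crawlers and reusers detect that the data is open without human inspection.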

Finally, “Open Standards” refers to the sharing of the processes by which data is shared." (http://bethritterguth.wikispaces.com/rpp)

More Information

  1. Open Chemistry


Project(s)

OpenSourceScience

URL = http://www.opensourcescience.net/index.php?title=Main_Page

"OpenSourceScience is a public space for managing controversial scientific experiments in a way that provides open access to all phases of the research. We provide a centralized resource for scientific collaboration, and help underwrite scientifically rigorous experiments that may contribute to an improved understanding of human consciousness.

The essence of the open source model is the rapid creation of innovative results within an inclusive and collaborative environment. At OpenSourceScience, we bring together the skeptical community, controversial science researchers, and interested laypeople to help design and facilitate high-quality scientific experiments. Our community encompasses multiple points of view joined together by a commitment to "follow the data". This spirit of cooperation promises to improve the long-term viability of our results."