Rebecca Blood: "Wide adoption of the Internet has fueled a resurgence of citizen science. Cornell's Project Feeder Watch employs 16,000 volunteers across North America who record their sightings on a website that will automatically ask them to double-check if they report sighting a bird that normally does not range in their area. In Canada, Frogwatch has set up systems for reporting and mapping observations so that volunteers can see the results of their input immediately. And Earthdive is working on a global scale, allowing recreational divers and snorkellers to record their experiences. Members can search and explore dives, snorkel trips, science logs, and personal experiences recorded all over the world. By including sightings of key indicator species during their dives and trips, Earthdive members are creating a daily global snapshot of the state of our planet's oceans." (http://www.worldchanging.com/archives/002974.html)
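The automatic double-check that Project Feeder Watch performs could be sketched as a simple range validation: flag any report that falls outside the species' known range and ask the volunteer to confirm it. This is a hypothetical illustration, not FeederWatch's actual code; the species names, regions, and data structure are invented for the example.

```python
# Toy range table: species -> set of regions where it is normally seen.
# The entries are illustrative placeholders, not real range data.
KNOWN_RANGES = {
    "Northern Cardinal": {"eastern US", "central US"},
    "Snowy Owl": {"arctic", "northern Canada"},
}


def needs_double_check(species: str, region: str) -> bool:
    """Return True if a reported sighting falls outside the species'
    recorded range, so the site should prompt the volunteer to confirm."""
    known = KNOWN_RANGES.get(species)
    if known is None:
        return True  # species not in the table: always ask for confirmation
    return region not in known


# A Snowy Owl reported in the eastern US would trigger a confirmation
# prompt; a Northern Cardinal there would not.
print(needs_double_check("Snowy Owl", "eastern US"))
print(needs_double_check("Northern Cardinal", "eastern US"))
```

The point of the design is that the check is cheap and automatic: contributors are not blocked from submitting unusual sightings, only asked to look twice, which preserves rare-but-real observations while filtering casual mistakes.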
Collaboration between experts and amateurs
Citizen science projects raise questions of accuracy and of how experts and amateurs can cooperate.
By Adam Glenn, at http://www.poynter.org/column.asp?id=31&aid=116168 (see our entry on Citizen Journalism for extra context):
"I think such citizen science projects offer valuable models that can be applied to citizen media projects:
1. Rigorous data collection. The Bird Count uses carefully developed methodologies to avoid spoiling data with inaccurate or duplicate information. Likewise, citizen journalists can establish and disseminate guides for reporting and photography standards -- especially regarding verifiable info such as names, quotes, attribution, numbers and the like.
2. Pooling and verifying cumulative results. The sheer volume of overall data collected in the Bird Count ensures that, if any contaminated info does sneak in, it won't unacceptably distort the final result. That's an important lesson for citizen journalism sites, harking back to the journalistic principle of verifying information with multiple sources. Ideally, citJ projects should seek multiple iterations of information -- for example, requiring that assertions by one contributor be verified by others.
3. Vetting amateurs. Even small hurdles like registration forms and minimal fees can weed out the unworthy, while extensive mandatory training can seriously raise the level of contributions (as well as the cost, unfortunately). It's worth considering whether citJ sites might benefit from mandatory online tutorials, accuracy checklists or story forms to make sure vital info isn't left out of submissions.
4. Expert-amateur interaction. Most citizen science projects aim to pair the novice with either experienced amateurs or experts themselves, fostering mentoring relationships that ultimately improve the data. Why shouldn't experienced citizen journalists (or professional journalists associated with new media or even mainstream media) provide the same mentoring? This could be done via workshops, in-the-field training, online editing, or other means. If the gains in media democratization aren't enough for you, how about the ways in which the resulting bond with the community and its most active news consuming members could pay off in loyalty to the news product?" (http://www.poynter.org/column.asp?id=31&aid=116168)
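Glenn's second point, requiring that assertions by one contributor be verified by others, could be sketched as a simple corroboration log: an assertion is only treated as verified once enough distinct contributors have independently reported it. This is a minimal sketch of the idea, assuming an invented threshold of two contributors; it is not taken from any real citizen journalism platform.

```python
from collections import defaultdict

# Illustrative threshold: an assertion needs at least this many distinct
# contributors before it counts as verified. The value is an assumption.
REQUIRED_CONFIRMATIONS = 2


class AssertionLog:
    """Track which contributors have reported each assertion."""

    def __init__(self):
        self._reports = defaultdict(set)  # assertion -> contributor ids

    def report(self, assertion: str, contributor: str) -> None:
        self._reports[assertion].add(contributor)

    def is_verified(self, assertion: str) -> bool:
        # Verified once independently reported by enough distinct people;
        # repeat reports by the same contributor do not count twice.
        return len(self._reports[assertion]) >= REQUIRED_CONFIRMATIONS


log = AssertionLog()
log.report("road closed on Main St", "alice")
print(log.is_verified("road closed on Main St"))  # single source: not yet
log.report("road closed on Main St", "bob")
print(log.is_verified("road closed on Main St"))  # corroborated
```

Counting distinct contributors rather than raw reports mirrors the journalistic principle Glenn cites: verification comes from multiple independent sources, not from one source repeating itself.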
Some other examples:
- Distributed Proofreading for Project Gutenberg, http://www.pgdp.net/
- Great Internet Mersenne Prime Search, http://www.mersenne.org/prime.htm
- NASA Clickworkers, http://clickworkers.arc.nasa.gov/top
- SETI@home, http://setiathome.ssl.berkeley.edu
- Christmas Bird Count
See our entry on Communal Validation