Christmas Bird Count



Description

By Adam Glenn at http://www.poynter.org/column.asp?id=31&aid=116168

"This annual citizen science effort runs for several weeks in early winter. It uses tens of thousands of volunteers to collect widely dispersed data from hundreds of sites around the country. The National Audubon Society has been running it for over a century now, and it's no publicity stunt. (I wrote more about this project in a Jan. 4 posting at NewAssignment.net.)

This project is taken very seriously by the scientific community because this data helps ornithologists understand bird population shifts, environmental pressures, serious threats to given species, and more. It's spawned similar data-pooling projects among birders, and is often cited by citizen science advocates as a prime example of how amateurs can work side-by-side with experts to share in a deeper scientific endeavor." (http://www.poynter.org/column.asp?id=31&aid=116168)


Commentary

By Adam Glenn at http://www.poynter.org/column.asp?id=31&aid=116168

See also our entry on Citizen Journalism for additional context.

"I think such citizen science projects offer valuable models that can be applied to citizen media projects:


1. Rigorous data collection. The Bird Count uses carefully developed methodologies to avoid spoiling data with inaccurate or duplicate information. Likewise, citizen journalists can establish and disseminate guides for reporting and photography standards -- especially regarding verifiable info such as names, quotes, attribution, numbers and the like.


2. Pooling and verifying cumulative results. The sheer volume of overall data collected in the Bird Count ensures that, if any contaminated info does sneak in, it won't unacceptably distort the final result. That's an important lesson for citizen journalism sites, harking back to the journalistic principle of verifying information with multiple sources. Ideally, citJ projects should seek multiple iterations of information -- for example, requiring that assertions by one contributor be verified by others.


3. Vetting amateurs. Even small hurdles like registration forms and minimal fees can weed out the unworthy, while extensive mandatory training can seriously raise the level of contributions (as well as the cost, unfortunately). It's worth considering whether citJ sites might benefit from mandatory online tutorials, accuracy checklists or story forms to make sure vital info isn't left out of submissions.


4. Expert-amateur interaction. Most citizen science projects aim to pair the novice with either experienced amateurs or experts themselves, fostering mentoring relationships that ultimately improve the data. Why shouldn't experienced citizen journalists (or professional journalists associated with new media or even mainstream media) provide the same mentoring? This could be done via workshops, in-the-field training, online editing, or other means. If the gains in media democratization aren't enough for you, how about the ways in which the resulting bond with the community and its most active news consuming members could pay off in loyalty to the news product?" (http://www.poynter.org/column.asp?id=31&aid=116168)


More Information

  1. See our entries on Citizen Science and Citizen Journalism
  2. http://en.wikipedia.org/wiki/Christmas_Bird_Count
  3. Stardust@home