Research Reputation Metrics

From P2P Foundation


Discussion

Richard Price:

"new reputation metrics, developed by a number of startups, have been developed that incentivize scientists to share their research openly, rather than incentivizing them to put their research behind a paywall. Scientists are adopting them to better stand out from the crowd when applying for jobs. Examples of these new reputation metrics include inbound citation counts, readership metrics and follower counts.

Inbound citation metrics. A few years ago, Google Scholar started displaying inbound citation counts for papers – counts of how often a given paper was cited by other papers. Scientists have started to see these inbound citation counts as a way to demonstrate the impact of their work, and are increasingly including them in their job and grant applications. In some fields, such as physics, scientists are more proud of their inbound citation counts than they are of the journal titles on their resume.

Readership metrics. Academia.edu, Mendeley and ResearchGate are helping scientists to understand readership metrics around their research. These sites tell academics how many people are reading their work, as well as some demographic data about those readers. Increasingly these readership metrics are helping to influence hiring decisions by tenure committees.

Follower counts. Scientists are increasingly wanting direct, unmediated relationships with their audiences. Twitter, Facebook and other sites have put content creators directly in touch with their audiences. Scientists are saying ‘I want that direct relationship with my audience too!’ The personal brands of scientists are starting to eclipse those of journals, and follower counts help a scientist understand the growth of their personal brand.


In the pre-web era, scientists used to print out papers and read them in their labs in non-trackable ways. Increasingly scientists are reading and sharing papers online. The reputation metrics described above are derived from this online activity; two others that will emerge include:

  • Commenting metrics: As scientists increasingly comment on papers online, metrics will emerge to reflect the most discussed papers.
  • Recommendation metrics: As scientists increasingly share paper recommendations online, metrics will emerge to reflect the most shared/recommended papers.

To distinguish between mere popularity and genuine impact, these metrics will take into account the reputation of the scientists doing the commenting/recommending. The metrics will be recursive in the way that Google’s PageRank algorithm looks at the quality of the linking sites and not just their quantity." (http://techcrunch.com/2013/02/03/the-future-of-the-scientific-journal-industry/)
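
The recursive weighting Price describes can be illustrated with a short, hypothetical sketch, not taken from the article: a PageRank-style iteration in which each recommendation counts in proportion to the current reputation of the scientist making it. The data, function name and parameters below are invented for illustration only.

```python
# Hypothetical sketch of a PageRank-style reputation metric: each
# recommendation is weighted by the current reputation of the recommender,
# so endorsements from well-regarded scientists count for more.

def reputation_scores(recommendations, damping=0.85, iterations=50):
    """recommendations maps each recommender to the scientists they recommend."""
    # Gather everyone who appears as a recommender or as a recommendee.
    scientists = set(recommendations)
    for targets in recommendations.values():
        scientists.update(targets)

    n = len(scientists)
    scores = {s: 1.0 / n for s in scientists}  # start from uniform reputation

    for _ in range(iterations):
        # Every scientist keeps a small baseline score, as in PageRank's damping term.
        new_scores = {s: (1.0 - damping) / n for s in scientists}
        for recommender, targets in recommendations.items():
            if not targets:
                continue
            # A recommender's influence is divided among the people they endorse,
            # so the metric reflects the quality of the endorser, not just raw counts.
            share = damping * scores[recommender] / len(targets)
            for target in targets:
                new_scores[target] += share
        scores = new_scores

    return scores


if __name__ == "__main__":
    # Invented example: alice and bob each receive one recommendation, but
    # alice's comes from carol, who is herself widely recommended, so alice
    # ends up with the higher score.
    recs = {
        "dave": ["carol"],
        "frank": ["carol"],
        "carol": ["alice"],
        "eve": ["bob"],
    }
    for name, score in sorted(reputation_scores(recs).items(),
                              key=lambda kv: kv[1], reverse=True):
        print(f"{name}: {score:.3f}")
```

In this sketch a single recommendation from a highly recommended scientist outweighs one from an unknown, which is the distinction Price draws between mere popularity and genuine impact.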


More Information

  • An ecosystem of startups is working on building these new reputation metrics in science, including Richard Price's startup Academia.edu, as well as Mendeley and ResearchGate (other important players in the space are PLoS and Google Scholar).
  • Startups looking to help facilitate this include those mentioned above as well as Science Exchange, Figshare, Microryza, Quartzy, Altmetric and ImpactStory.