Sunday, November 28, 2010

Soft peer review? Social software and distributed scientific evaluation

Posted on Academic Productivity on February 21st, 2007 by dario.

“The solution I’d like to suggest is that online reference management systems implement an idea similar to that of anonymous refereeing, while making the most of their social software nature. The most straightforward way to achieve this would be, I believe, a wiki-like system coupled with anonymous rating of user contributions. Each item in the reference database would be matched to a wiki page where users would freely contribute their comments and annotations. Crucial is the fact that each annotation would be displayed anonymously to other users, who would then have the possibility to save it in their own library if they consider it useful. This behavior (i.e. importing useful annotations) could then be taken as an indicator of a positive rating for the author of the annotation, whose overall score would result from the number of anonymous contributions she wrote that other users imported. Now it’s easy to see how user expertise could be measured with respect to different topics. If user A got a large number of positive ratings for comments she posted on papers massively tagged with the tag “dna”, this will be an indicator of her expertise for the “dna” topic within the user community. User A will have different degrees of expertise for topics “tag1”, “tag2”, “tag3”, as a function of the usefulness other users found in her anonymous annotations to papers tagged respectively with “tag1”, “tag2”, “tag3”.”
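
The quoted mechanism is concrete enough to sketch in code. Below is a minimal Python sketch of the per-tag expertise score under the assumptions the post states: an author's score for a tag grows each time another user imports one of her anonymous annotations on a paper carrying that tag. All the names here (Paper, Annotation, ExpertiseIndex, record_import, expertise) are hypothetical illustrations, not anything the post specifies.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Paper:
    title: str
    tags: set[str]


@dataclass
class Annotation:
    author: str   # known to the system, hidden from other users
    paper: Paper
    text: str


class ExpertiseIndex:
    def __init__(self):
        # (author, tag) -> number of times other users imported that
        # author's annotations on papers carrying that tag
        self._scores = defaultdict(int)

    def record_import(self, importer: str, annotation: Annotation):
        """Another user saved the annotation to her own library."""
        if importer == annotation.author:
            return  # self-imports should not inflate the score
        for tag in annotation.paper.tags:
            self._scores[(annotation.author, tag)] += 1

    def expertise(self, author: str, tag: str) -> int:
        return self._scores[(author, tag)]


# Usage: user A annotates a paper tagged "dna"; two other users import it.
paper = Paper("On the structure of DNA", {"dna", "biology"})
note = Annotation(author="A", paper=paper, text="Key result in section 3.")
index = ExpertiseIndex()
index.record_import("B", note)
index.record_import("C", note)
print(index.expertise("A", "dna"))  # -> 2
```

One design choice worth noting: because imports are counted per tag of the annotated paper, a single popular annotation raises the author's score on every tag the paper carries, which is exactly how the post's topic-specific expertise measure emerges.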


I'd like to suggest a problem with this system: annotations are often used to evaluate the original text they are written about, but if an anonymous commentator writes an annotation that has little to do with the original text, this may not be obvious to anyone who hasn't read the original article. Someone relying on annotations to find useful information among many articles could be misled by an annotation from a less-than-competent annotator, which may even be more attractive precisely because it is simpler than the complex article it annotates.
An intriguing post! Metadata-based scientific evaluation seems like a suitable alternative to the current peer review process. Thank you!
