Friday, September 28, 2007
[T]he use of these download data in rankings gives people an incentive to do the opposite of what SSRN was designed to do in the first place: encourage people to look at each other's work and to engage with it ...
I'll add here that there is an additional difference between download counts and citation counts. While citation counts are also problematic, they cannot really be suppressed: nothing stops any motivated party from going out and counting citations. All you need is the time and the technology, and you can count the citations of any given paper in any set of publications you like. SSRN download counts, by contrast, are not (so far as I know) inherently public data. If SSRN's management decided that the information the download data provide is less valuable than the distortions they cause, it could simply stop publishing them. If I'm right that they have this option, I wish they would exercise it. To my mind, the cost/benefit analysis clearly disfavors publishing these numbers, especially because they undermine what I take to be SSRN's core purpose.
That said, I also wanted to add a criticism that was suggested to me by my new [tax] colleague at GW, Sarah Lawsky. ... Sarah suggests, without taking a position on whether it is actually a useful exercise to count downloads, that downloads need to be adjusted for the equivalent of inflation. ... [T]hose who do think that downloads are a good measure of something would make a more compelling argument if they could identify what they think they're measuring and then devise an appropriate price deflator.
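To make the deflator idea concrete, here is a minimal sketch (in Python, with invented numbers) of what such an adjustment might look like: if overall SSRN download volume grows over time, a paper's raw count could be deflated by the ratio of total site-wide downloads in its year to total downloads in a base year, just as nominal dollars are converted to constant dollars. The annual totals below are hypothetical placeholders, not actual SSRN figures, and total downloads is only one candidate denominator.

```python
# Hypothetical annual totals of all SSRN downloads (placeholder numbers,
# not real data), used to build a download "deflator" by analogy to a
# price deflator: an index of overall download volume relative to a base year.
TOTAL_DOWNLOADS_BY_YEAR = {
    2004: 1_000_000,
    2005: 1_400_000,
    2006: 2_000_000,
    2007: 2_800_000,
}

BASE_YEAR = 2004

def deflator(year: int) -> float:
    """Index of overall download volume in `year` relative to the base year."""
    return TOTAL_DOWNLOADS_BY_YEAR[year] / TOTAL_DOWNLOADS_BY_YEAR[BASE_YEAR]

def real_downloads(raw_count: int, year: int) -> float:
    """Express a paper's raw download count in constant base-year downloads."""
    return raw_count / deflator(year)

# Example: 500 downloads in 2007 are "worth" fewer base-year downloads
# than 500 downloads in 2004, because overall volume has grown.
print(real_downloads(500, 2007))  # ~178.6 in 2004-equivalent downloads
print(real_downloads(500, 2004))  # 500.0
```

Which denominator is appropriate turns on exactly Sarah's question of what downloads are supposed to measure: one might instead deflate by the number of papers posted, the number of authors, or the number of registered SSRN users.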
In our article, Ranking Law Schools: Using SSRN to Measure Scholarly Performance, 81 Ind. L.J. 83 (2006), Bernie Black (Texas) and I concede that the SSRN download rankings are imperfect measures of faculty scholarly performance but argue that they complement the other existing imperfect ranking methodologies of reputation surveys, productivity counts, and citation counts. In particular, the differing biases of SSRN download counts and citation counts can produce a more accurate picture when viewed together than either does in isolation. Bill Henderson (Indiana) goes further and argues that SSRN downloads provide a "superior measure of faculty productivity" compared with the other ranking methodologies.
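The intuition that two differently biased measures can combine into something better than either alone can be illustrated with a toy simulation (my own illustration, not taken from the article): if downloads overstate and citations understate a school's "true" scholarly performance by roughly offsetting amounts, a simple average of the two tracks the truth more closely than either measure on its own. All numbers here are invented for illustration.

```python
import random

random.seed(0)

# Toy model (invented for illustration): each school has a true quality score.
# Downloads overstate it and citations understate it, each with added noise.
true_quality = [random.uniform(0, 100) for _ in range(50)]

downloads = [q + 10 + random.gauss(0, 15) for q in true_quality]  # upward bias
citations = [q - 10 + random.gauss(0, 15) for q in true_quality]  # downward bias
combined  = [(d + c) / 2 for d, c in zip(downloads, citations)]   # biases offset

def mean_abs_error(estimates):
    """Average distance between an estimate and the true quality score."""
    return sum(abs(e - q) for e, q in zip(estimates, true_quality)) / len(estimates)

print(f"downloads alone: {mean_abs_error(downloads):.1f}")
print(f"citations alone: {mean_abs_error(citations):.1f}")
print(f"combined:        {mean_abs_error(combined):.1f}")  # lowest of the three
```

The combined measure wins here for two reasons: the opposite biases cancel in the average, and averaging two independent noise terms shrinks the remaining noise. Whether real download and citation biases offset this neatly is, of course, an empirical question.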