Paul L. Caron

Friday, December 11, 2015

Rutgers Faculty Rebels Against Use Of Metrics To Assess Their Scholarly Performance

Inside Higher Ed, Refusing to Be Evaluated by a Formula:

Rutgers faculty members, citing philosophical concerns and errors, are pushing back against the use of Academic Analytics to evaluate their productivity. 

With the advent of Google Scholar and other metrics for faculty productivity, advancing one’s career as a professor is much more of a numbers game than it used to be. Still, the traditional system of peer review in hiring, tenure and promotion decisions has retained a good deal of nuance. Scholars in the same field as those they’re evaluating know that while one project may not be as prestigious as another, for example, a good degree of academic innovation might be worth a little professional risk.

But is that system under threat? Full-time faculty members at Rutgers University at New Brunswick say that it may be, in light of the university’s contract with a faculty productivity monitoring company called Academic Analytics.

Rutgers professors say they don't need the system, which is based on a patented algorithm for measuring faculty productivity, and that what little data they’ve been able to obtain so far include some serious errors. On Monday, the faculty of the School of Arts and Sciences will vote on a faculty union-backed resolution asking the university not to use Academic Analytics data in personnel and curricular decisions, and to give faculty members access to data collected by the company. ...

In 2013, Rutgers signed a $492,500, four-year contract with Academic Analytics, a New York-based company founded by Lawrence Martin, former dean of the Graduate School at the State University of New York at Stony Brook, and Anthony Olejniczak, a fellow anthropologist. Their premise was that colleges and universities needed a more dynamic set of data, updated on an annual basis, than is included in the National Research Council’s periodic rankings of graduate programs. ...

Data now available online include numbers of scholarly books and journal articles, citations, research funding by federal agencies, and awards earned by faculty members. Comparisons can be made between disciplines and institutions overall. Of course, individual departments have long looked at just these types of measures. But faculty members say that when they do the reviewing, they know in a way no formula can which journal article really made a difference to a field, which grant was particularly influential and which research helped a local community (even if it didn't win a big grant). This type of information, they say, is why they don't need a formula.

Early on, critics of the program pointed to the fact that there was no consideration of teaching or service in Academic Analytics’ formula.



Ehh. Who cares, it is all about how fast they are paving the highway to hell. A productive college faculty will just publish more leftist claptrap. Doing nothing would be a step up.

Posted by: Walter Sobchak | Dec 12, 2015 3:14:36 PM

So then if you are measuring nothing you will get...

Posted by: Rob Anderson | Dec 12, 2015 3:01:35 PM

I have been informed that a publication of mine leads my school -- one of the top two in its discipline according to US News -- in citations in Google Scholar. I want to tell you that this measure is total garbage.

Posted by: mike | Dec 12, 2015 10:34:17 AM

Old accounting axiom, "You get what you measure. Therefore, you better make sure of what you want before you start measuring."

Posted by: Dale Spradling | Dec 12, 2015 9:02:32 AM

I'm covering myself by posting on TaxProf, which is plainly worth more points than mere academics.

Posted by: mike livingston | Dec 12, 2015 5:17:32 AM