There’s a new player in the rankings game, but University administrators aren’t paying much attention.

Academic Analytics, a company formed by faculty and researchers from the State University of New York at Stony Brook and Educational Directories Unlimited, Inc., developed a new ranking system for doctoral programs that purports to measure universities purely on numerical data, ignoring factors like reputation and prestige.

First released in 2004, the Faculty Scholarly Productivity Index uses an algorithm to measure the productivity of faculty by considering the number of books and journal articles they publish. Awards and grants also factor into the equation.

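Academic Analytics has not published the exact formula behind the index. As a rough illustration only, a score of this kind could be computed as a weighted sum of each faculty member's counted outputs, averaged across a program; every weight, category and record below is a hypothetical assumption, not the company's actual method.

```python
# Hypothetical sketch of a faculty-productivity index.
# All weights and categories are assumptions for illustration;
# Academic Analytics has not published its actual formula.

WEIGHTS = {
    "books": 5.0,             # assumed value per book published
    "journal_articles": 1.0,  # assumed value per journal article
    "citations": 0.1,         # assumed value per journal citation
    "awards": 3.0,            # assumed value per honor or award
    "grant_dollars": 1e-5,    # assumed value per grant dollar
}

def productivity_score(record: dict) -> float:
    """Weighted sum of one faculty member's counted outputs."""
    return sum(weight * record.get(category, 0)
               for category, weight in WEIGHTS.items())

def program_score(faculty: list[dict]) -> float:
    """Average productivity score across a program's faculty list."""
    return sum(productivity_score(r) for r in faculty) / len(faculty)

# Two made-up faculty records, purely for demonstration.
faculty = [
    {"books": 1, "journal_articles": 12, "citations": 340,
     "awards": 2, "grant_dollars": 250_000},
    {"books": 0, "journal_articles": 7, "citations": 150,
     "awards": 0, "grant_dollars": 80_000},
]
print(round(program_score(faculty), 2))  # prints 41.15
```
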
So far, the company has released rankings only for 2004 and 2005. For years, those rankings were private.

University of Michigan administrators said the University did not actively participate in either study but was still included in the rankings.

In a portion of the 2005 index, which appeared publicly in the Chronicle of Higher Education for the first time earlier this month, the University placed 27th in a ranking of 50 large research universities, one spot behind Columbia University and one ahead of Northwestern University. Harvard University topped the list.

SUNY-Stony Brook came in 19th, ahead of the University of Chicago and Dartmouth College.

The index also includes lists of the top departments in 104 academic fields. The University appeared on these lists only once, for its applied mathematics program.

In an interview on Monday, University President Mary Sue Coleman said the study is misleading and that University administrators have “serious reservations about the methodology.”

“‘Junk in, junk out’ is our view on the subject,” Coleman said.

Rackham Dean Janet Weiss identified two major problems with the index.

She said the first problem stems from the fact that Academic Analytics developed the faculty lists for each university from its departments’ websites. Because some departments don’t update their websites regularly, faculty lists are often inaccurate.

“When we looked at a preliminary list of University faculty from Academic Analytics, we were amazed at how many errors of both omission and commission were included,” Weiss said in an e-mail interview.

The other problem stems from journal citations. Weiss said that while citation counts can be useful for measuring quality in some disciplines, fields like theater and art that don’t typically publish in journals shouldn’t be evaluated by them.

Weiss shared Coleman’s view of the index’s usefulness.

“We won’t rely on these flawed rankings,” she said. “However, we conduct our own evaluations of the quality of graduate programs, and we do rely on them to help improve the programs.”

Coleman said the University views the National Research Council’s ranking system as the most reliable, but the council hasn’t released new rankings since 1995.

The council conducts studies on behalf of the National Academies, a group made up of the National Academy of Sciences, the National Academy of Engineering, the Institute of Medicine and the National Research Council.

Weiss said the University is participating in the council’s new study, expected to be released later this year.

“We believe that these ratings will be much more informative than the Academic Analytics results,” she said.

27: The University’s rank among 50 large research universities in the new study, based on the amount of work published by faculty
38: Michigan State University’s rank
14: University of Wisconsin at Madison’s rank
2: University of California at San Francisco’s rank (tied with the California Institute of Technology)

Gabe Nelson contributed to this report.
