New Measure of Scholarly Output Puts Princeton and Johns Hopkins Historians at Top
The Faculty Scholarly Productivity Index, partly financed by the State University of New York at Stony Brook and produced by Academic Analytics, a for-profit company, rates faculty members' scholarly output at nearly 7,300 doctoral programs around the country. It examines the number of book and journal articles published by each program's faculty, as well as journal citations, awards, honors, and grants received. The company has given The Chronicle exclusive access to some of its data, including rankings of the top 10 programs in 104 disciplines.
The most recent index, based on data from 2005, contains plenty of surprises. Some relatively unknown programs rank higher than Ivy League and other institutions with sterling reputations. Take English. The index ranks the University of Georgia at No. 2, while Columbia, Cornell, Duke, Harvard, and Yale Universities and the Universities of Pennsylvania and Virginia don't even crack the top 10.
The data on History departments show the top ten, in order, as: Princeton, Johns Hopkins, Harvard, the University of Maryland at College Park, and Yale; tied at sixth, NYU and Loyola University Chicago; then Ohio State, Rice, and Northwestern. (Law schools are not ranked in this index.)
The statistics for all programs seem low in terms of the percentage of faculty with books and articles published. This is because only recent data were used: books from 2001-05 and articles from 2003-05. Some grants were counted from 2003-05, some awards from 2001-06, and Nobel Prizes within the past 50 years.
The service is lauded by those who created and subscribe to it, but it is not just the unexpected rankings that suggest caution is in order.

The varying date ranges for the different productivity measures strike me as problematic. Why not pick one date range -- say, five years -- and stick with it across categories? That would make it easier to track all the data, to measure results over time, and to see whether particular events have an impact. In fields where research can take years and the review process at peer-reviewed journals is lengthy, having few articles within a short period is not necessarily a sign of low productivity, since articles might simply cluster in a period the survey misses.

The very low citation numbers across the board also suggest that, for History, the wrong data were collected. For most historians, books, not articles, are the principal publications. But the Index does not count journal citations to books, only citations to articles.
When reading these rankings, I would proceed with caution.