Introduction to Deconstructing University Rankings

In the 2005 edition of The Top American Research Universities report, we began disaggregating the medical school and engineering components from the federal research expenditures reported in our data for those universities with more than $20 million in federal research expenditures. The data for this exercise proved somewhat difficult to acquire, given the various ways in which universities report information to different agencies for different purposes. As is frequently the case for university data, reports provided to one agency or for one purpose do not necessarily match information collected for another agency or purpose, even if the information appears to address the same universe.

For this and subsequent reports, we use three data sets. The primary set comes from the National Science Foundation (NSF) and captures all federal research expenditures. The second set comes from the Association of American Medical Colleges (AAMC) and identifies medical school research expenditures, defined in the same fashion as the NSF data. The third set comes from the American Society for Engineering Education (ASEE) and captures engineering college research expenditures, again using the same definitions as the NSF data.
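As a rough sketch of how these three sources fit together, the Python fragment below merges them by institution. The file names and column labels (institution, nsf_federal, medical, engineering) are hypothetical placeholders, not the agencies' actual file layouts, which would need to be mapped into this shape first.

```python
import pandas as pd

# Hypothetical extracts from the three sources; real NSF, AAMC, and ASEE
# files use their own layouts and would need to be reshaped to match.
nsf = pd.read_csv("nsf_federal_research.csv")         # institution, nsf_federal
aamc = pd.read_csv("aamc_medical_research.csv")       # institution, medical
asee = pd.read_csv("asee_engineering_research.csv")   # institution, engineering

# Left-join on institution so campuses without a medical or engineering
# school are kept; missing component values mean "no such school reported".
combined = (
    nsf.merge(aamc, on="institution", how="left")
       .merge(asee, on="institution", how="left")
       .fillna({"medical": 0.0, "engineering": 0.0})
)
```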

However, in some cases, adding an institution's engineering expenditures from the ASEE data to its medical expenditures from the AAMC data produces a total larger than what the institution reported to the NSF for all research fields. This usually means the institution used slightly different definitions of what to include in the various data reports, which leads to some overlap. These inconsistencies recommend caution against drawing overly fine distinctions among institutions, because relatively small differences may well be data reporting artifacts rather than reflections of actual differences in performance. For the broader issue of understanding the general impact of medical or engineering programs on campus research competitiveness, these data serve rather well.
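Continuing the hypothetical combined frame sketched above, a minimal check for this kind of overlap might look like the following.

```python
# Flag institutions whose AAMC medical plus ASEE engineering expenditures
# exceed the all-fields total reported to the NSF -- a sign that the
# component reports used broader definitions and overlap the NSF figure.
combined["components_exceed_nsf"] = (
    combined["medical"] + combined["engineering"] > combined["nsf_federal"]
)
print(combined.loc[combined["components_exceed_nsf"], "institution"])
```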

To understand the impact of medical and engineering schools on ranking, the tables include columns for each institution's federal research expenditures as reported by the NSF, the research expenditures attributable to its medical school (if the campus has one), and the research expenditures attributable to its engineering school (again, if the campus has one). We then produced four different rankings: one based on the NSF total, one based on the NSF total less the medical school amount, one based on the NSF total less the engineering amount, and one based on the NSF total less both the medical and engineering amounts.
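Using the same hypothetical frame and column names as above, the four rankings might be derived as follows, with rank 1 assigned to the largest expenditure base.

```python
# Derive the four expenditure bases described above. Negative values can
# appear when the component reports overlap the NSF total, as noted earlier.
combined["less_medical"] = combined["nsf_federal"] - combined["medical"]
combined["less_engineering"] = combined["nsf_federal"] - combined["engineering"]
combined["less_both"] = (
    combined["nsf_federal"] - combined["medical"] - combined["engineering"]
)

# Rank each base separately (largest expenditures = rank 1).
for col in ["nsf_federal", "less_medical", "less_engineering", "less_both"]:
    combined[f"rank_{col}"] = (
        combined[col].rank(ascending=False, method="min").astype(int)
    )
```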

These tables show how much of an institution's rank on NSF federal research expenditures, a widely reported indicator, is attributable to the contribution of its medical and engineering units. They also allow a recalculation of the ranks that would result from ordering institutions by their NSF research expenditures without the medical or engineering contribution.

It is no surprise to close observers of these data that the relative position of institutions can change substantially after the medical school portion is subtracted. While some high-performing institutions with medical schools do indeed drop in rank, not all drop by the same amount. Similarly, in the reordering, not all institutions without medical schools improve dramatically in rank when compared to their counterparts minus the medical school contribution.

For a discussion of these issues, see Deconstructing University Rankings: Medicine and Engineering, and Single Campus Research Competitiveness (2005), available on this website.

The first table below, titled Medicine and Engineering, is the Excel version of the data provided in the publication above.

The second table, titled Federal Research with and without Medical School Research (2002-2008), extends this report format to subsequent years and focuses on institutions with more than $40 million in federal research expenditures.