ALUMNI CORNER
Black and White and Read All Over
The U.S. News and World Report Ranking of Colleges and Universities
By Phillip M. Satow '63
President, Columbia College Alumni Association




Our undergraduate experience taught us to think independently, develop our analytical skills and comfortably query and challenge conventional wisdom. Perhaps it is to be expected, given our common Core experience, that so many of us are repulsed not only by the relative placement of Columbia in the U.S. News annual ranking of colleges and universities, but by the magazine's notion of evaluating and ranking elite institutions of higher learning at all.

In the 1999-2000 survey, Columbia was ranked 10th overall and fifth in the Ivy League. California Institute of Technology was ranked first, having jumped from ninth place last year because of a change in the statistical methodology instituted prior to the most recent rankings. Universities were allowed to count research budgets in their per-student expenditures, even though students may get no direct benefit from what research professors may be doing outside of class. This variable was worth 10 percent of a school's total score, and this year Cal Tech ranked first, MIT second and Johns Hopkins third in the category. Schools focused on scientific programs or engineering clearly benefited from the change in methodology. Also, until now U.S. News considered only a school's rank in the category of educational expenditures per student, not how much one school outpaced another. This year, schools benefited from large favorable variances or suffered from negative ones.
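To see why that change matters, consider a rough sketch in Python with invented spending figures (the magazine's actual formula is more elaborate and not fully disclosed). Under the old rank-only approach, the size of a spending gap is invisible; once favorable variances count, a school that outspends the field by a wide margin pulls far ahead.

    # Illustrative only: hypothetical per-student expenditures, not U.S. News data.
    spending = {
        "School A": 190,  # a research-heavy outlier, in the Cal Tech mold
        "School B": 90,
        "School C": 80,
        "School D": 70,
    }

    # Old approach: only the rank order matters, so the 100-point gap between
    # School A and School B earns no extra credit.
    ranked = sorted(spending, key=spending.get, reverse=True)
    rank_scores = {school: len(ranked) - i for i, school in enumerate(ranked)}

    # New approach: credit scales with how far a school outpaces the others,
    # so the big spender pulls far ahead of the pack.
    top = max(spending.values())
    variance_scores = {school: round(amount / top, 2) for school, amount in spending.items()}

    print(rank_scores)      # {'School A': 4, 'School B': 3, 'School C': 2, 'School D': 1}
    print(variance_scores)  # {'School A': 1.0, 'School B': 0.47, 'School C': 0.42, 'School D': 0.37}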

As Robert Gottlieb wrote in the online magazine Slate (August 1999), "The real reason Cal Tech jumped eight places this year is that the editors of U.S. News fiddled with the rules. ... In other words, Cal Tech didn't improve this year, and Harvard, Yale and Princeton didn't get any worse. If the rule hadn't changed, Harvard, Yale and Princeton would still be ahead." The president of Stanford, ranked sixth, agrees that the rankings' volatility "says more about inconsistent scoring methods than actual changes in quality." And as Gottlieb reminds us, sales of this annual issue of U.S. News are almost double the normal level, and a paperback version sells an additional million copies. U.S. News is in the business of selling magazines, and students and parents have no incentive to purchase this particular issue if the rankings look strikingly similar from one year to the next.

U.S. News editors believe that given the high cost of education today, prospective students and their parents should have as much comparative information as possible. Who can disagree with that? On their web site, however, the editors say that if ranking information is available for "household appliances," it is even more important that it be accessible to individuals making decisions involving more than $100,000. Why do they feel compelled to liken a four-year living and learning experience, pursued by students with unique needs and preferences, to consumer goods? How can they compare the choice of a college with a choice between brands of refrigerators!

U.S. News's overall ranking system relies on gathering data in 16 areas. The editors call these variables "indicators of academic excellence." Each indicator is assigned a weight. Most of the data comes directly from the schools. In the case of the National University grouping, of which Ivy League schools are a part, there are 228 ranked institutions.
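In spirit, the overall score is a weighted combination of those indicators. A minimal sketch, using only the six weightings cited in this column (they do not cover all 16 indicators, and the indicator scores below are invented for illustration, not reconstructed from U.S. News data):

    # Weightings cited in this column; the remaining indicators (about 4 percent
    # of the total) are omitted, so this is a partial, illustrative model only.
    weights = {
        "academic_reputation": 0.25,   # peer survey, 1 (marginal) to 5 (distinguished)
        "graduation_rate_and_performance": 0.21,
        "faculty_resources": 0.20,
        "student_selectivity": 0.15,
        "educational_expenditures": 0.10,
        "alumni_giving": 0.05,
    }

    def composite(indicator_scores):
        """Weighted sum of indicator scores, each already normalized to a 0-100 scale."""
        return sum(weights[name] * score for name, score in indicator_scores.items())

    # Hypothetical, already-normalized indicator scores for one school.
    example_school = {
        "academic_reputation": 92,
        "graduation_rate_and_performance": 90,
        "faculty_resources": 85,
        "student_selectivity": 95,
        "educational_expenditures": 70,
        "alumni_giving": 40,
    }

    print(round(composite(example_school)))  # roughly 82 for these made-up inputs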

The outstanding reputation of the Columbia faculty is downplayed by the rating system. "Faculty Resources" is evaluated and allotted a 20 percent weighting, but it comprises variables like faculty compensation, class size and the percentage of full-time faculty. There is no attempt to assess curricular strength or faculty eminence. The collective excellence of a departmental faculty is not ascertained by ratios and numerical values. The U.S. News system also does not judge the quality of individual academic departments, so a student cannot depend upon it to find, for example, a top English or economics department. Yet that is a factor a student should evaluate in the decision-making process.

Other indicators confound, much as "Faculty Resources" does. Why include both graduation rate and graduation rate performance (the difference between the actual six-year graduation rate and an expected rate based upon test scores and educational expenditures)? Among top schools, the differences in graduation rates are next to meaningless; in fact, lower rates may indicate higher standards of academic rigor rather than a less able student body. Does anyone believe that Yale's or Princeton's 95 percent graduation rate really indicates anything significantly different from Columbia's 90 percent rate? The U.S. News system also assigns disproportionate weight to graduation in general: "Graduation Rate" and "Graduation Rate Performance" combined carry a 21 percent weighting, compared with 20 percent for "Faculty Resources" and only 15 percent for "Student Selectivity." And why is "Alumni Giving" included, with a five percent weighting? What does the percentage of alumni contributing financially have to do with a school's academic excellence in a given year?

Columbia's ranking in some areas is noteworthy. Our selectivity was ahead of all Ivy League schools with the exception of Harvard, Yale and Princeton. Columbia's acceptance rate of 14 percent compared with Cornell's 34 percent, Penn's 29 percent and Yale's 18 percent. Only Harvard's 12 percent and Princeton's 13 percent were lower among the Ivies.

The U.S. News ranking formula places the greatest weight, 25 percent, on academic reputation, as determined by a survey of the subjective opinions of presidents, provosts and deans of admission at institutions in the same category. Schools are rated from 1 (marginal) to 5 (distinguished). Cal Tech, the No. 1 ranked university in overall score, had a 4.7 in this category; Columbia scored 4.6. Among Ivy League schools, only Harvard, Yale and Princeton were ranked higher than Columbia.

The obsession with a concrete rank order and numerical score is simplistic. Differences between the top 50 schools and the next 100, taken as groups, may have validity; quality differences within the top group of elite institutions, however, are exaggerated and do the schools an injustice. Further, gross rankings, with statistical significance purposefully blurred, highlight misleading differences rather than similarities in quality. The U.S. News methodology confuses more than it helps prospective students trying to weigh the relative merits of one top school against another.

Not surprisingly, there are college administrations today that are unduly influenced by these rankings. Some believe marketing-oriented administrators at the University of Chicago took steps to alter their core curriculum in response to disappointing rankings. Others believe the rankings encourage grade inflation. I know we will avoid the tendency, seen among alumni of some institutions, to let the rankings modify our attitudes or behavior. Don't you wonder how many potential Columbia students will apply to Johns Hopkins or Penn this year because their overall scores were 86 and Columbia's was only 85!

U.S. News's rankings are black and white and, unfortunately, read all over.

 