How the Latest (Provocative) BusinessWeek Rankings Reflect on MBA Programs

Today BusinessWeek released its 2014 Business School Rankings. There are a number of potentially provocative findings:

  • Duke has jumped to Number 1!
  • Harvard has fallen to Number 8!
  • Chicago Booth, after years, is no longer at Number 1!

Of course, the headlines alone tell us little, if anything. Here’s what’s going on behind them, and what it means about business school.

Three Metrics: Student Survey, Employer Survey, and Intellectual Capital

These are the metrics that drive the results. “Student survey” is what it sounds like. It has a glaring, critical flaw: MBA students (and alums, for that matter) are deeply acquainted with only one program – the program they attended. If a Harvard student likes his or her MBA program less now than two years ago, does this really give us any information about whether Harvard is a better MBA program than another one? I would contend that it does not. As a good MBA marketing class is quick to teach, an “experience good” has to be sampled to be judged, and the student experience across MBA programs hasn’t been sampled by the respondents widely enough for their responses to give good comparative information.

The employer survey makes a lot more sense, because the perspective of an employer is directly comparative. Employers interview, hire, and manage people from different MBA programs, and they can comment on the preferences they have developed over time. For example, a corporate recruiter once told me that she would hire a Michigan Ross MBA over a Harvard MBA any day, because the Ross MBA is equally qualified and has less attitude. I don’t necessarily agree (and I’m a Harvard undergrad), but I think that’s a relevant example of how recruiter opinions matter: they are based on more than mere conjecture about what life is like on another MBA planet.

The last metric, “intellectual capital,” has some obvious implementation flaws. It is based solely on counts of articles published by faculty in Harvard Business Review publications. This metric is clearly a case where available data – anything that can be counted – has been pressed into service to stand for something rather different. As a Booth alum, I know the Chicago folks are sneering at this metric in a big way, because Chicago’s faculty (as the school reminds anyone within earshot) is unusually loaded up with Nobel laureates. The good news (I guess?) about the intellectual capital metric is that it counts for only 10% of the ranking, as the rubric states.

…Not That Other Rankings are Better

I’m not harshly critical of the BusinessWeek rankings. My point, rather, is that placing a great deal of value on any single MBA ranking is a path of ignorance or suggestibility. The great thing about the rankings is that BusinessWeek publishes all of the subscores. So, even though the student surveys don’t give a definitive answer as to which program is best, they certainly offer interesting information that you might draw on when deciding whether and where to go to business school.

A couple of years ago, I interviewed Bob Morse, the guru behind the U.S. News & World Report business school ranking. The metrics for that ranking are different (and also published, and hence useful). One substantial input is average GMAT score, and another is acceptance rate (where a lower acceptance rate is better). These factors lead to a thicket of redundancy in the U.S. News rankings: the rankings drive popularity, which increases applications and hence pushes down acceptance rates for top schools, so the rankings actually feed back into themselves rather than measuring a phenomenon from a distance. To a degree, the U.S. News rankings make schools popular by reporting, in a misleading way, on the fact that they are popular. Also, while GMAT score and acceptance rate do not measure the same thing, they have common causal elements, since schools with higher average GMATs are certainly more selective on average.

In summary, the rankings are most useful for their intermediate variables, which you can use to construct your own ranking (which I will write about someday).

Some Truth/Currency to the Story

Some of the qualitative and anecdotal results of the report are telling. For example:

On Bloomberg Businessweek’s survey, HBS scored lowest among the top 10 business schools on its atmosphere for women and racial and religious minorities. “Diversity to HBS means playing around with grades and other measures of success until the numbers look right, rather than tackling the underlying issues,” wrote one student. And when asked to rate the “climate for people of all socioeconomic backgrounds,” HBS students ranked their school second to last out of 112 schools on the list. “There is a disparity of incomes among the students, which can cause pressure on those students who do not come from wealth to be able to participate in the extracurricular events,” one student complained. [Source]

Whether Harvard handles this better or worse than other programs, there seems to be a real problem with business school culture.