Let’s see how full-time MBA programs in the U.S. fared this year on the BW rankings…
There were some huge changes this year! Let’s take a look at some of the highlights:
• Newcomers to the top 20 this year are Yale SOM, which made a huge jump from 21st place to 10th; Maryland Smith, which went from 24th to 17th; and Emory Goizueta, which jumped from 22nd to 18th.
• There are three new schools in the top 10 this year – Yale SOM, as mentioned above; Columbia Business School (13th in 2012 and 5th this year); and CMU Tepper (up just one spot, from 11th to 10th).
• Beyond that, there was some major shifting in the rankings. The composition of the top 3 changed this year: Wharton and Booth remain, but in a different order, while Harvard Business School fell from 2nd place to 8th.
• UVA Darden also fell significantly this year, from 10th place to 20th.
• Big jumpers further down the rankings include Rice University Jones (from 34th to 25th); UC Irvine Merage (43rd to 31st); and Rochester Simon (50th to 38th).
• The schools that fell the most in the rankings include Texas A&M Mays (26th to 42nd); University of Wisconsin-Madison (33rd to 44th); Boston University (39th to 57th); Babson Olin (from 42nd to 58th); Thunderbird (45th to 62nd); and Arizona Carey (49th to 67th).
And here’s the scoop on the best U.S. undergraduate business schools in 2014…
Some highlights include:
• Newcomers to the top 20 are Northeastern (from 25th last year to 19th this year) and CMU Tepper (from 24th last year to 17th this year).
• The only new school in the top 10 this year is Indiana Kelley, which jumped from 13th place last year to 8th place this year.
• Michigan Ross fell from the top 10, from 8th place to 12th place.
• Big jumpers include Southern Methodist Cox, which jumped from 30th to 21st place; Babson, which jumped from 36th to 26th; UMass Amherst Isenberg, which jumped from 45th to 36th; Bryant, which jumped from 63rd to 49th; and Case Western Reserve Weatherhead, which jumped from 69th to 50th.
• Big falls include Villanova, which fell from 15th place to 24th; U of Illinois Urbana-Champaign, which fell from 21st to 34th; and James Madison University, which fell from 29th to 40th.
For details on the ranking methodology, see:
Analysis of the 2014 Businessweek Rankings
The Basics of BW’s Rankings Remain Unchanged
This year, as in the past, BW surveyed recruiters and students. The recruiter satisfaction results comprise 45% of the ranking, the student satisfaction survey results comprise another 45%, and the remaining 10% is determined by the “expertise of each school’s faculty,” as evidenced by faculty research published in prominent academic journals (a.k.a. intellectual capital).
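To make the 45/45/10 weighting concrete, here is a minimal sketch of how such a composite score could be computed. The weights come from BW’s stated methodology; the component scores and the `composite_score` helper are hypothetical illustrations, not BW’s actual data or formula.

```python
# BW's stated weights: 45% recruiter survey, 45% student survey,
# 10% intellectual capital (faculty research).
WEIGHTS = {"recruiter": 0.45, "student": 0.45, "intellectual_capital": 0.10}

def composite_score(scores: dict) -> float:
    """Weighted sum of the three component scores (each assumed on a 0-100 scale)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical school: strong with students, weaker with recruiters.
example = {"recruiter": 60.0, "student": 95.0, "intellectual_capital": 70.0}
print(composite_score(example))  # 0.45*60 + 0.45*95 + 0.10*70 = 76.75
```

A school like Smith, ranked #1 on student satisfaction but #51 on the employer survey, shows how two very different component profiles can still average out to a mid-pack overall rank under this kind of weighting.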
What’s New in BW’s Rankings Methodology?
• The employer ranking reflects this year’s data only. Previous rankings used data from the last three surveys – six years of biennial rankings data – while weighting the most recent year most heavily.
• BW surveyed fifteen times as many recruiters this year as it did in previous years. Previously, BW surveyed major recruiters who tended to recruit at multiple business schools. This year, BW attempted to survey as many MBA recruiters as possible, including “recruiters” who recruit primarily, if not exclusively, at their alma mater. This increased survey size is a major methodology change. Alumni recruiters may have a certain bias toward the school they attended; BW attempted statistically to reduce the impact of that bias, but it probably helped smaller schools like Duke, Tepper, and Yale, and hurt the traditional leaders, like Harvard, Wharton, and Chicago.
Impact of the Methodology Changes
• Surprise! The results will shock many applicants. Seven programs, including Duke and Yale, rank above HBS and MIT, and Indiana Kelley and Maryland Smith rank above Haas, NYU Stern, and Darden.
• Reemphasized importance of understanding methodology. The changes highlight the need for anyone using the rankings as an indication of “quality” or even reputation and brand value (a bad idea in my book) to look at the underlying data. Smith is ranked 17th overall, yet it was ranked #1 for student satisfaction and #51 in the employer survey ranking. Applicants to Smith should inquire about what is changing in its career management center; clearly there is a satisfaction gap that has to be addressed.
• Increased volatility. Since BW has removed older rankings data from the ranking and has dramatically widened the survey pool while incorporating alumni recruiters, you are guaranteed to see more changes and more radical changes than with the previous methodology.
• Cognitive Dissonance. Either BW rankings will lose credibility because they don’t conform to expectations and will be more volatile, or people’s perception of the programs will change because of the BW rankings.
My money is on the former: loss of credibility. If BW’s results become less stable and predictable (like The Economist’s), they are more likely to lose credibility than to contribute to changes in school reputation.
As always, my best advice to applicants reviewing the rankings is to:
• Use specialty rankings to get a sense of what schools excel in your areas of interest.
• Use the data that the ranking databases provide.
• If you have any thought of actually using the overall rankings, understand what they measure, and ask yourself if those qualities are of paramount importance to you. BW has been wonderfully transparent and even shared the questions actually asked in the survey.
• Layer in reputation and brand, i.e. ranking, after determining what schools best support your goals and are most likely to accept you.
By Linda Abraham, president and founder of Accepted.com and co-author of the new, definitive book on MBA admissions, MBA Admission for Smarties: The No-Nonsense Guide to Acceptance at Top Business Schools.