U.S. News Answers All Your Rankings Questions [Summary]
Normally, when U.S.-based applicants talk about “the top programs,” especially when they refer to the top X, as in the top 10 or the top 20, they are referring to U.S. News Rankings, the granddaddy of educational rankings. Today we’ll hear from U.S. News Executive Editor, Anne McGrath, and Chief Data Strategist, Robert Morse.
Interview with Anne McGrath and Bob Morse of U.S. News & World Report [Transcript]
*Transcript is not literal and includes paraphrasing.
Linda: I’m so pleased to have on the show Anne McGrath, Executive Editor of U.S. News & World Report, and Robert, AKA Bob, Morse, Chief Data Strategist at U.S. News & World Report. Anne has been with U.S. News since 1985 and Bob since 1976. Both are pioneers in the realm of rankings. Now, for all the questions I’ve ever had about rankings but was afraid to ask – and hopefully some of yours – I can actually get answers.
Anne and Bob, welcome to Admissions Straight Talk.
Anne: Thank you, good to be here.
Bob: Thank you. We’re looking forward to doing our best to answer all your ranking questions. Here’s your chance to get U.S. News to answer them.
Linda: I was just thinking that this is very unusual. Usually representatives of U.S. News are asking the questions, not answering them! And I’m usually the one reading about it. It’s a pleasure to have you here. I’m really looking forward to this and I appreciate your taking the time to join me.
You’ve both been at U.S. News since the beginning of the college rankings, I think it was the 1980s, right? Around 1987 – correct me if I’m wrong.
Bob: Yes. It actually started in ’83 and ’85. They were very short versions in the weekly magazine. 1987 was the first guide book. That’s when we started doing them annually.
Linda: Can you give a little of the backstory – how did this get started? What criteria were there and how did you choose the criteria that you focused on? Has there been an evolution of the criteria over this period in the rankings?
Bob: There has definitely been an evolution of the criteria. At first the rankings were based on reputation alone. College presidents picked their top 10 schools, and the schools that had the most mentions were the ones that were rated. ’87 was the last reputation-only ranking.
Then we realized that for the rankings to become more acceptable in higher education, we needed to bring in statistical data. We met with a lot of higher educators and came up with criteria to measure academic quality, which is the basis for the rankings today. We knew there was a big gap in consumer information on how to compare schools to each other, and we thought we could fill it by bringing in statistical and reputational data.
And we have evolved the methodology over time. At the beginning we put more emphasis on inputs; recently we’ve been shifting the emphasis to outputs – graduation, retention, and predicted graduation rate – which now account for thirty percent of the methodology. We’ve deemphasized inputs, at least to some degree.
Linda: Like test scores, grades, etc?
Bob: Well, we never had grades. At one point we had yield, and we dropped yield. And we have reduced the weight of high school class standing because it’s become less prevalent on high school transcripts, at least on the east and west coasts – I’m sure you’re aware of that. We also added the predicted graduation rate, which wasn’t part of the initial methodology. So those are some of the changes we’ve made.
Linda: Is that specifically for the college rankings or are you also talking here about the grad rankings? What are some of the differences between the two?
Bob: The grad and college rankings are different for two reasons. The college rankings are at the institutional level only, meaning that we’re focusing on data at the undergrad level. The graduate rankings are at the program level – the MBA degree, the medical school, etc. Each one has its own methodology that picks up characteristics of the program; some of them are reputation only. For the college rankings, there are approximately 1,400 schools ranked with basically the same methodology and ranking factors, and each is compared to schools with similar characteristics.
Anne: The undergraduate best colleges ranking isn’t just one ranking. It’s really 10 different rankings. National universities are ranked against themselves, national liberal arts colleges are ranked against themselves, etc. It gives people more options to compare schools to their own peer group, in a way you often can’t do just by doing your own research.
Linda: It also gives people ideas as to what are peer schools. If school A appeals to you, but is too expensive or too far, you can find other similar schools that are less expensive or closer.
Anne: One of the real values to readers is that you can see schools that you might never have known about and might be of interest.
Bob: That was actually one of the bigger successes of the rankings. It has made people more aware of the variety of U.S. higher education. There’s still a lot of focus on top schools, but we made people more aware of the diversity of U.S. higher education because of using the categories.
Anne: One of my favorite rankings is the A+ Schools for B Students, which correlates quality and academic excellence with scores and grades in such a way that you can see what schools you can get into if you don’t apply with a GPA of over 4.0.
Linda: That sounds very valuable. Is that available online only, or in both the online and print versions?
Anne: Both places.
Bob: Everything that’s in print is online, but the online version has far more data. The paid version has the fullest rankings and data.
Linda: Just to clarify for myself, as well as for listeners. Bob, it seems to me that you’re the statistical guru behind the rankings. You make sure that everything is correct and meaningful and provides something of value. Anne, will you go a little bit into what your role is?
Anne: Sure. I’m the editor overseeing the publication of our print guidebooks. An advantage of seeing the rankings in print is that you can scan a lot of information on one page, without a lot of scrolling and clicking. It’s a much more compact visual version of what’s available online.
Linda: What is the purpose or primary benefits of the rankings material offered by U.S. News through the rankings?
Anne: It’s a great tool for families at the undergraduate level, or for potential grad students to be able to have a place to start with a lot of information. We never say it should be the basis for a decision.
Bob: It’s an incorrect use of the ranking to make it the sole determination in a decision.
Anne: The type of data that you see allows you to get an idea and a sense of where you might be a really good fit and begin to narrow your choices. It also makes schools visible to you that you may never have considered before.
Linda: Bob, do you have anything to add to that?
Bob: People are using the website more broadly than just focusing on the top schools. Ninety percent of the people who visit USNews.com are looking at schools outside of the top 10. There’s also a growing need to inform people of the differences between schools so they can make comparative choices. That’s the core reason we’re still doing the rankings – the consumer purpose.
Linda: I’m going to be very frank about it. What I frequently tell people is that the U.S. News rankings are a treasure-trove of data. The rankings themselves, the fact that a school is ranked 1, or 3, or 5, or 15, with a comparison between 6, 7, and 8 or 16, 17, and 18 is less valuable than the actual data that the rankings contain. To me, just this rich collection of data is the biggest benefit of the U.S. News rankings. So, in my work, I frequently suggest to applicants to use the data in the rankings to do research. I frequently say it’s a great place to start your research, it’s a terrible place to end it.
Bob: We would agree. There’s not a meaningful difference in the ranking range you cited, but there is one between the 10th ranked school and the 80th.
Anne: What you said about the data is so key. With college costs so high, people make the decision of where to go without considering things like how long it will take to graduate and freshman retention.
Bob: We definitely agree with you that students and their parents should visit a school, consider the environment, what it costs, what the activities are, what kind of majors are offered, etc. These are all things that are vitally important to consider, and you have to go way beyond the ranking.
Linda: I told my staff that I was going to be talking to you folks, and someone asked why some programs are included in the rankings and some of them aren’t. For example, Tufts’ PA program isn’t in the rankings of Physician Assistant programs.
Bob: For the graduate rankings, you have to have an accredited program at the time we create the ranking universe.
Linda: So for graduate schools it’s a matter of accreditation, and there’s a minimum size of the class, isn’t there? The survey responses have to have a minimum size?
Bob: To be included in the full-time MBA rankings you have to have a minimum number of graduates, and a percentage of graduates that are job-seeking. So there’s a minimum size for the graduating class. For the other grad rankings there isn’t an enrollment threshold to be included in the rankings. It’s whether you meet the accreditation standard. For some others it’s whether you offer a PhD.
Linda: It varies on what’s actually being ranked. That frankly makes a lot of sense.
Do you ever audit the information from the schools, or are you concerned about the veracity of the information you’re using, given that admissions directors can lose their jobs if they fall in the rankings and this kind of thing? There’s tremendous incentive to falsify the information you’re basing the rankings on.
Bob: There have been some cases where schools misreported, and we’ve taken them out of the rankings. We believe that, since we’ve written about it, such cases have been rare. We don’t audit the data, but we do cross-check it with public sources as a way of verifying it.
Linda: I mentioned that I do use that data that you provide, but I also use AMCAS’s MSAR for medical schools, and sometimes I see differences in the data.
So for example, U.S. News shows that a total of 8068 applicants applied to Mayo Clinic, 606 were interviewed, 168 were accepted, and 97 enrolled. MSAR showed 4991 verified applications, 437 as opposed to 606 interviewed and 52 matriculated. Those are fairly significant differences. And the funny thing is that the school supplied the data to both MSAR and to U.S. News. Do you have any idea what’s causing the difference?
Bob: I’m not sure the school actually supplied the data to MSAR. Doesn’t the medical school have an application system? They may be tallying the data themselves.
[NOTE: Subsequent correspondence confirmed that the discrepancy has nothing to do with unverified applications, which are not counted, and has everything to do with MSAR not including the Arizona apps. That correspondence also confirmed that MSAR and U.S. News take data integrity seriously.]
Linda: One possible thought I had: since you showed 8,068, your number may be the total number of applicants who actually clicked the box saying, “I want to apply to Mayo Clinic,” in this case. The MSAR number may be the actual number who submitted secondary applications – there would be a certain weeding-out there. That’s one possibility I thought of. If you know, you do, and if not, that’s fine.
One of the things that you alluded to Bob and Anne, is the increasing importance of test scores for admissions. Do you agree with this assessment? Why or why not?
Anne: It’s probably true that the rankings make test scores visible, and schools may be trying to raise their test scores. They may just be trying to improve the quality of students that they’re bringing in. At the same time, a growing number of schools are going “test-optional,” and making tests not a necessary part of the admission process any more.
Bob: There are many trends with test scores. Of the 1,400 schools that we rank, nearly 30% are now test-optional, and at those schools the scores are less important than when they were required of all students. Among schools where testing is still required of everyone, test scores are still emphasized and important.
Anne: Although there are many factors that are looked at when considering admitting a student, academics are still foundational. The test scores are one measure that help them compare students across the very diverse school systems they’re accepting students from. AP classes are also important because that shows a certain ability to handle the rigors of college courses. Admissions departments are very concerned that they bring in students that are going to succeed.
Bob: Test scores count for 8.125% in our ranking. They’re not the most important factor, but they are considered.
Linda: I don’t have any problem whatsoever with including test scores in the rankings. I’ve seen articles over the last 20 years where deans were very, very critical of U.S. News in particular for including test scores in the ranking. It seems to me, especially at the graduate school level – if I look at business school, or to a lesser extent medical school, and at law school prior to 2008 – that test scores were going up. I don’t know that people are really getting smarter or that much more prepared, but the way they are prepared has changed: more and more are taking test prep, and test prep is getting better and better.
The other factor, I think, that has very much unintended consequences, and has heightened competition enormously at the elite undergraduate universities is the common app.
The fact that it’s very easy to send off multiple applications to many top universities has caused people to do so; in terms of time, it’s lowered the cost of applying. And I think that has contributed to a certain frenzy in the admissions market for higher-ranked undergraduate schools.
In the graduate market, what I’ve seen is an unending rise in average GMAT scores. If you look at U.S. News’ top 10 business schools, I think the average GMAT score is 724, and only one school has an average under 720. When I first started doing this work, a score of 700 was considered excellent. So are schools looking more at the GMAT score? Is test prep improving? I think those are all legitimate questions. I also think the average GMAT score and the ranking are being used to impress alumni, potential donors, and potential students. In that sense, they may be used in ways that you folks never intended, and the test-makers never intended.
Bob: Generally speaking, the GMAT or GRE is required at the full-time MBA level. There are some part-time MBA programs where they’re not required. There are several theories: there is a bigger applicant pool, there are more international students, there’s test prep so the students are getting higher scores.
Anne: At the elite level, at least at the undergraduate level, full classes of qualified students are being turned away. At that level, they have many more people with perfect SAT scores than they can take. This is probably true at the most elite graduate programs too.
Linda: You don’t get the perfect scores as much as in the undergraduate level, but you do get very high scores, and these scores are going up. Like I said, if I take the average GMAT scores for the top 10 business schools, they’re going up.
AMCAS came out with a new scoring system for the MCAT, which we’re actually going to talk about in a second. They hoped the average score would be right around 500 and that schools would focus less on the highest MCAT scores and look at other things. But the average MCAT score of matriculating medical students in the U.S. has been going up ever since the new scoring system came out.
So I wouldn’t agree with blaming U.S. News for this. I can remember being with a law school dean years ago who got up and blasted people who focused solely on GPA and LSAT scores as criteria for admission to law school – it’s a holistic approach, he said, much more diverse than that. Then he got off the stage, there was a panel discussion of law school admissions people, and all they talked about was GPA and LSAT – that was it. Now obviously some schools are taking the GRE too. I’m not sure it’s fair or right to blame or credit U.S. News for this focus.
Bob: In the graduate programs you’re citing, MCAT is required, GMAT is generally required, the LSAT is required by the programs. U.S. News isn’t setting the standard – we’re reflecting the standard set by the accrediting body, or the schools themselves. The highly ranked schools have better scores – it takes higher scores to get into the better schools. Applicants can assess their competitiveness and see whether it makes sense for them to invest the time and money in applying.
Linda: Can you give us an overview of how you put together the rankings? What’s the process? Bob, why don’t you start in terms of the actual rankings, and then Anne, you can describe the additional content and production process.
Bob: I also have an answer for your Tufts question. We actually ranked the PA programs the last time; the rankings were published in 2015, and we did the surveys in 2014. Tufts’ first class started in January 2013, so at the time we did it, they had provisional accreditation. For the current PA ranking, you had to be fully accredited to be included.
In terms of how we put together the ranking: We conduct our own surveys. We collect all of the statistical and reputational data in-house. The surveys are updated and launched to the schools several months before the results are released. The data is collected and QA-ed over the course of a couple of months. The results are published on the website a few months after the process started.
Anne: While he’s doing that, my staff and I are figuring out the most compelling trends and which schools we want to take readers inside. The rankings are the centerpiece of the guidebooks, but there is a lot of other information for potential students. This is published both in the guidebooks and online. Chapters include visits to 12 schools, how high school students should view their high school careers to put themselves in the best light when applying to college, and financial information. The process takes about five months.
Linda: Why did U.S. News develop “The Best B-Schools” ranking – the paper version, as opposed to the old grad schools ranking – and can you tell us a little about what it is?
Anne: We’re still doing the graduate school project and ranking and online content. We realized that we weren’t able to offer depth in all of the different types of programs in one place. Business is one of the most popular undergraduate and graduate programs, and we felt it would benefit the most people to have an in-depth look at business schools. We have best MBA programs, part-time MBA programs, online MBA programs, and executive MBA programs. We also include the best undergraduate business programs. The majority of the book focuses on an overview of what the trends are in business education. MBAs are much more specialized now. One of the most exciting trends is U.S. business schools partnering with international business schools, allowing students to study in the U.S. and abroad, and finishing with more than one degree and/or certificate.
Linda: What are your plans for the rankings and guides in the future?
Anne: For the next year, we’re sticking with what we’ve got. If the business schools ranking proves popular, we may expand that into a series.
Bob: We’re exploring other ways of doing other rankings.
Linda: In terms of how I look at the rankings, one thing I find particularly useful is the more specific you are, the more valuable it is.
Bob: One of the most valuable things is undergraduate majors. We’re looking at doing things with more discrete majors.
• U.S. News Grad Rankings
• U.S. News College Rankings
• U.S. News Guides on Amazon
• Medical School Selectivity Index [Can I Get Into My Dream School?]
• Business School Selectivity Index [Can I Get Into My Dream School?]
• Fitting In and Standing Out: The Paradox at the Heart of Admissions
• Optimize Your Graduate School Application: Grades, Scores, Essays, Resume, Activity History, and More
• UVA MS in Global Commerce: 3 Continents, 2 Masters, 1 Amazing Year
• Interview with John Byrne
• LSAT, Debt, and Bar Passage with Law School Transparency