Dear Newspapers, Please Stop Ranking My School


Dear Businessweek,

Last week you ranked my school the sixth-best school in America for undergraduate business. Thanks for the honor! Before I put my party hat on, though, let me take a few moments to understand exactly what I owe this honor to.

You see, for as long as I can remember, you’ve been releasing annual reports that tens of thousands of students, parents, and counselors use to judge schools. After four years of watching my school’s rank bounce around, it seems your math doesn’t really add up. So I took the liberty of reading your article explaining how the scores are calculated, but I still have a few questions for you.

Student Assessment (30% of Score)

I understand you surveyed almost 100,000 students from all over the country about topics ranging from the quality of their education to career resources. That’s fantastic - but to what extent does the administration get involved? After all, who wouldn’t want to report their school as “awesome” if it’s for the Businessweek rankings? I bet some schools would even go as far as creating a campaign to boost participation in the survey (with maybe a gentle nod to make the school look good?).

You also mention that the student assessment score is a weighted average of the past three years of data. For schools that were not surveyed in the past three years, you “used estimates calculated by a team of statisticians”. I’m an aspiring data scientist myself, so how exactly do you estimate data that never existed? [This would be super helpful for my thesis!!!]
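For reference, here’s roughly what I assume that weighted average looks like. The weights below are my own guess - you don’t publish them anywhere I can find:

```python
# A sketch of a three-year weighted average (the weights are my
# assumption, not Businessweek's - they aren't published anywhere I can find).
def student_assessment(scores_by_year, weights=(0.5, 0.3, 0.2)):
    """scores_by_year: survey scores, most recent year first."""
    return sum(score * w for score, w in zip(scores_by_year, weights))

print(student_assessment([8.2, 7.9, 8.5]))  # -> ~8.17
```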

Academic Quality (30% of Score)

Most of the metrics in this section are the usual suspects, but one stood out: the ratio of full-time students to full-time faculty (reported by the school). What do you classify as full-time faculty? Lecturers? Tenured professors? Grad students? I looked through your historical results to see if you ever published this ratio, but you only show the average class size. Why show us average class size when the metric that counts is the student-to-faculty ratio? Conveniently, none of these schools publicly lists its student-to-faculty ratio either. A little suspicious - since “full-time faculty” is open to interpretation here, this metric seems pretty easy to inflate.
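To see how much the definition matters, here’s some back-of-the-envelope arithmetic (all headcounts invented):

```python
# Back-of-the-envelope arithmetic (all headcounts invented) showing how
# the definition of "full-time faculty" moves the student-to-faculty ratio.
students  = 1000
tenured   = 50
lecturers = 50
grad_tas  = 100

print(students / tenured)                           # 20.0 -> strict definition
print(students / (tenured + lecturers + grad_tas))  # 5.0  -> generous definition
```

Same school, same people - a four-times-better ratio, just by counting generously.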

Also, what’s up with the “average number of hours students spend preparing for class per week”? Is this supposed to represent how hard our school is or how smart our students are? Let me know so I can adjust my answer accordingly on this year’s survey.

Employer Opinion (20% of Score)

The process for calculating employer opinion is:

  1. Request a list of recruiting companies from each school
  2. Narrow this list down and contact a subset of recruiters
  3. Record their responses and include them in your calculations.

My hunch is that schools are incentivized to provide as many local companies as possible. If a company recruits exclusively (or heavily) from X University, of course its recruiters will report the highest satisfaction with students from that university. If they didn’t, they’d probably stop hiring from that school! The truth is that different companies have different hiring standards; by allowing schools to select which employers you survey, you have no sense of normalization in your data set.
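Here’s a toy illustration of the problem (all numbers invented): two recruiters who grade on completely different scales, each surveyed about “their” school, produce raw averages that tell you about the graders, not the grads:

```python
# Toy example (all numbers invented): each school hands over its own
# friendly recruiters, and each recruiter grades on a different scale.
easy_grader  = [9, 9, 10, 9]   # recruits heavily from School A
harsh_grader = [6, 7, 6, 7]    # pickier shop surveyed for School B

school_a = sum(easy_grader) / len(easy_grader)     # 9.25
school_b = sum(harsh_grader) / len(harsh_grader)   # 6.5

# Without a common set of employers (or some per-recruiter baseline to
# normalize against), the gap measures the graders, not the graduates.
print(school_a - school_b)  # 2.75
```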

Median Salary (10% of Score)

Fair enough. We all gotta get paid. Moving along…

Feeder School (10% of Score)

A metric to measure “which schools send the most grads to the top MBA programs”. Let’s pause for a moment here - wasn’t the whole point of MBA programs to provide a business education to those who didn’t have any formal background in business? So why is this metric in the survey at all? Let me guess: you guys also rank MBA programs, don’t you?

Look, I Get It

You guys have mouths to feed too. Changing the rankings every year is exciting, sexy, and sells. I’ve been reading Businessweek since I was 14, so don’t get me wrong here, but your rankings mislead thousands of students each year as they make one of the most important decisions of their lives. Not only are you confusing innocent high schoolers who already know little about the real world (18-year-old me), but you’re also driving my school’s administration crazy trying to make sure our rank doesn’t plummet this year because your “statisticians” invented a new metric. Think about all the real work and value they could be producing instead of worrying about you guys!!!

If you’re a news company, then you owe your readership some integrity in your findings. You can start by validating your work (ensuring accurate responses from schools) and being a little more thoughtful about your metrics (hours spent on homework? That’s just silly). I daydreamed about business school thanks to your editorial staff, so this letter really is coming from the bottom of my heart. Please don’t mislead parents and future college students, though; we’ll have mouths to feed too, you know.


Disappointed,
Almost College Grad

The thoughts in this post are solely mine and are in no way affiliated with the University of Texas at Austin administration.