The Post's 'America’s Most Challenging High Schools' List Is Deeply Troubling - June 5, 2017
As a college counselor, part of my world is dictated by ratings. Parents and students regularly ask me about colleges they have heard of only because those schools appear on some ranking list. The most popular list I hear about is U.S. News’ canon, the Best Colleges report. This series of lists is so popular that students proclaiming Princeton better than Harvard are often just referring to their relative positions on the Best Colleges list -- numbers 1 and 2, respectively.
U.S. News has been publishing its college lists since the mid-1980s, so it is no surprise that families refer to them. Despite that longevity, the methodology has been hotly debated over the past fifteen to twenty years, and many question the actual value of these rankings. Some would say that lists are a necessary way to distill large quantities of research into digestible morsels. I would argue that this type of thinking, along with the U.S. News’ entire collection, has fueled our cultural desire to put things in ranked order. We love rankings and make lists for nearly everything these days. From college ranking sites like CollegeChoice.net, Niche, and College Raptor to Vox’s rankings of meat and Buzzfeed’s rankings of the hottest vice presidents, I daresay we are obsessed with rankings.
The trouble is not so much that we enjoy putting things in order on a list -- I certainly love my to-do lists and could not go a day without them. Rather, the issue is that rankings are fundamentally subjective. The search for a college is a deeply personal process that challenges students’ notions of themselves, their views of the world, and what they see for their own future. It pushes students to envision themselves beyond their current situation -- a significant developmental challenge for any adolescent -- and to imagine where they might like to be in a few years’ time. Very little of this process can be distilled into any sort of list. Sure, there are deadlines and application materials that might end up on a checklist, but I rarely use a ranked list of colleges with a student, even for reference purposes.
Furthermore, we know that the ranking-and-list method is flawed. It has been written about by the Atlantic, the New Yorker, the San Francisco Chronicle, Inside Higher Ed, the Huffington Post… you get the idea. Yet we continue to be smitten by the idea that one college might be better than another; that the ranking of Princeton as number 1 and Harvard as number 2 somehow matters.
With all of this turmoil surrounding college rankings, our obsession with lists and with one school being better than another has led to a complete set of rankings for high schools as well: in particular, the U.S. News’ Best High Schools Rankings (first published in 2007) and The Washington Post’s America’s Most Challenging High Schools (first published in 1998). With the U.S. News’ rankings receiving most of the anti-rankers’ wrath, this article will focus on the Post’s list.
Pioneering the field of high school rankings, the Post’s list is compiled by distinguished Washington Post columnist Jay Mathews, whose résumé is genuinely impressive. It uses a simple methodology:
“America’s Most Challenging High Schools ranks schools through an index formula that’s a simple ratio: the number of Advanced Placement, International Baccalaureate and Advanced International Certificate of Education tests given at a school each year, divided by the number of seniors who graduated that year. A ratio of 1.000 means the school had as many tests as graduates.”
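A ratio this simple can be sketched in a few lines of Python; the function name and the example numbers below are purely illustrative, not drawn from the Post's data:

```python
def challenge_index(tests_given: int, graduating_seniors: int) -> float:
    """Post-style index formula: AP/IB/AICE tests given at a school in a
    year, divided by the number of seniors who graduated that year."""
    if graduating_seniors == 0:
        raise ValueError("cannot compute an index with no graduating seniors")
    return tests_given / graduating_seniors

# A hypothetical school giving 450 tests with 300 graduating seniors:
print(f"{challenge_index(450, 300):.3f}")  # 1.500
```

Note what the formula never asks: how students scored, only how many tests were administered.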
The list is easy to navigate and, honestly, quite addictive. I found myself searching for my alma mater (it did not make the list -- must not be rigorous enough?) and my current employer (also absent, because the list does not include independent schools). I rooted around for the high schools my friends attended and for local schools I am familiar with. My searching was fun but troubling.
The first troubling idea is that the list is built on a hierarchy of challenging versus non-challenging schools. The notion of what counts as challenging is incredibly subjective, and the methodology grossly oversimplifies that subjectivity. For example, based on Mathews’ methods, four states simply do not have ANY “challenging” schools: Alaska, Mississippi, North Dakota, and New Hampshire. Sorry, students in those states. It looks like you are out of luck before you even begin your high school career; not a single school in your entire state made the list. That is 791,846 square miles of the United States of America completely devoid of a “challenging” high school.
The second troubling idea is that no factor outside of test volume is allowed to measure challenge -- that the sheer number of tests administered can stand in for the rigor a school offers its students. Because the list rests entirely on this one measure, it works more as a proxy for the relative socioeconomic status of these schools. With each AP test costing $93 ($53 for students with financial need), plus the time and resources to administer it, leaving the funding model out of the methodology creates an entirely unbalanced means of measurement.
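To see how quickly the fees behind this metric add up, here is a back-of-the-envelope sketch using the per-test prices cited above; the test counts are hypothetical:

```python
# Per-test AP fees from the article: $93 standard, $53 with financial need.
STANDARD_FEE = 93
REDUCED_FEE = 53

def exam_fee_total(standard_takers: int, reduced_takers: int) -> int:
    """Total exam fees a school community must cover, before counting the
    staff time and resources needed to administer the tests."""
    return standard_takers * STANDARD_FEE + reduced_takers * REDUCED_FEE

# A hypothetical school giving 400 standard-fee and 50 reduced-fee tests:
print(exam_fee_total(400, 50))  # 39850
```

A community that cannot cover tens of thousands of dollars in fees each year will sit lower on the list no matter how rigorous its classrooms are.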
Funding points to other troubling aspects of this list. The rankings assume that a whole host of factors simply do not matter: geography, school location, charter versus public status, a community’s relationship with testing, and countless others are just not included. Also missing is the fact that there are more than 26,000 high schools in the United States, and this list includes only 2,369. Where are the rest?
Given our cultural obsession with rankings, it is difficult to avoid the creation of lists like those found in U.S. News and The Washington Post. Regardless of the trend, these lists do their consumers far more harm than good. Whether colleges or high schools are being ranked, the limited methodologies ignore fundamental differences among schools. What the consumer is left with is a subjective snapshot that only increases their anxiety about whether they attend the best and most rigorous school in America.