College evaluation systems lack credibility

Every August, thousands of readers — including college applicants, parents, college admissions staff and even college presidents and boards of directors — anticipate the release of the U.S. News & World Report’s “America’s Best Colleges,” the premier ranking guide that surveys 1,400 colleges and universities nationwide in an effort to classify and order schools according to statistical data and name recognition.
Prospective freshmen and their families eagerly wait to discover which institutions are the “best” while the institutions hope their rankings will catch the attention of these restive eyes.
Linfield College, however, doesn’t even seem to hold its breath.
“We don’t wait until August to see how we did in the rankings,” President of College Relations Bruce Wyatt said. “When we are asked just how good Linfield is … we think we got a better feel of that.”
Since the rankings’ inception in 1983, U.S. News & World Report has drawn both praise and fire for its use of peer assessments, name recognition, financial data and applicant profiles to create a pecking order among American institutions of higher education.
Many colleges use the rankings as an outlet to provide abbreviated information about their own unique attributes and to recruit students.
The goal is to make themselves known to potential buyers in an otherwise crowded marketplace. However, the rankings can have a superficial quality, as in the case of a Clemson University professor who admitted that the university distorted numbers and data to improve its rankings.
“It’s a beauty pageant,” Linfield College President Thomas Hellie said. “I have heard of an East Coast college calling a West Coast college and saying, ‘Hey, we are not even competitors, but if you rank me higher than my peers in my region, I will do the same for you.’ There are even college boards of directors who ask their presidents to work on increasing the college’s rankings.”
In recent years, a growing movement among colleges and universities has refused to cooperate with U.S. News & World Report’s ranking survey.
In May 2007, the Annapolis Group, a national organization of liberal arts colleges, published an article on its website that included statements from college presidents speaking out against college rankings.
Shortly after the article’s publication, the majority of the group voted against participating in the reputational part of the survey, which accounts for 25 percent of the rank.
As the current vice president of enrollment, Dan Preston is one of thousands of college administrators who receive the peer assessment survey in the mail and are asked to rank other schools.
“I rank one school [Linfield] and leave the rest as ‘I don’t know enough information,’” he said.
Preston has served at Linfield College since 1983, in both the admissions office and in his current position, and has observed the effects of the rankings on Linfield College.
“Rankings just don’t have a direct correlation,” he said. “When we were the No. 1 comprehensive college, we had a couple of years with lower numbers [of students enrolling] and had a couple of years of highest numbers ever. Last year, we were ranked No. 122, yet we have the highest enrollment ever.”
Preston said Linfield relies on what is real and authentic.
“Students are coming here, investing in their education and graduating at high rates — that is what is more important. Our graduation rate is higher than our predicted rate,” Preston said.
For some students, rankings did not have a significant role in their college decision.
“No [I didn’t use college rankings], I think most people already have an idea of what they want,” freshman John Portin said.
Freshman Walker Allen said he went by word-of-mouth when he chose Linfield.
“I did know [Linfield] was a nationally ranked school, but I didn’t look it up online,” sophomore Kate McMullan said.
Wyatt credited students with the ability to measure the true value of a college and ignore the brand name that may be attached to an institution.
“Linfield’s constituency is less ‘status-conscious,’” he said. “They judge us based on quality and by what they get — they are less concerned by how some magazine quantifies us. Alumni are more appreciative of their professors and of the friends they made — they are not into brand recognition.”
The incongruity between rankings and the complexity of a college community calls into question the simple answers these rankings seek to provide.
“Overall, I think [college rankings have] harmed the admissions process — the task of selecting a college requires a more nuanced and deeper look than what rankings provide,” Hellie said.
He said he is committed to not compromising Linfield’s integrity by manipulating its ranking and, instead, dismisses the rankings as worthless.
“The very fact that different magazines and organizations use different ways to rank colleges shows how foolish it is to rank colleges,” he said. “Colleges cannot be ranked.”
Rather, he said, it is about finding the right fit.
“One time, I went to a store to buy a suit, and I wanted to buy this name-brand suit,” Hellie said. “But the people in the store said, ‘No, you shouldn’t buy that suit because the shoulders are too narrow. You should buy this suit.’ I did buy that suit — selecting a college is kind of like that.”
While rankings can be useful for sifting through the mountain of information regarding colleges, at the end of the day, it’s about looking at the data beyond the numbers, Preston said.
“The data [rankings] collect on schools is accurate, the calculation formula they choose is generally objective and the formulas have sound calculable mathematical principle to them,” he said. “But is that a really good way to figure out where you want to go to school?”

Freelancer Joshua Crisp can be reached at
