So Your Hospital's Ranked #1? Not So Fast

You've seen them. The hospital rankings blasted across the covers of magazines (and probably, if your hospital made the list, posted in your doctor's office, too).

But how real are they, really? Do they just reflect good PR? And instead of touting their "reputations," shouldn't hospitals be prouder of reducing their number of hospital-acquired infections (HAIs)?

That's what a pair of Wall Street Journal op-ed writers want to know.

The always eagerly awaited U.S. News & World Report list of best-ranked hospitals for 2012-2013 was released last week.

Ezekiel J. Emanuel, chairman of the Department of Medical Ethics and Health Policy at the University of Pennsylvania, and Andrew Steinmetz, a senior research assistant there, write that the highest-ranked hospitals "are always quick to tout their rankings in hopes of attracting new patients who will pay top dollar." And, of course, many Americans will use this list when selecting a hospital.

But the authors argue that "these rankings are not all they're cracked up to be."

They point out that the methodology used by U.S. News & World Report is "flawed to the point of being nearly useless," because the criteria it uses can "encourage investments in higher-cost and lower-quality care."

Emanuel and Steinmetz claim the criteria "are unrelated to quality, easily manipulated, and incentivize the wrong choices and behaviors."

The rankings are built from surveys sent to randomly selected specialists, with responses averaged over the past three years. But the sample is too small and not "generalizable," they report. Of the roughly 200 physicians per specialty who receive the survey, only about 30% respond. That leaves just 55 to 75 doctors per specialty answering which hospitals they consider the best in their field for complex or difficult cases.

And most physicians have little direct knowledge of the quality of care delivered at neighboring hospitals, much less at ones clear across the country, Emanuel and Steinmetz say. So the reputations physicians report tend to be based on rankings, like the U.S. News ones themselves, the writers state. It's a vicious cycle.

Another point of contention is that "the main focus is on the degree to which hospitals use certain 'cutting-edge' technologies." The problem, the writers allege, is that many of these technologies are not linked to higher quality; they cite hospitals that use robotic surgery for prostatectomies. Sure, it's cutting edge. But so far there's no proof that these surgeries, far more expensive than the open procedures they replace, produce any better outcomes.

Other criteria, such as patient safety and mortality, also go into the rankings, but Emanuel and Steinmetz point out that what's being measured is not a reliable basis for comparison.

The writers say that with response rates and sample sizes so low, this information would not come close to qualifying "for publication in any reputable medical journal."

So how do you find out which hospitals really are the best? Check which ones are running clinical trials for your illness, research them online, or, as a last resort, ask your doctor.

