Monday, June 09, 2008

More Rain on the "Report Card" Parade

It seems hospital "report cards" documenting "quality measures" fail to predict "preventable" deaths after bypass surgery.

Maybe the idea of Medicare reimbursing for "pay-for-performance" measures really misses the point: it should pay for outcomes, not for the performance of mere documentation. It's like asking students to grade their own class performance: funny how almost all of them get A's that way.

For instance, here's the "quality data" of three different hospitals in our area (call them Hospitals A, B, and C):

Percent of Surgery Patients Who Received Preventative Antibiotic(s) One Hour Before Incision: Hospital A, 92% of 885 patients; Hospital B, 95% of 435 patients; Hospital C, 94% of 357 patients
Percent of Surgery Patients Who Received the Appropriate Preventative Antibiotic(s) for Their Surgery: Hospital A, 98% of 909 patients; Hospital B, 97% of 448 patients; Hospital C, 97% of 362 patients
Percent of Surgery Patients Whose Preventative Antibiotic(s) Are Stopped Within 24 Hours After Surgery: Hospital A, 90% of 856 patients; Hospital B, 90% of 417 patients; Hospital C, 87% of 339 patients
Percent of Surgery Patients Whose Doctors Ordered Treatments to Prevent Blood Clots (Venous Thromboembolism) for Certain Types of Surgeries: Hospital A, 92% of 182 patients; Hospital B, 92% of 150 patients; Hospital C, 95% of 178 patients
Percent of Surgery Patients Who Received Treatment to Prevent Blood Clots Within 24 Hours Before or After Selected Surgeries: Hospital A, 92% of 182 patients; Hospital B, 92% of 150 patients; Hospital C, 88% of 178 patients


Now, which of these three hospitals will give you the lowest bypass mortality?

Stumped? Gosh, how can you be? I mean the data are so, well, CLEAR!

Just think: how many chart reviewers were required to review these charts and gather these data? How many hours? How much money do we spend annually to ensure these "quality" data are posted to make the governmental "grade"? Most importantly, with hospital reimbursements tied to such worthless performance measures, how are our patient "consumers" ever going to use these data when the variance between centers is so slight and skewed so consistently toward perfection?
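
Just for fun, here's a back-of-the-envelope look at the first row of that table. This is a rough sketch of my own, not anything CMS publishes: the hospital labels are arbitrary, and I'm using a simple normal-approximation confidence interval on each reported percentage. Run it and you'll see the three intervals overlap, which is exactly the point.

import math

# Reported antibiotic-timing scores from the table above:
# (percent compliant, number of patients). The "Hospital A/B/C" labels
# are illustrative; the post does not name the hospitals.
hospitals = {
    "Hospital A": (0.92, 885),
    "Hospital B": (0.95, 435),
    "Hospital C": (0.94, 357),
}

# Rough 95% confidence interval for each reported proportion, using the
# normal approximation: p +/- 1.96 * sqrt(p * (1 - p) / n).
for name, (p, n) in hospitals.items():
    half_width = 1.96 * math.sqrt(p * (1 - p) / n)
    low, high = p - half_width, p + half_width
    print(f"{name}: {p:.0%} (95% CI roughly {low:.1%} to {high:.1%})")
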

You get what you pay for, alright. Pay for "good" data and you'll get "good" data. After all, it's the "Skinner Box" effect: if hospitals push the right levers, they get their "conditional reward" of slightly higher Medicare reimbursements.

Even if these measures don't mean dog-doo-doo.

-Wes

2 comments:

Ian Furst said...

Hey Wes, my comments were too long for here, so I posted them at my site. I know the author of that study, and he does a lot of work on the reasons for error (random and otherwise) in scorecards. It's no wonder that they really are not comparable. Ian

Rogue Medic said...

Perhaps the people who come up with this stuff never really understood how to use statistics. Not having competence in the use of statistics is not going to stop them.