For evidence-based treatments, hospitals achieved around 95 percent composite performance on 12.5 million opportunities to perform care related to the measures, the Joint Commission said. In 2002, the hospitals got 82 percent composite performance on 957,000 opportunities.

While I understand the theory (and no doubt patients with heart disease in particular have benefited), if everyone consistently achieves near-perfect benchmarks because that's how they're paid, and resources are expended to make sure that's the case, how do these quality managers justify their continued existence?
For instance, in 2009, almost 98 percent of hospitals provided proper heart attack care, such as giving aspirin at arrival or beta-blockers at discharge, compared with only 89 percent seven years earlier. For surgical care, such as giving antibiotics an hour before surgery, overall performance improved to 96 percent, up from 77 percent in 2004. For pneumonia care, results jumped more than 20 percentage points, from 72 percent in 2002 to 93 percent by 2009.
The answer is simple: make more quality measures. This is, in fact, what has happened every year since this program's adoption in 2005.
The concern, of course, is that at some point more money will be spent on hoop-jumping in the name of quality scores and patient satisfaction surveys than on the actual care delivered, especially in areas not measured.
Then what will we have achieved?
-Wes
2 comments:
It's not the quality that's important, it's the high marks!
I cringe when I read emails at work which say that meeting these benchmarks means that we're delivering excellent care.
Methinks it is a case of missing the forest for the trees.
CardioNP