The New England Journal of Medicine paper reporting that azithromycin might cause cardiovascular death is not news to electrophysiologists tasked with choosing antibiotics for patients with long QT syndrome or for those who take other antiarrhythmic drugs. Heck, even the useful Arizona CERT QTDrugs.org website could have told us that.
What was far scarier to me, though, was how the authors of this week's paper reached their estimates of the magnitude of azithromycin's cardiovascular risk.
Welcome to the underworld of Big Data Medicine.
Careful review of the Methods section of this paper reveals that the subjects were "persons enrolled in the Tennessee Medicaid program" and that the data collected were "Computerized Medicaid data, which were linked to death certificates and to a state-wide hospital discharge database" and "Medicaid pharmacy files." The cohort included anyone prescribed azithromycin from 1992 to 2006 who had "not had a diagnosis of drug abuse or resided in a nursing home in the preceding year and had not been hospitalized in the prior 30 days." They also had to be "Medicaid enrollees for at least 365 days and have regular use of medical care."
Hey, no selection bias introduced with those criteria, right? But the authors didn't stop there.
This study used "matched" control periods in which no antibiotics were prescribed, "frequency-matched according to a propensity score that was calculated from 153 covariates." (Editor's note: No doubt there are no covariates in medicine beyond the 153 they studied.)
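(For readers who haven't lived through one of these analyses: here is a minimal, purely illustrative sketch of what propensity-score frequency matching looks like in code. It is emphatically not the authors' code; the dataframe, the covariate names, the decile strata, and the matching rule are all my own assumptions.)

```python
# Illustrative sketch of propensity-score frequency matching (NOT the study's code).
# Assumes a pandas DataFrame `df` with a binary "treated" column (azithromycin
# period vs. control period) and numeric covariate columns -- all hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def frequency_match(df: pd.DataFrame, covariates: list[str],
                    n_strata: int = 10, seed: int = 0) -> pd.DataFrame:
    """Keep treated subjects and sample controls within propensity-score strata."""
    rng = np.random.default_rng(seed)

    # 1. Estimate the propensity score: P(treated | covariates).
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    df = df.assign(ps=model.predict_proba(df[covariates])[:, 1])

    # 2. Cut the score into strata (deciles here) and, within each stratum,
    #    draw roughly as many controls as there are treated subjects.
    df["stratum"] = pd.qcut(df["ps"], q=n_strata, labels=False, duplicates="drop")
    matched = []
    for _, stratum in df.groupby("stratum"):
        treated = stratum[stratum["treated"] == 1]
        controls = stratum[stratum["treated"] == 0]
        k = min(len(treated), len(controls))   # sample only as many controls as exist
        matched.append(pd.concat([
            treated,
            controls.sample(k, random_state=int(rng.integers(1_000_000))),
        ]))
    return pd.concat(matched)
```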
Then, as if to finally admit a smidge of bias in their study design, "to attempt to control for confounding by indication," they also included patients who took three other antibiotics as additional control groups. As if THAT will fix the data fields erroneously entered or neglected in the interlinked retrospective databases.
But why focus on the details? The authors had something to prove!
So they processed and pureed the data and checked "for misspecification of the propensity-score regression models" by evaluating "whether the covariate distributions were balanced across study groups." In other words, they made sure the data worked the way the authors thought it should.
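(The "balance check" they describe usually amounts to computing a standardized difference for each covariate between the treated and control groups and declaring victory when the numbers are small. A sketch of that check, again with hypothetical column names and the conventional 0.1 rule of thumb, which is my assumption rather than anything stated in the paper:)

```python
# Sketch of a covariate-balance check via standardized mean differences.
# Column names and the 0.1 threshold are assumptions, not taken from the paper.
import numpy as np
import pandas as pd

def standardized_differences(df: pd.DataFrame, covariates: list[str]) -> pd.Series:
    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]
    diffs = {}
    for cov in covariates:
        pooled_sd = np.sqrt((treated[cov].var() + control[cov].var()) / 2)
        diffs[cov] = (treated[cov].mean() - control[cov].mean()) / pooled_sd
    return pd.Series(diffs)

# Common convention: |standardized difference| < 0.1 is called "balanced".
# imbalanced = standardized_differences(matched, covariates).abs() > 0.1
```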
Hey, why not?
Finally, they "estimated" (their word, not mine) the difference between the cumulative incidence of cardiovascular death during a 5-day course of azithromycin and the incidence during a similar period of amoxicillin use.
Never mind that they admitted in their discussion that "as many as 25% of patients would be misclassified as having died from cardiovascular causes" and that they "cannot establish a specific causal mechanism."
To think that, despite all of these confounding factors, the authors had the balls to state that, "as compared with amoxicillin, there were 47 additional deaths per 1 million courses of azithromycin therapy; for patients with the highest decile of baseline risk of cardiovascular disease, there were 245 additional cardiovascular deaths per 1 million courses" is ridiculous. Seriously, after all that manipulation of the data, are they really capable of defining a magnitude to three significant digits out of a million of anything?
Please.
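(For what it's worth, an "excess deaths per 1 million courses" figure is just a risk difference rescaled to a round denominator, and its apparent precision fades once you look at the uncertainty around it. Here is a rough sketch with entirely made-up counts, not the study's data:)

```python
# Rough illustration (hypothetical counts, NOT the study's data) of how much
# uncertainty sits behind an "excess deaths per 1 million courses" figure.
import math

def excess_per_million(deaths_a, courses_a, deaths_b, courses_b):
    """Risk difference (drug A minus drug B) scaled to 1,000,000 courses, with an
    approximate 95% CI from the normal approximation to the binomial."""
    p_a, p_b = deaths_a / courses_a, deaths_b / courses_b
    diff = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / courses_a + p_b * (1 - p_b) / courses_b)
    return tuple(round(x * 1_000_000, 1) for x in (diff, diff - 1.96 * se, diff + 1.96 * se))

# Made-up counts: 30 deaths in 350,000 courses of drug A versus
# 40 deaths in 1,400,000 courses of drug B.
print(excess_per_million(30, 350_000, 40, 1_400_000))
# -> roughly (57.1, 25.2, 89.1): the point estimate looks precise; the interval isn't.
```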
But we should not dwell on these details, should we? After all, this work was published in the journal with the largest impact factor out there: the infamous New England Journal of Medicine. No doubt we can look for more high-quality retrospective database reviews in the years ahead as Big Data Medicine takes hold.
-Wes
5 comments:
Thanks, but I'm sticking with my philosophy: drugs only when absolutely needed, and they must not be on the Arizona CERT list. Other than the "You will absolutely die if you do not take this drug" situations, I do not veer from this course.
It won't be long before the trial lawyers start running TV ads fishing for spouses of folks who died of sudden cardiac death while taking azithromycin.
Thanks for this. A question I have about this study is how the authors could get away with 'making up' data for cardiovascular deaths: there were records for 347,795 prescriptions for azithromycin, but the data were converted to deaths per million courses of antibiotic. Who is to say that the next 652,205 people taking azithromycin would have had the same experience of sudden cardiac death as the first 347,795? Maybe that's what statisticians do, but I thought medicine was supposed to be evidence-based. Can someone shed some light on this aspect of the study? Dr Robert
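(Editor's note: the per-million figure is not a forecast for the "next" 652,205 patients; it is the observed rate difference rescaled to a round denominator. A sketch of that arithmetic, with a purely hypothetical excess-death count chosen only to show the conversion:)

```python
# The per-million figure is a scaled rate, not a projection onto new patients.
N = 347_795        # azithromycin courses in the study period (from the paper)
D = 20             # hypothetical excess-death count, for illustration only
rate_per_million = D / N * 1_000_000
print(round(rate_per_million, 1))   # ~57.5 -- the observed rate rescaled, not new data
```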
Was able to refer the pulmonologists to this info today. We were discussing treatment with azithromycin for patients with COPD/chronic bronchitis, and the attending was saying it could cause SCD.
CardioNP
Just a little math to put this in perspective:
If you saw 25 patients a day and prescribed each one of them azithromycin (indiscriminately), you would have to work every single day for 109 years to see the 47 additional cardiac deaths reported in the study.
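(Editor's note: the commenter's arithmetic checks out. Forty-seven excess deaths accrue per 1,000,000 courses, so at 25 prescriptions a day:)

```python
# Verify the back-of-the-envelope figure: 1,000,000 courses at 25 per day.
courses_needed = 1_000_000
per_day = 25
days = courses_needed / per_day      # 40,000 days of prescribing
print(days / 365.25)                 # ~109.5 years, every single day
```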