Saturday’s Telegraph reported that US News & World Report recently released its 2016-17 Best Hospitals rankings and listed The Medical Center, Navicent Health, in Macon as tied for third best in Georgia. Yet according to another recent Telegraph article, the Centers for Medicare & Medicaid Services (CMS) recently gave the Medical Center a dismal two-star rating out of five.
What gives? Both of these rating agencies seek to guide patient choice, yet their opinions differ wildly.
I’ve been a Macon neurologist in private practice for 36 years, and throughout this time, I’ve been an on-call, nights and weekends neurological consultant to The Medical Center. This full-time private practice, part-time hospitalist perch gives me an excellent perspective. I see and hear of patient outcomes — both successes and failures — for hospitals throughout Georgia and particularly Middle Georgia. And I can say without equivocation that US News is right. The Medical Center is an excellent hospital. How then could CMS get it so wrong?
First of all, CMS is a ratings newbie, and its ratings system is clearly a work in progress that will be significantly modified and improved in coming years. As any teacher knows, fairly evaluating student performance requires uniform testing and circumstances. Generally, the problems facing one student must be those facing another. But that’s not how hospitals work. Some disproportionately serve the sickest patients; others far less so, if at all. And treatment for sicker patients is far more complex. That means the quality of treatment is far more difficult to judge with raw statistics and algorithms.
To fairly evaluate hospital performance, CMS and other raters must appropriately adjust for what doctors would call “patient severity.” This is very difficult to do. Done poorly, one likely result is that hospitals with the sickest patients receive the lowest ratings. That’s evidently what happened with this year’s CMS ratings. Consistently, throughout Georgia and other states, hospitals with the sickest patients received the lowest CMS ratings.
Here’s why I think that bias occurred. CMS’ computerized, big data, algorithmic rating system was too simplistic to fairly judge the quality of care in hospitals with the sickest patients. It excessively relied upon compliance with routine order sets directed by physician specialists typically after the specialist has done an examination or a procedure.
Take a “hospital” that just does knee replacements. Following surgery, the physician order set would be pretty standard and routinely followed verbatim. So CMS typically gave these hospitals higher ratings because its computers saw that, with few exceptions, the hospital followed the knee specialist’s order set without fail.
The Medical Center does not specialize in knee replacements. A disproportionate number of its patients have “co-morbidities,” lots of things that can make you sicker or kill you. Quite routinely, the order set proposed by one specialist will, in some specifics, actually be inappropriate for the overall care of a patient with co-morbidities. So often after hours of discussion and thought by a panel of sub-specialists, part of a particular sub-specialist’s “order set” is deemed inappropriate under the particular patient’s circumstances and isn’t administered. I think CMS’ computer just couldn’t figure this out. It called strikes when it should have seen base hits.
CMS also looked at death rates from stroke and heart attacks. My expertise is strokes, but I strongly suspect CMS treated heart attacks quite similarly. Simple, uncomplicated stroke and heart attack patients are kept in local hospitals, where they rarely die. The truly sick ones, on the other hand, are quickly transferred to hospitals like the Medical Center. That’s because physicians throughout Georgia know really sick stroke and heart attack victims are far more likely to survive and flourish at level one trauma centers. That’s true, but because they are far sicker, more of them also die. CMS again did not appropriately adjust for patient severity.
CMS also selected bladder catheter and IV line infection rates as a measure of quality, but once again, these complications are far more likely in diabetes patients or immune-compromised patients from HIV or cancer treatment — patients far more likely to be treated by level one trauma centers like The Medical Center.
Finally, a whopping 22 percent of CMS’ overall ranking assessment relied upon patient satisfaction surveys primarily focused upon how quickly patients received their pain meds or bathroom assistance. Yet in my experience, nobody dies or suffers a significant deficit by missing a pain shot or waiting to urinate.
I take no issue with the concept that hospital and physician reimbursement should be tied to quality of care. But the evidence so far shows CMS should not be the judge and jury. If past is prologue, it will downgrade those struggling with the sickest patients, encouraging our best and brightest to treat others.
Dr. Tom Hope is a partner in Neurology Associates and also teaches at Mercer Medical School.