This is an effort by the United States Department of Health and Human Services (HHS). They themselves state that it is fledgling, but it is a start toward solving the lack of accountability in healthcare, the inability of patients and others to make intelligent purchasing decisions in the information vacuum surrounding healthcare, and the way the current system rewards mediocre and poor performance. We are 100% in favor of their doing this, but...
Hospital Compare is based solely on what is reported by people who rarely report. According to Wald and Shojania, only 1.5% of patient harm problems get reported. According to a 2010 study by HHS itself, the very people running Hospital Compare, only about 2% of adverse events get reported accurately. 93% don't get reported at all. To see other studies and journal articles reaching similar conclusions, see Medical Reporting. If you talk to enough injured patients, you can start to wonder if 2% might be an overestimate.
No Success Rates
Look at all the studies you want; the consensus of studies from parties like HHS is that few adverse events get into the records that are the basis for efforts like Hospital Compare. 93% of the information patients most need to know about a hospital cannot be learned from Hospital Compare. The success rate is the most important thing to know when making a cost-benefit analysis. The success rate cannot be known without accurate information about adverse events. And accurate information about that does not come from the caregiving community.
Unfortunately, journalists, like the caregiving community itself, get their information from the caregiving community, and, so far, appear to be oblivious to the overwhelmingly strong bias that community brings to what gets in the record.
Take Dr. Shannon Phillips (links to another site), a quality and patient safety officer at The Cleveland Clinic, who says that it only appears as though they have a high rate of accidental tears and lacerations and serious blood clots because at their institution "people are careful at documenting, almost to a fault, things that are incidental to the case." She appears to think that what she sees is all there is to see. And she is a quality and patient safety officer.
I have been to the Cleveland Clinic. I have experienced firsthand caregivers there putting lies in the record in order to cover up iatrogenic injuries. She cannot know about that. It is not in the record. Hospital Compare cannot know about it either. Even without that personal experience, "careful at documenting, almost to a fault" is only the view of people oblivious to the biases of the caregiving community. It is a view that flies in the face of the peer-reviewed studies of the issue. And it flies in the face of the experience of injured patients other than me. But try getting anyone in medicine to consider the idea that their personal vision might not be both objective and all-inclusive.
And yes, I did complain about the record. I had to go out-of-state to get the injuries diagnosed. With those diagnoses in hand, I complained to the in-state medical board. They contacted the caregivers I complained about and were told that the caregivers are right and I am wrong. And that was the end of that. Which is the normal experience of injured patients filing complaints with state medical boards. Which is why we need state patients boards in order for patients to be able to appeal to someone who actually is on their side. But as it is now, all the information available for efforts like Hospital Compare comes from caregivers and their advocates. There isn't anyone on the side of patients collecting their information.
In addition to the problem of the lack of objective information is the problem of politics. Since Hospital Compare is governmental, political pressure will corrupt the information one way or the other. This is unavoidable in concerns that are governmental. It happens even outside of government. As an example, the list of "100 Top Hospitals" produced by Solucient every year is supposed to be based on empirical Medicare and state medical data, which they describe as 800 data elements for over 6,000 hospitals. Even if that data were not subject to the problems described above, what happens to it after it is collected is a problem as well. One of the people on the committee that makes the final decisions confided that they often are surprised, when the list is published, to see on it hospitals that had not even been on the final list under consideration.
Some hospital CEOs have pay packages and bonuses tied to getting on that list. Imagine how hard they work to influence it. How did they get themselves included as one of the top 100 when they were not even on the final list under consideration?
One Third of Patients Off Screen
Another large problem is that Hospital Compare is not allowed to collect any information on roughly a third of the patients in hospitals. If there are 10 or fewer patients in a Medicare DRG (diagnosis-related group), that data cannot be made public. 30% of hospital discharges fall into that category. So awareness and incentives generated by Hospital Compare will apply only to the more common illnesses and problems.
People in medicine can manage the less common DRGs in ways that are perilous to patients and/or exploitive without it showing up in the results. The caregiving community has so much faith in its own selflessness that they will not even acknowledge the possibility of this, let alone protect patients from it.
Examining the records of the majority of patients could accomplish a lot of good if those records were not corrupt. But even if they were not, operators in bureaucracies looking for safe and lucrative careers are experts at identifying blind spots: niches where profit is high and oversight is low. Predators thrive in such corners. This is something no one in the patient safety movement will acknowledge: people doing unnecessary work to make money (and things more sinister than that). One of the greatest failings of initiatives to fix health care is the extent to which the professionals in it are in denial about unfriendly practices, a denial made possible in part by how few adverse events get in the record.
Data Prefiltered by Vested Interests
The only information to which Hospital Compare has access already has been filtered by the people on the front line, whose agendas are in conflict with the well-being of patients. Patients need to balance that by reporting the information that the front-line workers won't. According to a study in the Annals of Internal Medicine, patients report more data, and more accurate data, than healthcare workers when given the opportunity. Patients are the only possible source for most of what other patients need to know to make intelligent purchasing decisions in medicine, but none of that becomes part of what hospital rating sites compare.
Most of the time patients don't understand what they know and cannot report it without a knowledgeable person to decipher the experiences they report. Instead, commonly, patients are persuaded by their caregivers that everything was done perfectly and the poor outcome is only the result of the fact that once in a great while even perfect treaters and treatments cannot prevent underlying problems from surfacing. Patients usually believe that and don't even know that there is anything to report. They need a knowledgeable person without a conflict of interest to make them wiser, which would be one of the major goals of Community Patient Agencies if we had those.
Even when patients do understand what happened to them and want to shout from the rooftops about it in order to protect other patients, Hospital Compare doesn't know about it.
We're glad for the HHS initiative's effort to do something about the information vacuum surrounding healthcare, but what they are doing amounts to very little.
Other Attempts at Evaluation
Some others rating healthcare are HealthGrades, RateMDs, The Leapfrog Group, Consumer Reports, US News, and Revolution Health Group. The health insurer WellPoint is working with Zagat Survey -- famous for its restaurant guidebooks -- to allow its policyholders to evaluate their physicians using the same methods used to rate eateries and travel destinations (at least someone is listening to patients, even if no one is providing the patients with the means to understand what is worth reporting). And Angie's List has gotten in on the act too, but providing sites at which patients can report mostly results in information about how long the wait was in the waiting room and the like.
It's nice to see how many different groups are interested in helping patients report to each other the information that medicine will not report to anyone, but currently the information is not being collected in a way that can capture anything of much value. The patients with the most important information to report either don't understand it or cannot tell anyone about it without getting sued.
And What About the Rest of Medicine?
Hospital Compare is collecting information only on hospitals. Most care does not take place in hospitals. In the state of Ohio alone, Medicare-certified home health agencies provided over ten million home visits in 1996 to patients of all ages. Initial estimates indicate there are at least as many non-certified home health agencies in operation in Ohio as there are Medicare-certified agencies. Since the state does not license home health agencies, we don't know how many of them there are, let alone anything about the services they provide.
No One Knows
That is the state of most of the information about success rates in medicine, whether in or out of hospitals. No one knows. A surgeon can disable you and write in the post-op report that the operation was perfect and without complications. The caregivers around that patient persuade the patient of the same. No one looks further. Most patients with iatrogenic injuries do not even understand that there is anything to complain about, let alone file complaints. When they do, that does not correct the lies put in the record that is used to rate hospitals.
At present, patients still can find out more useful information about hair dryers they might buy than about the surgeons who are going to cut them open or the facilities in which that will be done.
There is a site trying to rate physicians, called Physician Compare. An article about it (at this link, on another site) by Michael L. Millenson, President of Health Quality Advisors LLC and one of the best writers on the subject of patient safety, says things about Physician Compare that are similar to what I have said about Hospital Compare.