White Wall of Silence
Denial Isn't Always Conscious
As early as the 1950s, the psychologist Solomon Asch conducted a series of laboratory experiments in which he put groups of people in rooms and asked them simple questions that any child could answer correctly, such as which of three lines drawn on a piece of paper was the longest. The answer is obvious, unless other people answer it incorrectly first. All of the people in the room were plants except one. The plants would agree on an incorrect answer. After witnessing that agreement, the real subject usually agreed with it too. Things an independent observer would plainly see as wrong were not seen as wrong, because of the suggestion of others.
Three out of four people gave an incorrect answer to a simple question after hearing others give that incorrect answer. Recent research (led by Dr. Gregory Berns, a psychiatrist and neuroscientist at Emory University in Atlanta) using MRIs shows that these conforming subjects are not lying. They actually see solid, physical things differently based on what others have said. They do not believe their own eyes as much as they believe what the group says.
Seeing is believing what the group believes
These studies were done with groups of strangers with little in common. Imagine how much more influence the false answers would have had if uttered by members of the same profession, for instance if all the subjects had been police officers. Imagine how much more influential still it could have been if the subjects were not just members of the same profession, but knew each other and worked with each other every day, like members of a surgical team. This goes beyond merely learning not to report Hodads, learning not to criticize other health care professionals in the record, and similar conscious refusals to make honest records. This is believing oneself to be objective while interpreting the evidence of one's senses in an entirely different way than a third party with nothing at stake would interpret it. Self-interest and group-think have more to do with what health care professionals believe and record when things go wrong than anything else. If they didn't, the rate at which they accurately report problems would be higher than 2%.
Could there be a less sinister explanation for how Charles Cullen, Michael Swango, James Burt, and many similar examples were protected by their fellow health care workers, by the hospitals in which they worked, and even by their state medical boards (see OSMB)? More sinister explanations are covered elsewhere on this site because sometimes they are the more accurate ones. But for another less sinister one, about people focused more on their own specific parts in the process, and on getting those right, than on the effect they are having on patients, see psychology, a page taught to pharmacology students by at least one professor trying to do something about the problem (at last).
Even the terminology that is acceptable to use about problems in medicine is dedicated more to advocacy than to an accurate understanding of what is wrong, like the word "error" used to describe everything that harms patients. This is a facet of the culture in medicine that starts to feel sinister at a very basic level. More about it is at this link: nequamitis.
2% rate of accurate reporting
According to the United States Department of Health and Human Services, and others, when things go wrong in medicine they are not reported anywhere at any level 93% of the time. When they are reported, they are reported inaccurately two-thirds of the time. In other words, only 7% of problems get reported, and only a third of those accurately, which works out to negative information being reported accurately about 2% of the time (footnoted at Medical Reporting).
Try to find anyone in medicine who believes that. How are they going to protect you from the problems, or fix them, when they don't even see them?
What they see is that everything is fine when it is not.
Self Confidence is Contagious
Dr. E. James Potchen at Michigan State University in East Lansing studied the accuracy with which certified radiologists read x-rays. One of the things he found was that even those who were the worst at it were highly confident in the accuracy of their work. That's how health care professionals see the world.
Asking health care professionals to make medicine more transparent is ignorant. Not only do they not see it when their colleagues are the problem, they especially don't see it when they themselves are the problem. So how would they recognize the wall they have built between patients and the information patients must have to make safe choices, when they don't see it themselves?
The Great Sidewalk of Medicine
Something I hear repeatedly from people in medicine is that the rest of the world could not "understand" the tough calls, the gray areas, the complexity of medical knowledge. As though a patient would need a medical degree to know if she were being raped. And would be incapable of recognizing when no one reported it. And would have no way of learning that it was because the injuries were iatrogenic that no one would diagnose them (see blacklisting).
When a self-interested behavior is the norm accepted by all of your colleagues, perspective has little chance. Walls appear to be merely the sidewalks you walk on, not the walls you have created for someone else.
And they don't know that about themselves.
Community Patient Agencies
That's why mechanisms like Community Patient Agencies need to be established to gather information about medicine without the interference of the health care community. Otherwise we will never learn their success rates and failure rates. It is not possible to give informed consent without that. It is not possible to make safe choices without knowledge of failure rates. Health care professionals never have had, and never will have, the objectivity to collect the information necessary for patients to make safe decisions. We should stop complaining when they don't. We should recognize that they aren't saints, and erect solutions that enable patients to protect themselves.

More about Silence->