"Flirting With Disaster"

Friday, 26 February 2010 08:29 By Leslie Thatcher, Truthout


On the occasion of the paperback release of Marc Gerstein's "Flirting with Disaster," reviewed here last year, Truthout's Leslie Thatcher corresponded and spoke with Dr. Gerstein in mid-February about his book and about our society's continuing penchant for producing avoidable disasters ...

Leslie Thatcher for Truthout: Since your book was published in the spring of 2008, a lot has happened, a great deal of it unpleasant. Your worries about subprime seem to have come to pass, we've had the Madoff scandal, millions of Toyota cars have been recalled, commuter airlines appear to be dangerous to fly, and so on. Are things falling apart, or am I now excessively sensitized to this type of news?

Marc Gerstein, author of "Flirting With Disaster:" I tend to agree with you that a lot seems to be going wrong, and many of the problems you mention speak to my underlying motivation for writing the book in the first place.

In the "good old days," whenever that might have been, most problems were localized to specific countries, certain industries, or even to single products in the case of consumer goods. As our world has become increasingly interdependent, what happens in one place affects what happens in others to an unprecedented degree. The global economic crisis touched off by irresponsible subprime mortgage lending in the U.S. spread around the world with remarkable speed, and we're far from out of the woods.

Toyota provides another important illustration of my concern, which comes in two parts. First, the inclusion of the same technology - both hardware and software - in so many of Toyota's vehicles has meant that a flaw is more far-reaching in its consequences than it might otherwise have been. Secondly, it now appears that the National Highway Traffic Safety Administration, the federal agency charged with regulating the automakers, seems to have been asleep at the switch, and perhaps too friendly with those they regulate.

Compromised regulators have become a major problem, one that appears to be getting worse, as cases in food production, air safety, pharmaceuticals, patient safety, and finance all illustrate.

Can you be more specific?

In "Flirting With Disaster," I discuss the Vioxx story at some length. The FDA has been deeply transformed by the so-called "user fees" that are paid by the pharmaceutical industry and now play a powerful role in the FDA's budget and the culture of the organization.

Another example is the credit-rating agencies, which are private companies but play a role very much like that of regulators in the bond and derivative markets. Like the FDA, the credit-rating agencies have evolved from protectors of the investing public into allies of the industry. They appear to have lost their independence.

One of the most potent forces for compromising government regulators is the lucrative private-sector employment available to former employees of these agencies. Since regulation often involves demands that run counter to the desires of those being regulated, cooperative and friendly regulators are a lot more valuable to industry than those who are strict and demanding. If individuals who cooperate with corporations are rewarded with good job opportunities in industry, it's easy to see that the conditions for a conflict of interest exist, even where no ethical rules are broken. Every regulator has to decide how to focus its efforts, and those choices about what to pursue and what to let go are critical ones, highly susceptible to influence, even unconsciously.

Beyond these conflicts of interest directly associated with the regulators themselves, there are political influences that apply pressure. As we all know, industry lobbyists are a powerful force in Washington, as are political campaign contributions. Taken together, these forces add up to a regulatory regime that has grown very cozy with those it regulates. To make matters worse, many regulatory agencies have woefully inadequate resources to conduct inspections and investigations, and to enforce the rules that do exist with sanctions strong enough to discourage others. Sanctions, in particular, are very prone to influence.

For example, in many accounting scandals no criminal charges are sought. This means that corporations may put bylaws in place that indemnify their officers against their legal fees and against the loss of bonuses received on the basis of bogus profits. In the case of the Xerox accounting fraud of the late 1990s, according to The New York Times, the company repaid Board Chairman Paul Allaire's disgorgement of the $5.2 million he received from selling shares of Xerox during the fraud, the $500,000 in bonuses he received for meeting profit goals the SEC determined were met because of the fraud's earnings impact, and $1.9 million of interest payments on these amounts.

While such conflicts would be bad enough on their own, there are many more "systemic vulnerabilities," as I described earlier, capable of turning what once might have been a local mistake into a global one. Toyota's recall covers more than eight million vehicles and is affecting the livelihood of many people - exacerbating the woes of Toyota dealers and the communities they serve that are already suffering because of the recession.

How do these models apply to the case you just documented for Truthout in "Blood Simple," where people involved in medical trials at Columbia University were exposed to conditions associated with harm?

There are two parallels, Leslie. First is the movement of staff from the regulator to the private sector. George Gasparis, the assistant vice president and senior assistant dean for research ethics at Columbia University Medical Center and the executive director of the Human Subjects Protection Program for both Columbia University and Columbia University Medical Center, was, prior to his move to Columbia in June 2003, the director of the Division of Assurances and Quality Improvement at the Office for Human Research Protections (OHRP), the federal Department of Health and Human Services office that regulates Columbia's Institutional Review Board (IRB).

I am in no way making any allegations of inappropriate action on Mr. Gasparis' part. He arrived at Columbia after the study was completed. Rather, my point is simply that there is a movement between sectors that might potentially encourage regulatory personnel to behave in a manner that facilitates their movement into the private sector but may also compromise their role as regulators.

Another parallel is the influence of commercial interests on safety-related decisions. In the Columbia study known as Protocol #9256, the subject of "Blood Simple," two commercial companies, BioTime, Inc. and Abbott Laboratories, stood to benefit from demonstrating the superiority of a newly approved surgical fluid over its older, better-established competitor. The study's lead investigator had a financial relationship with both companies as well as prior research experience studying the clinical impact of the fluids. Abbott Labs also made an unrestricted grant that subsidized the study.

Under current rules, there is nothing specifically unethical about any of these arrangements. In fact, much progress in pharmaceuticals, methods, and devices arises from such relationships. In this case, however, in order to definitively establish the superiority of the new fluid the investigators had to confront the problem that differences in the fluids were only likely to emerge at high dosages, dosages that were generally considered unsafe because of a long history of bleeding complications. Columbia, as well as many other hospitals, had put in place routine safe practices to avoid high dosages by limiting the amount of fluid a patient could be given over a certain time period.

Nevertheless, the study altered this practice by exposing patients undergoing bypass or valve replacement surgery to unlimited levels of one of four randomly assigned study fluids, including the two just mentioned. The risks associated with this change in procedure were omitted from the description of the study to Columbia's IRB and from the patient consent form. In fact, the study investigators went so far as to request a minimal-risk waiver that would have eliminated the need for formal patient consent.

And some patients were harmed?

Yes, patients exposed to the older product, in particular, bled more, required more transfusions, demonstrated impaired kidney function, and had to return to the OR to address their bleeding more often. Although much of this data was available in 2001, it took until 2009 for patient harm to be acknowledged by Columbia.

What about the OHRP?

That's an interesting but somewhat complex question. In 2003, OHRP closed the case by accepting Columbia's decision to strip the investigator of his research responsibilities and his position on the tenure track, as well as the University's promise to improve its flawed IRB process, a step that included the hiring of Gasparis. On the other hand, when there later appeared to be more to the story than Columbia had revealed, OHRP claimed that it lacked the power to dictate the "quality" of Columbia's investigations into the study, investigations that had resulted in the counterfactual conclusion that patient harm had not occurred. The prior investigations lacked independence and transparency, and did not even involve a review of patient charts until the third investigation in 2007.

However, it was the OHRP that finally ended the long-running dispute about patient harm in June 2009, so in another way they played a positive, decisive role in this case.

Rather than recount this complicated tale in its entirety, perhaps it's best to read my story and the linked Huffington Post article for the details.

Can we do anything about these issues? You're usually an optimist, but I don't sense your usual hopefulness about this issue.

You're right, Leslie, I'm not as optimistic as I once was. One reason comes from a paper called "Dark Secrets" I wrote with an MIT colleague, Ed Schein, a couple of months ago. The title is not just a catchy phrase; it is a sociological term that stems from Erving Goffman's concept of "face work" - the effort each of us puts into showing our good sides and minimizing our faults.

We all do this: on our resumes, in the clothes we wear, in what we say about ourselves when we meet someone new. We also go to some lengths to hide information that is embarrassing or worse, and we expect our family and close friends to keep these secrets.

The essence of the paper is that organizations keep secrets, too, and just like the rest of us they consider revealing such secrets by others to be an act of betrayal. Revealing dark secrets is, at least in the institutional world, experienced as a form of treason. Treason, as we all know, is punishable by death.

I love the term, and the concept certainly fits a lot of what we see. What does it imply?

Well, one of Ed's most important insights was that in order to counter the cultural and psychological forces that prevent revealing dark secrets, it is necessary to provide some perspective and power originating outside of the specific organizational culture.

You can immediately see that the closeness between the regulated and their regulators robs the process of precisely the psychological and cultural distance needed for appropriate governance.

Unfortunately, our primary way of dealing with such problems is to make rules and laws. While they help, it is impossible to legislate a state of mind, and that's exactly what is compromised by too close an alignment between parties that need to be, in some ways, parts of an adversarial system. When it comes to enforcing the rules, public safety is preserved by a certain amount of conflict.

A second implication of our work is that organizations will not naturally or voluntarily protect whistleblowers. The natural thing for them to do is to punish them, and punish them severely. This is not only seen as just, but as a necessary lesson to others of the costs of violating the code of omerta.

That fits with what I see. Such punishment stories are a regular theme in the stories we publish at Truthout. Can anything realistically be done?

What needs to be done is obvious, but it is very difficult. In a recent program on whistleblowers webcast from the Paley Center in New York, David B. Amerine, a Babcock and Wilcox Corporation executive with long experience in the nuclear power industry, described his industry's approach. Of all industries, nuclear power has perhaps more experience preventing accidents than any other. It learned the hard way from Three Mile Island and Chernobyl just how damaging to the business an accident can be. No new nuclear facilities have been built in the US for three decades, and President Obama's recent support is a big opportunity that the industry cannot afford to squander through an accident that could have been prevented.

Amerine's point was a simple one. Babcock and Wilcox has a "zero tolerance" policy about whistleblower retaliation, or "retribution" as he called it. I think that this is an essential part of the solution. Ultimately, however, building a culture in which it is truly safe for people to raise concerns can stem only from consistent leadership behavior that supports truth telling irrespective of the consequences. The difficulty arises, as Dan Ellsberg so aptly put it in the afterword to "Flirting with Disaster," when the leaders are the problem.

Many of the disasters that we have seen either resulted from actions taken at the explicit request of management - the willful habituation of large numbers of people to the dangers of tobacco, the long-term denial of the harms of tetraethyl lead in gasoline, the Peanut Corporation of America salmonella outbreak, Enron, and Madoff - or were natural side effects or direct consequences of policy decisions.

As illustrations of the latter, we can cite the accidents at Chernobyl and Bhopal, Vioxx, Bextra, subprime lending, Abu Ghraib and, not incidentally, Hurricane Katrina's destruction of New Orleans.

When management is either directly instigating the risk-taking or driving the organization with demands that directly encourage imprudent practices or unethical actions, expecting people to speak up is unrealistic - a fantasy, in fact. That it happens at all is one of life's miracles and why - at least in some quarters - people like Dan Ellsberg are held in such high regard.

But most people are quite rightly terrified of speaking out. Many organizations are ruthless in their punishments, and being a whistleblower is rarely a good resume item. Many employers see you as an untrustworthy trouble-maker, a fact that in and of itself conveys just how powerful the cult of dark secrets really is.

Union Square Press/Sterling Publishing company provided Miss Thatcher with a reviewer's copy of the hardcover edition of "Flirting with Disaster."
