
20120202

Lessons from Hemophilia

Recently I had the pleasure of spending some time with Corey Dubin, thinker, activist, and president of the Committee of Ten Thousand. Corey is a really interesting person (this article gives a decent overview of his past activities): a child of the 60s, owner of amazing Zardoz hair (click at your own risk), and, finally, a member of the "Triple H club" (hemophiliac, HIV+, and hepatitis+), which he has been for many, many years. What happened to Corey Dubin was not an accident of fate, genetics, or public policy. Rather, it was the direct consequence of decisions made about the American blood supply, and his experience has important lessons to teach us about what counts as an acceptable risk in a highly connected world.


First, a little context. Not all that long ago, hemophilia was an invariably fatal disease. Internal bleeding caused extremely painful swelling, blood corroded the bones and damaged the organs, and it was rare for somebody with the condition to live beyond their teens. The most famous historical hemophiliac was Prince Alexei Nikolaevich Romanov, whose condition played a minor but significant role in the Russian Revolution, as it allowed Rasputin to rise in the court and alienated the Tsar from his most natural supporters in the aristocracy.

The 1960s saw the first effective treatments for hemophilia, with the discovery of cryoprecipitate and then the concentrated blood-clotting proteins Factor VIII and Factor IX. With these treatments, hemophiliacs were able to lead normal lives. Science and medicine had triumphed in reducing hemophilia from a fatal disease to a chronic condition. Of course, this led to a whole new industry supplying blood products. Plasma was collected from paid donors, mixed into very large batches containing plasma from over 30,000 donors, processed into Factor, and then sold to doctors and patients.

This system worked fine until the early 80s, when a virulent new disease emerged on the marginal fringes of society. Homosexuals, IV drug users, and hemophiliacs were dying of strange lesions and secondary infections. The Centers for Disease Control soon realized that it was a blood-borne disease, but lacked the political clout to make the Food and Drug Administration and the pharmaceutical companies act. The FDA vacillated, refusing to take Factor off the market for several years, and knowingly allowed contaminated blood products to be shipped overseas. The end result was that an entire generation of hemophiliacs was infected with a fatal disease.

The point here is not that regulators made terrible, and in some cases unethical, decisions in the midst of the AIDS crisis, although they did (and if you find this interesting, I highly recommend the documentary Bad Blood). The point is that the blood system was set up to fail.

The blood supply was contaminated from the beginning with hepatitis. Everybody involved knew it, but they believed that hepatitis was a fair trade for a cure for hemophilia. Perhaps they were right, but through a combination of greed, arrogance, and laziness, the authorities ignored techniques that could have purified the blood supply, things as simple as running plasma through columns of detergent. Similarly, mixing donor samples into large batches increased profits, but it also increased the transmission rate by orders of magnitude. A single bad donor could infect thousands of people.
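To get a feel for how much pooling amplifies the risk, here is a rough back-of-the-envelope sketch in Python. The prevalence figure is an assumption for illustration, not a historical estimate:

# Rough illustration of why pooling plasma multiplies risk.
# ASSUMPTION: the prevalence figure is invented for illustration, not historical.
prevalence = 1 / 10_000   # fraction of donations carrying an undetected infection
pool_size = 30_000        # donors mixed into a single batch of Factor

p_single_clean = 1 - prevalence
p_pool_clean = p_single_clean ** pool_size

print(f"Chance one donation is contaminated: {prevalence:.2%}")
print(f"Chance a {pool_size:,}-donor pool is contaminated: {1 - p_pool_clean:.1%}")

Even with a one-in-ten-thousand prevalence, roughly 95% of 30,000-donor batches contain at least one contaminated donation, and every patient who uses one of those batches is exposed.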

We have to be very careful about what counts as an "acceptable risk." New technologies present novel risks and often lack adequate safety mechanisms. Risk is part of the process of innovation, but technologies that do not become safer over time deserve a critical re-evaluation. The other lesson is that we are all connected. Hemophiliacs are intimately connected to thousands of strangers through the blood supply, but to a lesser extent, we all face the same problems of trust and reliability. The food supply is highly commoditized, which means that food poisoning affects the entire nation, accounting for an estimated 48 million illnesses, 128,000 hospitalizations, and 3,000 deaths each year. As we become more dependent on internet-enabled and 'cloud' services, we become more vulnerable to hackers. The stability of countries on the other side of the world can shake the US economy, as repeated oil price shocks have shown. And pollution does not respect national boundaries; we all breathe the same air and drink the same water.

There is no cure for risk. Regulation is an inherently difficult task: the barrier of specialized expertise and the lure of industry money can eventually lull even the most dedicated watchdog agency into passivity. Independent citizens' groups and hard-hitting journalism are the only long-term antidotes to regulatory capture, and they require continual social investment and support. When industry or the experts say that "this is too complex" or that "this will be too expensive," we should demand clearer explanations and sensible alternatives. To do otherwise is to invite disaster. Maybe not today, maybe not tomorrow, but eventually.

Even if the blood supply had been safe in the 1980s, some hemophiliacs would have been exposed to AIDS and some would have died, but the scale of the human tragedy would have been far smaller. To this day, the Centers for Disease Control uses hemophiliacs as the 'canary in the coal mine' for signs of contamination in the national blood supply. But the story of hemophiliacs and the blood supply also serves as a lesson about techno-social systems and 'normal accidents', and how they can be prevented. Good system design and careful monitoring save lives.


20110404

Risky Business

Twelve deep thinkers over at The Edge have a series on risk after the Fukushima disaster. I won't try to reproduce the complexity and subtlety of their arguments, but risk and risk management are at the heart of what the Prevail Project is about. How can we think about risk in a domain of technological uncertainty? What does risk actually mean?

Risk is a modern concept, compared with the ancient and universal idea of danger. Dangers are immediate and apparent: a fire, a cougar, angering the spirits. Risk is danger that has been tamed by statistics: this heater has a 0.001% chance of igniting over the course of its lifespan, there are cougars in the woods, and so on. Risk owes its origins to the insurance industry and to Lloyd's of London, which was founded to protect merchant-bankers against the dangers of sea travel. While any individual ship might sink, on average most ships would complete their voyages, so investors could band together to prevent a run of bad luck from impoverishing any single member of the group.
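As a toy illustration of that pooling logic (the figures here are invented, not drawn from Lloyd's records), a quick simulation shows how sharing losses across a pool turns a ruinous individual event into a small, predictable cost:

import random

# Toy risk-pooling sketch: a loss that would ruin one ship owner becomes a small,
# predictable cost when spread across the whole pool. All numbers are assumptions.
random.seed(42)
p_loss = 0.05        # assumed chance that any single voyage is lost
n_members = 200      # pool members, each underwriting one voyage of equal value

ships_lost = sum(1 for _ in range(n_members) if random.random() < p_loss)

print(f"Voyages lost this season: {ships_lost} of {n_members}")
print("Unpooled: each unlucky owner loses an entire ship")
print(f"Pooled: every member pays {ships_lost / n_members:.1%} of a ship")

The pool does not make the danger go away; it makes the cost of the danger average out, which is exactly what statistics can and cannot do for us.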

This kind of risk is simple and easy to understand. It is what mathematicians refer to as linear: a change in the inputs, like the season, correlates directly with an outcome, like the number of storms and the number of ships sunk. The problem is that this idea of risk has been expanded to cover complex systems with many interrelated parts. As complexity goes up, comprehensibility goes down, and risks expand in complicated ways. Modern society is "tightly coupled", a concept developed by Charles Perrow in his book Normal Accidents. Parts are linked in non-obvious ways by technology, ecology, culture, and economics, and a failure in a single component can rapidly propagate through the system.

The 2007 financial crisis is a perfect example of a normal accident caused by tight coupling. Financiers realized that while housing prices fluctuate locally, they are usually stable on a national basis, and so developed collateralized debt obligations based on 'slices' of the housing market nation-wide, which were rated as highly secure investments. When the housing bubble collapsed, an event not accounted for in their models, trillions of dollars in investments could no longer be valued with any certainty. Paralysis spread throughout the financial system, leading to a major recession. While this potted history is certainly incomplete, normal accidents are the defining feature of the times. The 2010 Gulf of Mexico oil spill and the Fukushima meltdown were both due to events that were not accounted for in statistical models of risk, but which in hindsight appear inevitable over a long enough timescale.

Statistics and scientific risk assessment are based on history, but the world is changing, and the past is no longer a valid guide to the future. Thousand-year weather events are becoming more and more frequent, while new technologies are reshaping the fundamental infrastructure of society. When the probabilities and the consequences of an accident are entirely unknowable, how can we manage risk?

One option is the precautionary principle, which says that until a product or process is proven entirely safe, it is assumed to be dangerous. The problem with the precautionary principle is that it differs in degree, not in kind: it demands extremely high probabilities of safety, but it doesn't solve the problem of tight coupling. Another solution is basing everything on the worst possible case: what happens if the reactor explodes, or money turns into fairy gold. Systems that can fail in dangerous, expensive ways are inherently unsafe and should be passed over in favor of systems with more local consequences. This solution has twin problems. The first is demarcating realistic from fantastic risks; after all, a Rube Goldberg scenario starting with a misplaced banana peel might lead to the end of the world. The second is that it discounts ordinary, everyday risk. Driving is far more dangerous per passenger-mile than air travel, yet people are much more afraid of plane crashes. A framework based on worst-case scenarios leads to paralysis, because everything might have bad consequences, and prevents us from rationally analyzing risk. The end state of worst-case thinking is being afraid to leave the house because you might get hit by a bus.

So the ancient idea of danger no longer holds, because we can't know what is dangerous anymore, and mere fear of the unknown cannot stand against the impulse to understand and transform through science and technology. Risk has been domesticated in error; a society built on risk continually gambles with its future.

The solution involves decoupling: building cut-outs into complex systems so they can be stopped in an orderly manner when they begin to fail, and decentralizing powerful, large-scale infrastructure. Every object in the world is bound together in the technological and economic network that we call the global economy. We cannot assume that it will function the way it has forever; rather, we should trace objects back to their origins, locate the single points of failure, the places where large numbers of threads come together, and develop alternative paths around those failure points. Normal accidents are a fact of life, but there is no reason why they have to bring down people thousands of miles away from their point of origin.
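For readers who build software, the 'cut-out' idea has a familiar analogue (my own illustration, not Perrow's): the circuit-breaker pattern, in which a component that sees repeated failures downstream stops calling the failing dependency and fails fast, so the trouble stays local instead of cascading. A minimal sketch in Python, with illustrative names and thresholds:

import time

# Minimal circuit-breaker sketch: a 'cut-out' that stops calling a failing
# dependency so the failure does not cascade through the rest of the system.
class CircuitBreaker:
    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures   # failures tolerated before opening
        self.reset_after = reset_after     # seconds to wait before retrying
        self.failures = 0
        self.opened_at = None

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.time() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast instead of cascading")
            self.opened_at = None          # half-open: allow one trial call
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.time()
            raise
        self.failures = 0                  # a success resets the failure count
        return result

Wrapping calls to a remote service in a breaker like this is one small, concrete version of developing alternative paths around failure points: when the breaker opens, the caller can fall back to a cache, a queue, or a degraded mode rather than hanging and dragging its own callers down with it.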