“Permitting researchers to engineer the genes of one of the most dangerous infections known to man would make it easier to develop new drugs against smallpox, the scientists said. But the man who led the successful global vaccination campaign to eradicate smallpox from the wild said he opposed the move on the grounds that the scientific benefits were not worth the risks to public health.” (Independent.UK)
This item has a particular puissance for me here in Boston, where there is mounting community concern over Boston University’s plan to build a Biosafety-Level-4 laboratory in a crowded urban neighborhood, especially after the recent news that three BU researchers were infected with a lethal strain of tularemia they mistakenly thought was harmless. And this was reportedly not the first biosafety lapse at the BU lab. Proponents of highly risky science have always argued from cost-benefit ratios, but even if we can be assured that the probability of a risk is vanishingly low, aren’t there cases in which the potential magnitude of a disaster is almost infinitely high? In other words, when does the product of a number whose limit is zero and another whose limit is infinity tend toward zero, and when toward infinity? Moreover, the probability of risk often, to my mind, relies on the hubristic assumption that people and procedures can be infallible, when the reality is quite the contrary: time and again, it seems, if a mistake can occur, it will.
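The limit question above is the classic indeterminate form 0 · ∞: the answer depends entirely on how fast the probability shrinks relative to how fast the magnitude grows. A minimal numerical sketch (my own illustration, not from the article; the function name and the particular rates 1/n² versus n² are chosen only for demonstration) makes the point concrete:

```python
def expected_loss(p, magnitude):
    """Expected loss = probability of disaster times its magnitude."""
    return p * magnitude

for n in (10, 1_000, 100_000):
    # Case 1: probability falls as 1/n^2 while magnitude grows only as n,
    # so the product behaves like 1/n and tends toward zero.
    vanishing = expected_loss(1 / n**2, n)
    # Case 2: probability falls as 1/n while magnitude grows as n^2,
    # so the product behaves like n and tends toward infinity.
    exploding = expected_loss(1 / n, n**2)
    print(f"n={n:>7}:  (1/n^2)*n = {vanishing:.6f}   (1/n)*n^2 = {exploding:.1f}")
```

The same arithmetic that lets a proponent call the expected loss negligible can, with slightly different rate assumptions, make it unbounded, which is exactly why a bare cost-benefit ratio settles nothing here.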