Outcry over creation of GM smallpox virus

“Senior scientific advisers to the World Health Organisation (WHO) have recommended the creation of a genetically modified version of the smallpox virus to counter any threat of a bioterrorist attack.

Permitting researchers to engineer the genes of one of the most dangerous infections known to man would make it easier to develop new drugs against smallpox, the scientists said. But the man who led the successful global vaccination campaign to eradicate smallpox from the wild said he opposed the move on the grounds that the scientific benefits were not worth the risks to public health.” (Independent.UK)

This item has a particular puissance for me here in Boston, where there is mounting community concern over Boston University’s plan to build a Biosafety-Level-4 laboratory in a crowded urban neighborhood, especially after the recent news that three BU researchers were infected with a lethal strain of tularemia they mistakenly thought was harmless. And this was reportedly not the first biosafety lapse at the BU lab. Proponents of highly risky science have always argued by cost-benefit ratio, but even if we can be assured that the probability of a risk is vanishingly low, aren’t there cases in which the potential magnitude of a disaster is almost infinitely high? In other words, when does the product of a number whose limit is zero and another whose limit is infinity tend toward zero, and when toward infinity? Moreover, the probability of risk often, to my mind, relies on the hubristic assumption that people and procedures can be infallible, when the reality is quite the contrary — time and again, it seems, if a mistake can occur, it will.
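To make the limit question concrete, here is a toy pair of sequences of my own devising (nothing from the article), with \(p_n\) standing for the probability of a catastrophic lapse and \(M_n\) for the magnitude of the resulting harm:

\[
p_n = \frac{1}{n},\; M_n = \sqrt{n} \;\Longrightarrow\; p_n M_n = \frac{1}{\sqrt{n}} \to 0,
\qquad
p_n = \frac{1}{n},\; M_n = n^{2} \;\Longrightarrow\; p_n M_n = n \to \infty .
\]

Both probabilities go to zero and both magnitudes to infinity, yet the expected harm vanishes in one case and explodes in the other; the product hinges entirely on which factor shrinks or grows faster, which is why “the probability is vanishingly small” is not, by itself, reassuring.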

Police hunt poo protesters

Over the past year or so, German pranksters have placed miniature American flags in some 2,000 to 3,000 piles of dog excrement in public parks in what has been construed as a graphic protest against US policy in Iraq and, more recently, against Bush’s reelection. Police seek to catch the culprits red-handed (or… would it be brown?) even though “legal experts say there is no law against using faeces as a flag stand and the federal constitution is vague on the issue.” (Ananova)

Nine, ten, never sleep again

Boing Boing’s David Pescovitz comments on an Ananova story about a man who has puzzled medical experts by being unable to sleep for the past twenty years. There is a series of follow-up posts listing novels about insomniac characters (many of whom seem to be private eyes). I would love to see some more detailed medical investigation of real-world insomniacs. Although the ultimate necessity of spending an average of a third of our lifespan asleep remains a mystery, we are garnering knowledge about the variety of necessary functions it serves, in terms of cognitive housekeeping, tissue repair, and restoration of physiological equilibrium. How does this guy function, on interpersonal, psychological, and physiological levels?

I’m not sure, in any case, about the veracity of the Ananova story, given that there’s a machismo about not sleeping (perhaps because sleeplessness turns us into the worst caricature of macho?) and I often run into people who boast that they need less sleep than the rest of us. There is something culturally consonant about sleep deprivation, too, as society grows ever more frenetic and productivity-driven. Performance in many fields (especially medicine; more about that below) seems to be measured at least partly by how long and how far and how fast one can go on. People in general sleep less than they used to, and we are intrigued by ‘alertness agents’ like modafinil (whose value, and the concerns it raises, I have written about here), which appear to treat fatigue and compensate for sleep deprivation with fewer consequences than stimulants of the amphetamine family.

There is also a separate but related allure of the wee small hours per se. I guess it is true of many children who are curious about what mysterious and magical things might happen after they are asleep, as I was. There was always a frisson, when I went to the zoo or the natural history museum, at seeing the somehow more eerie nocturnal creatures. And, in the 1931 film, one of my childhood favorites, Dracula’s ecstatic celebration of “the children of the night” as the air was suffused with the distant howls of wolves always sent a delicious chill up my spine. I began trying to stay up late as soon as I could tell time. I would sneak my transistor radio — if any of you know what those were — into bed and put it under my pillow (it was especially exciting when I finally got an earphone for it) and try to stay awake to break the magical barrier of midnight; it was a long time before I succeeded. Since then, I have always been a night owl, as you can tell from the timestamps on many of my posts here at FmH. I have never gotten over the romance of the middle of the night: the stillness and aloneness, the cold hard clarity of a world reduced by starlight and moonlight to nocturnal hues, and the seedy quality of the covert activities that transpire, in reality or imagination, in the dark, beyond the ring of illumination thrown by our streetlights. Many of the insomniac characters in literature seem to enjoy walking deserted city streets in the middle of the night, and so too did I. There is an element of transgressing boundaries, the thrill of doing something forbidden, in being up when no one else is, when no one is supposed to be. One of the subliminal attractions of being sleepless may also be that one challenges the Big Sleep, pushing to transgress the ultimate boundary at the end of life. It is a medical truism, by the way, that Death comes for people disproportionately in the wee hours. Perhaps I have always wanted to be staring her in the eye when she arrives. Sensuality, too, is of course intimately associated with the nocturnal.

“I’ll sleep when I’m dead,” Warren Zevon and others have said. One can cheat death too by packing more into life, I have thought, by spending more of one’s living hours as waking hours. For most of my life, I felt that I did not have the time to waste on sleeping, and (here comes that macho boast?) felt that I could get away for many days running with shorting myself on sleep if there was something compelling to read, write or watch instead. There is a sort of machismo associated with being able to function while sleep-deprived during medical training, when the sleep deprivation is, of course, outrageous and, I am convinced, gratuitous. Training directors, or caricatures of them, supposedly reason that decision-making skills are shaped, and character built, by sleep deprivation, and that “if it was good enough for me when I trained, it’s good enough for the new generation of whiners.” When there is an egregious medical error, like that which caused the celebrated death of Libby Zion in New York some years ago, there is anguished handwringing about the liability and morbidity caused by our proclivity for sleep-depriving medical house officers making life-or-death decisions, but it never seems to change anything. The real incentive the system has to make residents do round-the-clock shifts, of course, is not a training need at all; it is the easiest way to use the indentured servitude of medical residency to meet the manpower needs of a modern healthcare facility.

My acceptance of sleep deprivation during my medical school years had an added momentum, though. When I went through medical school, I was hellbent on not ‘becoming’ a doctor, in the sense of that being all there was to my identity forever after. It needed to be just one of the things I did in my life, not my defining attribute. That created a further impetus to stay up late to do other things after keeping up with the literature in my field, writing consultation reports, etc. And after my wife and I started a family, I protected my parenting responsibilities the same way, carving out the wee small hours for my other pursuits after a full day of being present and active as a father as well as a doctor.

There is an intimate relationship between sleep disruption and depression, as we know in psychiatric practice. Depressions come in sleepless and hypersomnic varieties. Part of the difference is surely biological, but people are also built differently in terms of their characteristic coping strategies. Some people are escapists, and may sleep more in an effort to avoid distress. (They also seem to be the ones, in my experience, who can entertain thoughts of suicide for the purposes of relief or escapism, among the various purposes that suicide can serve in my patients’ psyches.) But, even for people who try to use sleep as an escape mechanism, I have long suspected that sleep can promote depression, and there is a body of literature supporting me, even speculating that some sort of depressogenic neurochemical is produced during sleep. Depressed patients often feel most depressed upon awakening and their mood improves as the day proceeds (so-called “diurnal mood variation”). If they take a daytime nap, they often face another period of renewed depression after they get up from the nap. Even if it is not biological, you can imagine how difficult it is to face depressing realities immediately upon awakening from a period of blissful ignorance. The possibility that sleep promotes depression has led to speculation that some people may be sleep-depriving themselves as a sort of inadvertent self-medication for depressive tendencies. In other words, is the sleeplessness of some depressions a consequence of, or an attempt at compensation for, the depressed mood? Noting that I tend to push my bedtime further when my mood is bluer, I have wondered the same about myself. It may also be that those are the times when it is more urgent to do more for myself.

It took me literally several decades to realize that burning the candle at both ends was making me far more impatient and irritable than I needed or wanted to be, and that sleep-depriving myself was not a free lunch. This dawned on me at approximately the same time as, studying the physiological necessity of sleep and the psychiatric consequences of sleep disruption, I began to take note of medical research showing that sleep deprivation shortened organisms’ lifespans. So, ironically, cheating death by shoe-horning more wakefulness into a fixed lifetime turns out not to be as simple as I had assumed. Of course, it is also well-known that sleep deprivation reduces cognitive efficiency in certain empirically measurable respects. So even if one is up more, one may end up paying for that quantity of waking hours with quality. Moreover, I realized, sleep deprivation is cumulative; the commonsense notion that you can pay back your deficit by ‘sleeping in’ the next weekend doesn’t work. If you are supposed to sleep eight hours a night, let’s say, you can’t go three nights in a row with four hours a night and then erase the damage with a twenty-hour night’s sleep.

One of the other skills I developed as a medical resident on call was the ability to rapidly return to sleep after I had dealt with a challenge in the middle of the night. It was never as extreme for me as for some of my colleagues, however, who could seemingly conduct their on-call duties without waking up fully at all. One of my friends, a surgical resident, eventually learned that she was managing many of her patients’ problems — always clinically appropriately, to hear her tell it — over the phone in the middle of the night without remembering what she had done when her surgical team did morning rounds on the patients the next day. She finally arranged for the hospital operator who paged her to listen in on the calls and take notes about what orders she issued. She would swing by the switchboard in the morning, before rounds, and use the notes as a cribsheet when reporting on the care she had given the night before. (I don’t know if this was a liability or an adaptive strategy to her work as a surgeon; she has since gone into a different field of medicine. Dream on…)

So now I want to sleep more. Now that I realize it is not necessarily desirable to short myself so much on sleep, when I am awakened in the middle of the night by my beeper going off from the hospital, I want to get back to sleep again as soon as I have dealt with the call. But, in middle age, I am finding, ironically, that I can no longer get back to sleep rapidly. If I am awakened, I am typically going to be up for at least a couple of hours. Of course, I could do something boring and soporific with the time, to hasten my return to sleep, but it still sticks in my craw to waste wakefulness. So some of the middle-of-the-night FmH entries you will see these days are, in a sense, written under duress. Enjoy them anyway; I do. I still do some of my clearest thinking in the holy stillness, or at least so I imagine.

Atrocities in Plain Sight

Andrew Sullivan on Abu Ghraib:

“I’m not saying that those who unwittingly made this torture possible are as guilty as those who inflicted it. I am saying that when the results are this horrifying, it’s worth a thorough reassessment of rhetoric and war methods. Perhaps the saddest evidence of our communal denial in this respect was the election campaign. The fact that American soldiers were guilty of torturing inmates to death barely came up. It went unmentioned in every one of the three presidential debates. John F. Kerry, the ”heroic” protester of Vietnam, ducked the issue out of what? Fear? Ignorance? Or a belief that the American public ultimately did not care, that the consequences of seeming to criticize the conduct of troops would be more of an electoral liability than holding a president accountable for enabling the torture of innocents? I fear it was the last of these. Worse, I fear he may have been right.”

I had missed this essay, originally published on the front page of the New York Times Book Review. I have long been a proponent of a take similar to Sullivan’s about how the rhetoric about the war and the duplicitous shaping — from the top — of the American attitude toward Iraqis, terrorists, and other poorly differentiated spooks created a culture in which these atrocities could happen. I differ with Sullivan on one count, which is his assertion that those who “unwittingly made this torture possible” were not as guilty as those who inflicted it. First of all, it is hard for me to see how it was “unwitting.” And secondly, decisions from the president and the upper echelon of his administration henchmen not only “made the torture possible” but essentially mandated it. Early in the essay, Sullivan is unsure whether to take solace in the fact that the torture occurred in a free society where the chilling evidence of it was able to come to light.

“Whatever happened was exposed in a free society; the military itself began the first inquiries. You can now read, in these pages, previously secret memorandums from sources as high as the attorney general all the way down to prisoner testimony to the International Committee of the Red Cross. I confess to finding this transparency both comforting and chilling, like the photographs that kick-started the public’s awareness of the affair. Comforting because only a country that is still free would allow such airing of blood-soaked laundry. Chilling because the crimes committed strike so deeply at the core of what a free country is supposed to mean. The scandal of Abu Ghraib is therefore a sign of both freedom’s endurance in America and also, in certain dark corners, its demise.”

I am afraid that the pieties about the persistence of freedom in America are gross self-deception. Free expression and inquiry are the merest, illusory window-dressing on a society that permits such atrocity as a matter of policy, fails to make a meaningful inquiry into or condemnation of the abuses, and reelects those responsible, enabling them to claim a ‘mandate’ for business as usual. What did the American people do other than stand by and shake their heads in the face of the war crimes committed in their name, and allow themselves to be sated by the punishment of some sacrificial lambs? The failure to make the atrocities, and the similar demonization of those we hold prisoner in Guantanamo, Afghanistan (and God knows what other places around the world we have not even heard of), a core campaign issue was scandalous. The moral failures involved must be kept in the forefront of American consciousness if those who act in our name are to be prevented from permitting and encouraging further atrocities.