The author may be forgiven for her Proustian aspirations (or was the title the responsibility of her editor?); her article grapples with one of the underappreciated yet distressing issues of modern life, one with which I have been struggling both professionally and personally.
A writer working on a book about mid-life memory and memory loss, she goes on to note that the embarrassment and inconvenience of these “senior moments” hide a more primitive emotion of fear that they are a portent of further memory loss to come, and of “decades of dependence, of life with a diminished mind trapped in a still vigorous body.” Now in my early fifties and noticing the same changes in my own memory, I resonate with this. And as a psychiatrist caring for patients who have descended into the deep memory loss of Alzheimer’s and other dementias, I am even more spooked on a daily basis. Even though I can, intellectually, quote you the statistics, it is difficult not to be viscerally misled by the skewed sample of aging I see in my work. And so are my patients. Those who, even though not profoundly demented, are experiencing some memory difficulty inevitably ask me, “Doc, please tell me, is it Alzheimer’s?” (or, as one patient put it to me recently, “…Oldtimer’s?”). But most of us do not even whisper these fears or, if we do, we frame them as humorous.
The author suggests that, in this ‘information age’, memory loss and lack of easy access to your own databank feel like more of a vulnerability than they did in the past. I find this dubious, however, since self-knowledge has always been the essential human attribute, and awareness of its impairment is likely to have been distressing since the origin of consciousness. However, the information we must coordinate and access in modern life may arguably be more diverse and voluminous than in ‘the old days.’ This may be in part the price we pay for the breakdown of community and family stability in the modern world — there is less of the routine, the unchanging, the comfortably familiar to tether us. In that sense, experience may tax memory more than it did for our forebears of one or two generations ago. What counterbalances this, however, is the evidence that mental activity ‘exercises’ memory and keeps it strong; that intellectually active individuals develop dementia at a lower rate than the intellectually stultified casualties of modernity. (Recall the research findings about aging nuns who do crossword puzzles?)
Another sense in which the information age is a hedge against memory loss, of course, is technological aids. In both my professional and my personal life, there is less trivial data I have to remember as long as I can carry around my laptop or my PDA or access an internet search engine. I am reminded of a vignette in one of the Sherlock Holmes novels in which, if I remember it correctly (grin — forgive me! In an essay about memory loss, it is hard not to be self-referential!), Holmes is annoyed that Watson has told him that the earth orbits the sun, because the fact, useless to his daily life, takes up valuable space in his memory. Although there is no evidence that human memory has a fixed capacity that can be filled to the brim, this is an instinctual reaction, and one reinforced by the electronic memory metaphor. In this sense, I have at times sardonically referred to my PDA as a peripheral brain, and I think in terms of dumping off less essential data to this peripheral storage. This works better for some forms of data than others. For example, just considering the two most common functions of the PDA: no matter how completely my appointment schedule is entered into my PDA’s calendar, unless I am going to have alarms going off all the time, I have to keep some notion of the shape of my day in mind. On the other hand, between the contact lists on my PDA and my cellphone, I no longer have more than a handful of phone numbers memorized. It sometimes seems we are well on the way to the obsolescence of the ten-digit phone number (and why not? Who accesses websites by their IP addresses?). Of course, there is nothing unique about using a PDA for this purpose. I relied on my Daytimer for a decade or so before I got my first handheld (an HP 95LX, if memory serves). In a more recent and profound change, with the instant information access of the internet, my peripheral brain in a broader sense may now be said to be embodied by a weblike network extending around the world. Could the legacy of the ‘information age’ be that memory is becoming a property of the hive mind instead of the individual?
The essayist interviews Oliver Sacks about why we are so bothered by memory lapses. Sacks points out that much of it has to do with personality style: someone who prides themselves on control and order may be less tolerant of lapses than someone more “easygoing,” he observes. I am certainly in the former category, so dependent on my intellectual functions for my self-valuation that it is difficult for me to recognize that anyone could be “easygoing” about their mental capacities! (That may go a long way toward explaining why I have so much difficulty with George Bush…) For a moment, the essayist feels reassured that her memory loss is normal and that all she has to do is get used to it.
The article makes some important stipulations about the multifaceted nature of memory function and dysfunction. Memory for names, for words, for facts, for procedural capabilities (“how-to” memory), and for prospective obligations are all differently mediated and differentially susceptible to impairment. Perhaps because she is a writer, she pays far less attention to an entire other realm, nonverbal memory, equally complex and operating in parallel with the verbal. I wonder — do I so often find the face or the voice of someone I meet familiar because, after fifty-plus years of encounters, virtually everyone is certain to remind me of someone else? Or is it more that I can no longer retrieve the nonverbal memory behind the sensation of familiarity with as much precision as I once could?
The author goes on to describe her pursuit of a comprehensive neuropsychological evaluation of her memory difficulties, which confirms that she has some deficits worse than those considered typical for her age range. One of the specialists she confers with pushes the hypothesis that such complaints often relate to mild traumatic brain injuries. She scoffs at first but realizes on reflection that, between childhood horseback riding and horseplay with her brother, she has certainly shaken her brain around a bit. In reality, head trauma preferentially affects attentional processes, but the results are often mistaken for memory difficulty. Separate from memory function and equally complex, the machinery of attention is crucial. To recall something, one must not only store it effectively and access it efficiently on demand but must pay attention to it — which involves focusing, filtering out the competing and the extraneous, shifting efficiently when demanded, and so on — sufficiently to acquire the memory in the first place. Looking back at my discussion above of the relationship between memory problems and the conditions of modernity, it is strikingly evident how susceptible attentional processes are to interference by the constant datastream overload to which modern life subjects us. Furthermore, there is some evidence that attentional processes are quite vulnerable to other erosions of our information-processing capacities by modern life which are more of a neurochemical, organic nature, e.g. due to accumulating environmental toxins and dietary effects. In this light, I have long felt that the attention deficit disorder concept — with which I have been clinically involved since it was recognized in its modern sense around two decades ago — is so overly broad as to be clinically meaningless. While there is a real entity of ADHD that involves physiological and neurochemical dysfunction in the brain’s attentional circuitry, most of the patients diagnosed with this disorder, or ‘recognizing’ it in themselves, are really just at sea with the information-processing demands of their lives. (Not to mention those for whom it is merely a pretext, consciously or unconsciously, to earn a stimulant prescription — which, by the way, will help almost anyone feel and function better, ADHD or not, although in a self-fulfilling-prophecy manner the benefit from a stimulant will often ‘confirm’ the diagnosis in the mind of the patient or the prescribing clinician.)
The issue of head trauma is critical, in my opinion, and often neglected. Whenever I see the dramatic cognitive deficits that occur after serious head traumas — e.g. in a fall or a motor vehicle accident, or the even more dramatic entity of dementia pugilistica after the repeated blows to the head incurred in prizefighting — I wonder about the cumulative effects of the more subtle, easily overlooked knocks to the head I would venture to say we all incur throughout life. Think about it: how many times have you hit your head, even without losing consciousness or even “seeing stars”? There is a selection bias here; if a neuropsychologist presses you to recall childhood head trauma, memory will usually serve. This would be a very difficult hypothesis for which to design a research protocol (since you cannot very well follow a population prospectively for the thirty or forty years necessary; and even if you had the will and the funding to do so, most mild head traumas go unnoticed and leave no telltale signs on imaging or lab studies), but I am convinced of the likelihood that cumulative mild head trauma contributes importantly to deficits in cognitive functioning. I was knocked senseless more than a few times during my youth playing touch football (no helmet), being body-checked or tackled during steal-the-flag or ring-o-levio [does anyone know what that is anymore?], taking a spill off my bicycle or while skiing (no helmet), or wrestling with my brother. Call me paranoid if you will, but I have told my children, for example, not to ‘head’ the ball when they play soccer. We are born with a fixed number of neurons, and conserving them is one of the unrecognized priorities of modern life, it seems to me. As the author’s consultant suggests, direct knocks to the head might not even be necessary to cause brain damage; rapid acceleration-deceleration and rotational torque on the brain can also be injurious. Are we ‘moderns’ more frequently doing things to the CNS that outstrip its evolutionary protections? Think of the reports, for example, on the nasty G-shock to which the brain is subjected during roller coaster rides.
The author does a good job of reconciling herself to the alarming reality of her memory difficulties and of explaining why their onset and recognition took the course they did. She shares an experience that my wife, and many of the women friends with whom she has compared notes, have also had: noticing some cognitive derangement dating from pregnancy and childbirth. This is likely due to the combination of hormonal effects on the brain during pregnancy and the novel demands on attention caused by the life-changing arrival of an infant. The author was prescribed stimulant medication, and she considers in a sophisticated way the costs one pays, both personally and more existentially, for treating such ‘deficits’. Ultimately, she is challenged to ask whether one can do anything other than pathologize and attempt to remediate age-related changes in memory function:
“At what point might I stop dwelling on what had been lost, I wondered, and begin to relish what I had gained with age? Perspective and insight, fused with acceptance, formed the cornerstone of wisdom. The rest, presumably, I could get from Google.”
Perspective, insight, acceptance, wisdom with age — perhaps one of the other things we have lost with the erosion of the social, community and family fabric in the 20th century has been a place for these.
Related?
Lost time reclaimed, indeed. Synchronistically, while I was posting this piece about the loss of one’s past, Mark Wood was linking here to this piece that reminds us of the flipside of timelessness in psychopathology, the loss of one’s future:
To return to the words of Rollo May (1958, p. 68):
Severe anxiety and depression blot out time, annihilate the future. Or, as Minkowski proposes, it may be that the disturbance of the patient in relation to time, his inability to “have” a future, gives rise to his anxiety and depression. In either case, the most painful aspect of the sufferer’s predicament is that he is unable to imagine a future moment in time when he will be out of the anxiety or depression.
It is important to note the potential usefulness of psychedelic studies for treatment of an existential crisis such as depression or anxiety — because these are times when hopefulness and the meaning of life wane. It seems that — through their “unbinding” effect on time perception — psychedelics open new temporal possibilities that can motivate the person to take a new look at the future value and meaning of life.
Intriguing as this suggestion to use the time-dilating properties of psychedelics to address this futurelessness may be, the conception is not new. Freud’s dictum of a century ago, that the goal of psychoanalysis was not to turn unhappiness into happiness but rather to turn neurotic unhappiness into normal everyday unhappiness, largely rested upon restoring a patient’s future to them. A teacher of mine, the late, consummate psychologist John Perry, put it only slightly differently when he described the alleviation of suffering achieved in psychotherapy as largely a matter of persuading a suffering patient that s/he is not in Hell but merely in Purgatory. The fires burn just as hot in Purgatory, he would say; the only distinction is whether you know your stay there is time-limited and not eternal damnation.