Connections are being drawn between animal abuse and other kinds of violence (New York Times Magazine)
Next Big Thing in Literary Theory: ‘At a time when university literature departments are confronting painful budget cuts, a moribund job market and pointed scrutiny about the purpose and value of an education in the humanities, the cross-pollination of English and psychology is providing a revitalizing lift.
Jonathan Gottschall, who has written extensively about using evolutionary theory to explain fiction, said “it’s a new moment of hope” in an era when everyone is talking about “the death of the humanities.” To Mr. Gottschall a scientific approach can rescue literature departments from the malaise that has enveloped them over the last decade and a half. Zealous enthusiasm for the politically charged and frequently arcane theories that energized departments in the 1970s, ’80s and early ’90s — Marxism, structuralism, psychoanalysis — has faded. Since then a new generation of scholars has been casting about for The Next Big Thing.
The brain may be it. Getting to the root of people’s fascination with fiction and fantasy, Mr. Gottschall said, is like “mapping wonderland.”’ (New York Times)
‘When making moral judgements, we rely on our ability to make inferences about the beliefs and intentions of others. With this so-called “theory of mind”, we can meaningfully interpret their behaviour, and decide whether it is right or wrong. The legal system also places great emphasis on one’s intentions: a “guilty act” only produces criminal liability when it is proven to have been performed in combination with a “guilty mind”, and this, too, depends on the ability to make reasoned moral judgements.
MIT researchers now show that this moral compass can be very easily skewed. In a new study published in the Proceedings of the National Academy of Sciences, they report that magnetic pulses which disrupt activity in a specific region of the brain’s right hemisphere can interfere with the ability to make certain types of moral judgements, so that hypothetical situations involving attempted harm are perceived to be less morally forbidden and more permissible.’ (Neurophilosophy)
Does this scare you?
“Of all human psychology, self-defeating behavior is among the most puzzling and hard to change. After all, everyone assumes that people hanker after happiness and pleasure. Have you ever heard of a self-help book on being miserable?
So what explains those men and women who repeatedly pursue a path that leads to pain and disappointment? Perhaps there is a hidden psychological reward…” (New York Times)
Wonderful behavioral science writer Jonah Lehrer (Proust Was a Neuroscientist) writes for the New York Times Magazine on the idea that depression may be adaptive. It is not a new idea; I have followed the intriguing literature about possible evolutionary reasons for the persistence of depression ever since I was a psychiatric resident troubled by how readily we in the field want to obliterate any signs of the condition whenever our patients present with it. Some theories have focused on the advantages of resource preservation, given the social isolation, decreased motivation and lessened self-indulgence the depressed person displays. It has also been suggested that the depressive alteration in cognition, in the direction of impaired self-esteem, decreased sense of efficacy and control over one’s circumstances, and pessimism, may actually be more realistic, at least in some circumstances, than the rose-colored glasses with which we usually walk around.
But recent research adds neuropsychological evidence of increased brain activity in depressed patients in regions of the prefrontal cortex associated with problem-solving, proportional to the degree of depression. It is certainly not the whole explanation, as critics counter, because some of the maladaptive impact of depression, including poor self-care, impairment in childrearing, increased susceptibility to other illness, and last but not least suicide, will outweigh the problem-solving advantages it might confer. Furthermore, there are many different kinds of depression both in terms of precipitant and symptomatology. At one extreme, a person may become depressed in response to an acute recent loss (or even a future anticipated one); at the other, some people can develop either a dense acute depression or a smouldering chronic one without substantial stresses or losses. The imprecisions in both the lay person’s use of the term depression and its more technical clinical utilization muddy the waters in this regard.
Still, it is worth asking why a condition that is so painful and takes such a heavy toll would persist if it were not at least some of the time of some use… and whether, at least some of the time, we do more harm than good in leaping to treat it. Except, of course, the unequivocal good done to the pockets of the shareholders and executives of the pharmaceutical companies, reaping the profits from the explosive growth in antidepressant sales of the last few decades. (New York Times Magazine)
“Despite the scientific implausibility of the same disease—addiction—underlying both damaging heroin use and overenthusiasm for World of Warcraft, the concept has run wild in the popular imagination. Our enthusiasm for labeling new forms of addictions seems to have arisen from a perfect storm of pop medicine, pseudo-neuroscience, and misplaced sympathy for the miserable.” — Vaughan Bell (Slate)
“The tenuousness of modern life can make anyone feel overwrought. And in societal moments like the one we are in — thousands losing jobs and homes, our futures threatened by everything from diminishing retirement funds to global warming — it often feels as if ours is the Age of Anxiety. But some people, no matter how robust their stock portfolios or how healthy their children, are always mentally preparing for doom. They are just born worriers, their brains forever anticipating the dropping of some dreaded other shoe. For the past 20 years, [Jerome] Kagan and his colleagues have been following hundreds of such people, beginning in infancy, to see what happens to those who start out primed to fret. Now that these infants are young adults, the studies are yielding new information about the anxious brain.
These psychologists have put the assumptions about innate temperament on firmer footing, and they have also demonstrated that some of us… are born anxious — or, more accurately, born predisposed to be anxious. Four significant long-term longitudinal studies are now under way: two at Harvard that Kagan initiated, two more at the University of Maryland under the direction of Nathan Fox, a former graduate student of Kagan’s. With slight variations, they all have reached similar conclusions: that babies differ according to inborn temperament; that 15 to 20 percent of them will react strongly to novel people or situations; and that strongly reactive babies are more likely to grow up to be anxious.” (New York Times Magazine)
“Absurdist literature, it appears, stimulates our brains. That's the conclusion of a study recently published in the journal Psychological Science. Psychologists Travis Proulx of the University of California, Santa Barbara and Steven Heine of the University of British Columbia report our ability to find patterns is stimulated when we are faced with the task of making sense of an absurd tale. What's more, this heightened capability carries over to unrelated tasks.” (Miller-McCune Online Magazine)
“Some people feel that nobody should read the book, and some feel that everybody should read it. The truth is, nobody really knows. Most of what has been said about the book — what it is, what it means — is the product of guesswork, because from the time it was begun in 1914 in a smallish town in Switzerland, it seems that only about two dozen people have managed to read or even have much of a look at it.
Of those who did see it, at least one person, an educated Englishwoman who was allowed to read some of the book in the 1920s, thought it held infinite wisdom — “There are people in my country who would read it from cover to cover without stopping to breathe scarcely,” she wrote — while another, a well-known literary type who glimpsed it shortly after, deemed it both fascinating and worrisome, concluding that it was the work of a psychotic.” (New York Times Magazine)
“This summer could come to be known as the summer when baby boomers began to turn to the obituary pages first, to face not merely their own mortality or ponder their legacies, but to witness the passing of legends who defined them as a tribe, bequeathing through music, culture, news and politics a kind of generational badge that has begun to fray.” (New York Times)
But was Rogers right? Before we toss out mainstream discipline, it would be nice to have some evidence. And now we do.
In 2004, two Israeli researchers, Avi Assor and Guy Roth, joined Edward L. Deci, a leading American expert on the psychology of motivation, in asking more than 100 college students whether the love they had received from their parents had seemed to depend on whether they had succeeded in school, practiced hard for sports, been considerate toward others or suppressed emotions like anger and fear.
It turned out that children who received conditional approval were indeed somewhat more likely to act as the parent wanted. But compliance came at a steep price. First, these children tended to resent and dislike their parents. Second, they were apt to say that the way they acted was often due more to a “strong internal pressure” than to “a real sense of choice.” Moreover, their happiness after succeeding at something was usually short-lived, and they often felt guilty or ashamed…” (New York Times)
Cocksure: did overconfidence bring down Wall Street? — Malcolm Gladwell (The New Yorker)
Cricket and Psychoanalysis: “Both test cricket and psychoanalysis are out of tune with a world that demands quick results. That’s our loss, argues former England cricket captain Mike Brearley, now Britain’s leading psychoanalyst.” (Prospect Magazine)
“No one could accuse the American Psychiatric Association of missing a strain of sourness in the country, or of failing to capitalize on its diagnostic potential. Having floated “Apathy Disorder” as a trial balloon, to see if it might garner enough support for inclusion in the next edition of the Diagnostic and Statistical Manual of Mental Disorders, the world’s diagnostic bible of mental illnesses, the organization has generated untold amounts of publicity and incredulity this week by debating at its convention whether bitterness should become a bona fide mental disorder.” (Psychology Today)
- I looked You Up – I know what your Problem is!!! (dummidumbwit.wordpress.com)
- REVISING THE DSM-IV INTO THE DSM-V. There are high stakes here – a DSM-sanctioned diagnosis can m… (pajamasmedia.com)
- Update: DSM-V Major Changes (psychcentral.com)
- The Next Attention Deficit Disorder? (time.com)
- At the Doctor’s Office – How Not to Get Sick – TIME (time.com)
- Redefining Crazy: Researchers Revise the DSM (time.com)
- Sadly not insanity: psychiatrists want bitterness classed as a mental disorder (inquisitr.com)
“…[Howard] Gardner, a professor of cognition and education at the Harvard Graduate School of Education, who won a prestigious MacArthur Foundation “genius award” in 1981, has had enormous influence, particularly in our schools. Briefly, he has posited that our intellectual abilities are divided among at least eight abilities: verbal-linguistic, logical-mathematical, visual-spatial, bodily-kinesthetic, naturalistic, musical, interpersonal, and intrapersonal. The appealing elements of the theory are numerous.
Multiple intelligences put every child on an equal footing, granting the hope of identical value in an ostensible meritocracy. The theory fits well with a number of the assumptions that have dominated educational philosophy for years. The movements that took flower in the mid-20th century have argued for the essential sameness of all healthy human beings and for a policy of social justice that treats all people the same. Above all, many educators have adhered to the social construction of reality — the idea that redefining the way we treat children will redefine their abilities and future successes. (Perhaps that’s what leads some parents to put their faith in “Baby Einstein” videos: the hope that a little nurturing television will send their kids to Harvard.) It would be difficult to overestimate the influence of Gardner’s work, both in repudiating that elitist, unfair concept of “g” and in guiding thought in psychology as it applies to education.
The only problem, with all respect to Gardner: There probably is just a single intelligence or capacity to learn, not multiple ones devoted to independent tasks. To varying degrees, some individuals have this capacity, and others do not. To be sure, there is much debate about Gardner’s theory in the literature, with contenders for and against. Nonetheless, empirical evidence has not been robust. While the theory sounds nice (perhaps because it sounds nice), it is more intuitive than empirical. In other words, the eight intelligences are based more on philosophy than on data.” (The Chronicle of Higher Education)
- Building the 21st-Century Mind (3quarksdaily.com)
- The Eight Intelligences (slideshare.net)
- Episode 90: The Learning Styles Myth: An Interview with Daniel Willingham (thepsychfiles.com)
‘In a series of recent studies, psychologists have found that reddening cheeks soften others’ judgments of bad or clumsy behavior, and help to strengthen social bonds rather than strain them. If nothing else, the new findings should take some of the personal sting out of the facial fire shower when it inevitably hits.’ (New York Times)
Jonah Lehrer writing in the Boston Globe: “…[S]cientists have begun to dramatically revise their concept of a baby’s mind. By using new research techniques and tools, they’ve revealed that the baby brain is abuzz with activity, capable of learning astonishing amounts of information in a relatively short time. Unlike the adult mind, which restricts itself to a narrow slice of reality, babies can take in a much wider spectrum of sensation – they are, in an important sense, more aware of the world than we are.
This hyperawareness comes with several benefits. For starters, it allows young children to figure out the world at an incredibly fast pace. Although babies are born utterly helpless, within a few years they’ve mastered everything from language – a toddler learns 10 new words every day – to complex motor skills such as walking. According to this new view of the baby brain, many of the mental traits that used to seem like developmental shortcomings, such as infants’ inability to focus their attention, are actually crucial assets in the learning process.
In fact, in some situations it might actually be better for adults to regress into a newborn state of mind…”
“This is the cultural moment of the narcissist. In a New Yorker cartoon, Roz Chast suggests a line of narcissist greeting cards (“Wow! Your Birthday’s Really Close to Mine!”). John Edwards outed himself as one when forced to confess an adulterous affair. (Given his comical vanity, the deceitful way he used his marriage for his advancement, and his self-elevation as an embodiment of the common man while living in a house the size of an arena, it sounds like a pretty good diagnosis.) New York Times critic Alessandra Stanley wrote of journalists who Twitter, “it’s beginning to look more like yet another gateway drug to full-blown media narcissism.” And what other malady could explain the simultaneous phenomena of Blago and the Octomom?” — Emily Yoffe via Slate.
“Marx was wrong: The opiate of the masses isn’t religion, but spectator sports. What else explains the astounding fact that millions of seemingly intelligent human beings feel that the athletic exertions of total strangers are somehow consequential for themselves? The real question we should be asking during the madness surrounding this month’s collegiate basketball championship season is not who will win, but why anyone cares.” via The Chronicle of Higher Education.
Thank heavens someone is thinking about one of the most troublesome experiences I have — my inability to remember a joke I have heard, no matter how funny and no matter how determined I am to retain it to share with others later.
“Really great jokes… work not by conforming to pattern recognition routines but by subverting them. “Jokes work because they deal with the unexpected, starting in one direction and then veering off into another,” said Robert Provine, a professor of psychology at the University of Maryland, Baltimore County, and the author of Laughter: A Scientific Investigation. “What makes a joke successful are the same properties that can make it difficult to remember.”
This may also explain why the jokes we tend to remember are often the most clichéd ones. A mother-in-law joke? Yes, I have the slot ready and labeled.
Memory researchers suggest additional reasons that great jokes may elude common capture. Daniel L. Schacter, a professor of psychology at Harvard and the author of The Seven Sins of Memory, says there is a big difference between verbatim recall of all the details of an event and gist recall of its general meaning.
“We humans are pretty good at gist recall but have difficulty with being exact,” he said. Though anecdotes can be told in broad outline, jokes live or die by nuance, precision and timing. And while emotional arousal normally enhances memory, it ends up further eroding your attention to that one killer frill. “Emotionally arousing material calls your attention to a central object,” Dr. Schacter said, “but it can make it difficult to remember peripheral details.” via NYTimes.
This may be a special case of something over which I have more generally puzzled — what is the difference between those raconteurs, who always seem to have a moving story or stories (funny or dreadful) to tell on any occasion, and others who are at a loss for words in social settings. I’m not talking about people who are shy or painfully inhibited so much as those who seem to have the material and those who don’t.
Is there that much of a difference in the content of people’s lives? Is it something about how observant they are? Or, again, something about memory function? I am fascinated by storytelling (for instance, I love the Moth podcast) and have always been intrigued by advertisements about storytelling workshops promising to develop attendees’ skills.
To some extent, there is a cultural influence as well. I suspect storytelling is a dying art, along with letter-writing and reading fiction, a way we used to interact and divert ourselves which is progressively and inexorably being supplanted in modernity. But there are still enough good conversationalists around to astound me.
Of course, other people may find it far easier than I do to talk about what happened to them during their workday, one of the important sources of our stories. As a therapist, I am privileged to hear in detail about a broad range of the lives of others, but all of what I am told, I am told in confidence. Perhaps I gravitated toward psychotherapy because I sensed myself to be a far better listener to the stories of others than a storyteller myself. In fact, some construe the work of psychotherapy as training our clients to become better storytellers about their own lives: imposing coherence and pattern on their recollections and observations about themselves, making better sense of their lives, consequently better appreciating and tolerating the humor and the pathos in them, and developing an empathic connection to the life stories of those around them.
- Absentmindedness (oup.com)
This article in Scientific American by David Dobbs reports on the growing concern that “the concept of post-traumatic stress disorder is itself disordered”. The writer is critical of a culture which “seemed reflexively to view bad memories, nightmares and any other sign of distress as an indicator of PTSD.” To critics like this, the overwhelming incidence of PTSD diagnoses in veterans returning from Iraq is not a reflection of the brutal meaningless horror to which many of the combatants were exposed but of a sissy culture that can no longer suck it up. As usual, the veil of ‘objective’ ‘scientific’ evidence is used to cloak ideological biases.
FmH readers know that I too am critical of the frequency of PTSD diagnosis in modern mental health practice, but I think that is not a problem with the theoretical construct of PTSD but its slapdash application. With respect to domestic PTSD, the problem is one of overzealous and naive clinicians ignoring the diagnostic criteria and, more important, misunderstanding the clinical significance and intent of the diagnosis, labelling with PTSD far too many people who have had anything at all upsetting or distressing happen to them. Essentially, PTSD is meant to refer to the long-term consequences of an experience or experiences that are outside the bounds of what the human psyche can endure. Both emotionally and neurobiologically, the capacity of the organism is overwhelmed and the fact of the trauma assumes an overarching and inescapable central role in future information processing, functioning and sense of self. Experiences that occur when the body is flooded with unimaginably high levels of stress hormones, when the nervous system is in the throes of the fight-or-flight response, and when the normal processes for making sense of what we are going through utterly break down are encoded differently in the body and mind, with immeasurable effects. Only someone who did not grasp this at all could misrecognize simple anxiety, depression or adjustment difficulties as PTSD. But it happens all the time, especially in the treatment of depressed women, largely because of do-gooder clinicians’ desires to be politically correct and not be seen as insensitive to their clients’ suffering. Unfortunately, what it mostly does is train these clients to remain lifelong inhabitants of a self-fulfilling inescapable victim role.
The concern, on the other hand, with soldiers returning from the wars in central Asia, is the opposite. All evidence is that PTSD is being underdiagnosed, because of systematic biases within the government and the military to deny the scope of the problem. Articles such as this, and the research that it depicts, should be seen as nothing but a conservative backlash, an effort to blame the victims. If coping with the scope of PTSD is a problem, deny the reality of PTSD. Certainly considerable research suggests that a proportion of soldiers returning from the battle front in bad shape will have shown their resilience, will no longer show a high magnitude of emotional disturbance, and will not warrant a diagnosis of PTSD if reassessed months or years later. Research also suggests that early intervention using a trauma paradigm may do more harm than good, perpetuating the vulnerability of the patient. And most Defense Dept. research on the effects of combat trauma is intended to figure out how to block the stress reaction so that a soldier can remain functional and return to a combat role as soon as possible. But it remains the case that the human nervous system did not evolve to endure the horrors of modern war, and that the indefensibility and anomie of this war in particular, based as it has been entirely on lies, amplifies the intolerability and makes it far less likely that a veteran can find sustaining meaning in the suffering they endured. This will inevitably turn into higher rates of PTSD than among veterans of other wars.
To deny the scale of PTSD in our returning veterans is to be an unquestioning apologist for the untrammelled American imperialist projection of power in lawless aggression. As Dobbs describes it, the PTSD deniers construe us as having a cultural obsession with PTSD which embodies “a prolonged failure to contextualize and accept our own collective aggression.” What horse manure. Our cultural neurosis, rather, lies in the unquestioning acceptance of suggestions like Dobbs’ that we should mindlessly embrace such aggression as natural. This was the neurosis that made it possible to elect Bush and his handlers to enact an administration that set about violating every supposed principle of our democracy and our humanity. I know we are not supposed to draw this particular analogy, but this brand of PTSD denial strikes me as akin to nothing so much as Holocaust denial. Via Scientific American.
Teen Conflicts Linked To Potential Risk For Adult Cardiovascular Disease:
‘…[I]n a study of otherwise healthy, normal teens who self-reported various negative interpersonal interactions, researchers found that a greater frequency of such stress was associated with higher levels of an inflammatory marker called C-reactive protein, or CRP. CRP has been identified as an indicator for the later development of cardiovascular disease (CVD).
“Although most research on stress and inflammation has focused upon adulthood, these results show that such links can occur as early as the teenage years, even among a healthy sample of young men and women,” [an investigator] said. “That suggests that alterations in the biological substrates that initiate CVD begin before adulthood.” ‘ via Science Daily.
“It can be, but it can be good for you, too—a fact scientists tend to ignore and regular folks don’t appreciate.” via Newsweek.
Predictable that we will see a spate of articles like this as the economy continues to melt down.
- Are You In Control? (indenialhealth.com)
- Is that glass half-empty, or half-full? Be careful – your answer may result in telomere shortening! (ouroboros.wordpress.com)
“For a small band of shrinks, intervening in catastrophic situations is an everyday event. But their experience at the edge has deep consequences for us all: It is altering our understanding of the true nature of human nature.” via Psychology Today.
But is this really such a good idea? A growing number of cautionary voices from the world of mental health research are saying it isn’t. They fear that the increasing tendency to treat normal sadness as if it were a disease is playing fast and loose with a crucial part of our biology. Sadness, they argue, serves an evolutionary purpose – and if we lose it, we lose out.
“When you find something this deeply in us biologically, you presume that it was selected because it had some advantage, otherwise we wouldn’t have been burdened with it,” says Jerome Wakefield, a clinical social worker at New York University and co-author of The Loss of Sadness: How psychiatry transformed normal sorrow into depressive disorder (with Allan Horwitz, Oxford University Press, 2007). “We’re fooling around with part of our biological make-up.”
Perhaps, then, it is time to embrace our miserable side. Yet many psychiatrists insist not. Sadness has a nasty habit of turning into depression, they warn. Even when people are sad for good reason, they should be allowed to take drugs to make themselves feel better if that’s what they want.
So who is right? Is sadness something we can live without or is it a crucial part of the human condition?
…there are lots of ideas about why our propensity to feel sad might have evolved. It may be a self-protection strategy, as it seems to be among other primates that show signs of sadness. …it helps us learn from our mistakes. …even full-blown depression may save us from the effects of long-term stress. Without taking time out to reflect, he says, “you might stay in a state of chronic stress until you’re exhausted or dead”. …By acting sad, we tell other community members that we need support….Then there is the notion that creativity is connected to dark moods. …There is also evidence that too much happiness can be bad for your career…” (More)
via New Scientist.
Posting articles on this theme is, readers may have noticed, a recurrent event here on FmH. I began to be introduced to this notion, that depression might serve a useful purpose and that we had to rethink our knee-jerk readiness to vanquish it (and normal sadness as well, which is difficult to disentangle from pathological depression) whenever we encountered it, early in my career. I think it has fundamentally informed my skepticism about the way we organize and administer psychiatric services in this society. In addition, there are concerns that too readily resorting to antidepressant therapy may reinforce future propensity for depressive reactions and need for medication (which I’m sure will please the pharmaceutical industry to hear). I have always said that getting people off of medications, or refraining from prescribing them, is as important a function of a psychopharmacologist as prescribing astutely.
- When Sadness Is a Good Thing (Time)
- Is There Really an Epidemic of Depression? (Scientific American)
- Are shy people mentally ill?: the DSM and SAD (Shelved) [NB: This article uses “SAD” to refer to “social anxiety disorder”, whereas usually it denotes “seasonal affective disorder”. — FmH]
- Why There’s No Epidemic of Depression (Psych Central)
Great mind hacks, including the Ganzfeld procedure and Purkinje lights, via the Boston Globe.
“…[G]enuine excuse artisans — and there are millions of them — don’t wait until after choking to practice their craft. They hobble themselves, in earnest, before pursuing a goal or delivering a performance. Their excuses come preattached: I never went to class. I was hung over at the interview. I had no idea what the college application required.
“This is real self-sabotage, like drinking heavily before a test, skipping practice or using really poor equipment,” said Edward R. Hirt, a psychologist at Indiana University. “Some people do this a lot, and often it’s not clear whether they’re entirely conscious of doing it — or of its costs.”
Psychologists have studied this sort of behavior since at least 1978, when Steven Berglas and Edward E. Jones used the phrase “self-handicapping” to describe students in a study who chose to take a drug that they were told would inhibit their performance on an exam (the drug was actually inert).
The urge goes well beyond a mere lowering of expectations, and it has more to do with protecting self-image than with psychological conflicts rooted in early development, in the Freudian sense. Recent research has helped clarify not just who is prone to self-handicapping but also its consequences — and its possible benefits.”
via New York Times.
Recent research shows that our moods are far more strongly influenced by those around us than we tend to think. Not only that, we are also beholden to the moods of friends of friends, and of friends of friends of friends – people three degrees of separation away from us who we have never met, but whose disposition can pass through our social network like a virus.
Indeed, it is becoming clear that a whole range of phenomena are transmitted through networks of friends in ways that are not entirely understood: happiness and depression, obesity, drinking and smoking habits, ill-health, the inclination to turn out and vote in elections, a taste for certain music or food, a preference for online privacy, even the tendency to attempt or think about suicide. They ripple through networks “like pebbles thrown into a pond”, says Nicholas Christakis, a medical sociologist at Harvard Medical School in Boston, who has pioneered much of the new work.
via New Scientist.
“Some researchers say johns seek intimacy on demand; others believe these men typically want to use and dominate women…”
via Scientific American.
“The dead stay with us, that much is clear. They remain in our hearts and minds, of course, but for many people they also linger in our senses—as sights, sounds, smells, touches or presences. Grief hallucinations are a normal reaction to bereavement but are rarely discussed, because people fear they might be considered insane or mentally destabilised by their loss. As a society we tend to associate hallucinations with things like drugs and mental illness, but we now know that hallucinations are common in sober healthy people and that they are more likely during times of stress.”
“Professor says that nobody should be fooled by ‘dangerous’ myths about boosting creativity”
‘…[P]eople who actively seek lifestyle changes may have a more developed connection between two specific brain areas: the hippocampus, a site for storing and retrieving new and old memories, and the ventral striatum, a reward system which is responsible for those carpe diem moments, said researcher Dr. Bernd Weber of the Life & Brain Center at the University of Bonn in Germany. Turns out, if the hippocampus identifies an experience as new, it then relays signals to the striatum to release neurotransmitters which lead to positive feelings.
‘The strength of the connection is positively correlated to novelty seek[ers] …’
via Yahoo! News
Distinguished developmental psychologist Jerome Kagan argues that the current spate of childhood mental health diagnoses such as ADHD and bipolar disorder does not represent biological diseases but rather convenient explanations that get us off the hook by covering up social problems. He discusses social trends that may account for childhood behavioral difficulties.
I agree that childhood disorders are overdiagnosed and that, in general, we are in an era of overmedicalization of behavioral problems for a variety of reasons, not the least of them being the influence of Big Pharma. I hope no one thinks any longer that psychiatric diagnoses are immutable gospel truths. From revision to revision, the nomenclature changes. The boundaries of what is considered psychopathology expand and contract (in this era, mostly expand), and the internal pigeonholes are ever-changing. Our research practices, supposed to contribute to “evidence-based” medical reasoning, compound the errors: drug companies have a subtle and not-so-subtle vested interest in the results, they fund much of the research, and there is an inherent bias against the publication of negative or disconfirmatory results.
On the other hand, let us not throw the baby out with the bathwater. We should be long past the need to debate nature vs. nurture in mental illness, social context vs. biology. There are of course contributions of both, and Dr Kagan’s argument should not be seen as dismissing the biological bases of behavioral problems whole hog. I do agree with him, vehemently, though, that overdiagnosis and overattribution are rife, and that the major consequences are obscene: the pathologization of our children and youth, and the foisting of enormous volumes of medication upon them. A good psychiatrist’s role should be as much to take patients off medication as to get them on it.
- What Determines the Diagnosis of Adhd?
- ADHD drugs should be last resort: new guidelines
- Magazine Preview: The Bipolar Puzzle
- 3 Ways to Be Wise About Psychiatric Drugs for Kids
- Treatment And Diagnosis Of Mental Disorders In Children
- Tranquillisers putting lives at risk
- Coping With Bipolar Disorder in Children
- When Are Therapeutic Interventions Recommended?
“Manhattan is the capital of people living by themselves. But are New Yorkers lonelier? Far from it, say a new breed of loneliness researchers, who argue that urban alienation is largely a myth.”
The author argues that loneliness is relative. Just as widows do better in a housing development with a lot of widows, people living alone do better in New York, which has the largest proportion of single-person households of any major city (around 1:2). And suicide rates, which since Emile Durkheim‘s classic sociological study Suicide have been tied to loneliness and isolation, run lower in New York than in other urban areas.
“The key to a con is not that you trust the conman, but that he shows he trusts you. Conmen ply their trade by appearing fragile or needing help, by seeming vulnerable. Because of THOMAS, the human brain makes us feel good when we help others–this is the basis for attachment to family and friends and cooperation with strangers. “I need your help” is a potent stimulus for action.”
via Psychology Today
Health Professionals Fear Web Sites That Support Theories on Mind Control (New York Times ). The internet may have fundamentally changed the experience of those who believe they are stalked or persecuted. Sites filled with stories from people calling themselves victims of “mind control” or “gang stalking” offer support and validation, in contrast to the isolation and pejoration with which they were treated in the pre-internet era. Many mental health professionals are alarmed that such sites encourage delusional thinking. The growth of such a community of sufferers with shared beliefs presents a fundamental challenge to the definition of delusions, as beliefs that are at odds with those shared by one’s culture or subculture.
The interest of law enforcement and government agencies in covert surveillance, mind-control and chemical interrogation techniques (cf. MK-ULTRA) is enough evidence to encourage such beliefs, and their dismissal by health professionals and others is seen as evidence of a cover-up of the truth.
However, others who see the isolation and quiet torment in which people with psychotic disorders live feel that the growth of a supportive community could be a good thing. In my own work with patients who believe they are subject to mind control or gang stalking, I do not find that confronting and contradicting their beliefs is effective. In fact, I am sensitive to the ways in which doing so perpetuates the violence and persecution done to them by other powerful individuals in their lives. Treatment, the aim of which after all is to relieve suffering, cannot be conducted in an intellectually dishonest way, in which one acts out a charade of sharing the patient’s beliefs. But treatment must be experienced as a safe place in which to have one’s thoughts, whether agreed with or not.

Contrary to the opinion of one psychiatrist interviewed for the article, who suggests that without these internet sites reinforcing it the thinking would fade away for lack of validation, the essence of delusional thinking is that it is logically self-validating. The sufferer has constructed an airtight explanation for the disturbing experiences and perceptions they have, an explanation which is not falsifiable. Its assertions are self-fulfilling. That is the logic and, if you will, the beauty of delusional thinking. In my experience, such thinking is not malleable and precisely does not fade away. To attempt to confront it is to invalidate the person in front of you, doing profound existential violence to an already quite vulnerable person. This is the essence of what I have always taught my students as a core approach to a psychotic individual.
This has been known for a long time in psychological circles, and it is merely the self-anointed but misguided role of mental health providers as arbiters of thought and vanquishers of mental illness that prevents our acceptance of immutable delusional thinking. My uncle, the psychologist Milton Rokeach, wrote in his 1964 book The Three Christs of Ypsilanti of an experiment in which he brought together three psychiatric patients each of whom believed he was Christ… sort of meeting irresistible force with immovable object. He hoped that the coexistence of logically incompatible beliefs would correct the delusions. He later wrote that he regretted the experiment because, as it turned out, all it had done was vastly amplify the distress and confusion of the three subjects.
In addition to my uncle, several of my mentors and teachers were influential in grappling with how to situate themselves properly with respect to the challenging beliefs of their patients, if they were neither to fraudulently say they agreed nor to contradict by brute force. R.D. Laing took a radical stance of refusing to make distinctions between ‘patients’ and ‘treaters’ as arbiters of the truth. This is an incredibly useful position to take, although I think Laing went too far in that the relationship is inherently asymmetrical; the patient is the one who comes to us with suffering, seeking guidance and succor. Leston Havens devoted himself to the technical craft of finding language and therapeutic stance that would allow the therapist to situate him- or herself as an ally, rather than an opponent, of people so difficult to ally with. John Mack’s work with alien abductees exemplified finding a way to be helpful with a subset of those sufferers whose beliefs are so at odds with prevailing notions.
It has been an area of my own fascination, teaching and research to watch how the lay public’s knowledge and beliefs about mental health issues are spread in the popular media, word of mouth and, more recently, the internet. These means of communication are not a cause of mental illness, but clearly important variables in shaping it. I wonder, WWLD (what would Laing do?) with the internet?
“New research suggests that the type of television you watched as a child has a profound effect on the colour of your dreams.” (Telegraph.UK)
Interestingly, dreams were in color before television.
“The CLL conducts experiments via the Web. You may participate by clicking here, or see results from previous experiments by clicking here. The experiments are short — some take as little as 2-3 minutes to complete. All are anonymous.”