‘Music is often labelled a “universal language,” and according to the philosopher Arthur Schopenhauer, there is a good reason for that….’
— Tim Brinkhof via Big Think
‘When asked to tell their life stories, people with schizophrenia tend to tell unusual ones. First, the basic chronology of the life story is shifted. Most people experience a ‘reminiscence bump’ in early adulthood, with many personally significant and relatively well-remembered events occurring between ages 15 and 30 (and especially between ages 20 and 24).
For instance, we might form memories of graduating college, getting a first job, or starting or ending significant romantic relationships. These events become centrepieces of our life stories, defining who we are for decades to come.
However, schizophrenia causes profound disruptions during these same years. People diagnosed with schizophrenia often become unable to care for themselves, lose valued roles and relationships, and undergo treatment. These experiences seem to curtail the reminiscence bump: rates of personally significant memories might steadily increase in the teenage years, then drop sharply following a diagnosis of schizophrenia.
People living with schizophrenia also tend to include unusual kinds of experiences in their life stories, focusing on psychotic episodes, hospitalisations and traumatic events. Their life stories might even include vivid, emotionally intense experiences of psychotic symptoms themselves – for instance, vivid memories of being spied on, conspired against or chosen by God to save the world.
In short, people with schizophrenia tell unusual life stories about unusual kinds of personal experiences. But why do these stories matter? How might they impact mental health and wellbeing? And how might they change through treatment and recovery?…’
— via Psyche Ideas
‘There’s a new plan to find extraterrestrial civilisations by the way they live. But if we can see them, can they see us?…’
— via Aeon Essays
‘…Studies from around the world have confirmed that jabs are safe and provide good protection against severe forms of the virus. Now a recent report from the Centers for Disease Control and Prevention (CDC) in America has produced a novel, and even mysterious, reason to be glad for a covid-19 vaccination. The CDC data show that people vaccinated with the Pfizer or Moderna covid-19 jabs are one-third as likely to die of other causes too.
The result is bewildering, all the more so for its scale. The CDC’s study started with the health records of more than 11m Americans. Researchers followed these people from December 2020 to July 2021, recording any deaths and their causes. During this period around 6m people in the cohort received jabs for covid-19. …’
— via The Economist
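The headline claim is a mortality rate ratio. Here is a minimal sketch of that arithmetic with invented cohort numbers (the CDC's actual analysis used the ~11m records mentioned above and adjusted for confounders, none of which this toy attempts):

```python
# Hypothetical numbers, purely to illustrate how a "one-third as likely"
# figure is computed; these are NOT the CDC's data.
vacc_deaths, vacc_person_years = 3_500, 4_000_000
unvacc_deaths, unvacc_person_years = 21_000, 8_000_000

vacc_rate = vacc_deaths / vacc_person_years        # non-covid deaths per person-year
unvacc_rate = unvacc_deaths / unvacc_person_years

print(f"Mortality rate ratio: {vacc_rate / unvacc_rate:.2f}")  # -> 0.33
```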
A reprise of my traditional Hallowe’en post of past years:
It is that time of year again. What has become a time of disinhibited hijinx and mayhem, and a growing marketing bonanza for the kitsch-manufacturers and -importers, has primeval origins as the Celtic New Year’s Eve, Samhain (pronounced “sow-en”). The harvest is over, summer ends and winter begins, the Old God dies and returns to the Land of the Dead to await his rebirth at Yule, and the land is cast into darkness. The veil separating the worlds of the living and the dead becomes frayed and thin, and dispossessed dead mingle with the living, perhaps seeking a body to possess for the next year as their only chance to remain connected with the living, who hope to scare them away with ghoulish costumes and behavior, escape their menace by masquerading as one of them, or placate them with offerings of food, in hopes that they will go away before the new year comes. For those prepared, a journey to the other side could be made at this time.
With Christianity, perhaps because with calendar reform it was no longer the last day of the year, All Hallows’ Eve became decathected, a day for innocent masquerading and fun, taking its name Hallowe’en as a contraction and corruption of All Hallows’ Eve.
All Saints’ Day may have originated in its modern form with the 8th century Pope Gregory III. Hallowe’en customs reputedly came to the New World with the Irish immigrants of the 1840’s. The prominence of trick-or-treating has a slightly different origin, however.
The custom of trick-or-treating is thought to have originated not with the Irish Celts, but with a ninth-century European custom called souling. On November 2, All Souls Day, early Christians would walk from village to village begging for “soul cakes,” made out of square pieces of bread with currants. The more soul cakes the beggars would receive, the more prayers they would promise to say on behalf of the dead relatives of the donors. At the time, it was believed that the dead remained in limbo for a time after death, and that prayer, even by strangers, could expedite a soul’s passage to heaven.
Jack-o’-lanterns were reportedly originally turnips; the Irish began using pumpkins after they immigrated to North America, given how plentiful pumpkins were here. The jack-o’-lantern custom probably comes from Irish folklore. As the tale is told, a man named Jack, notorious as a drunkard and trickster, tricked Satan into climbing a tree. Jack then carved an image of a cross in the tree’s trunk, trapping the devil up the tree, and struck a deal: he would let the devil down only if he promised never to tempt Jack again.
According to the folk tale, after Jack died, he was denied entrance to Heaven because of his evil ways, but he was also denied access to Hell because he had tricked the devil. Instead, the devil gave him a single ember to light his way through the frigid darkness. The ember was placed inside a hollowed-out turnip to keep it glowing longer.
Nowadays, a reported 99% of cultivated pumpkin sales in the US go for jack-o’-lanterns.
Folk traditions that were in the past associated with All Hallows’ Eve took much of their power, as with the New Year’s customs about which I write here every Dec. 31st, from the magic of boundary states, transition, and liminality.
The idea behind ducking, dooking or bobbing for apples seems to have been that snatching a bite from the apple enables the person to grasp good fortune. Samhain is a time for getting rid of weakness, as pagans once slaughtered weak animals that were unlikely to survive the winter. A common ritual calls for writing down weaknesses on a piece of paper or parchment, and tossing it into the fire. There used to be a custom of placing a stone in the hot ashes of the bonfire. If in the morning a person found that the stone had been removed or had cracked, it was a sign of bad fortune. Nuts have been used for divination: whether they burned quietly or exploded indicated good or bad luck. Peeling an apple and throwing the peel over one’s shoulder was supposed to reveal the initial of one’s future spouse. One way of looking for omens of death was for people to visit churchyards.
The Witches’ Sabbath aspect of Hallowe’en seems to result from Germanic influence and fusion with the notion of Walpurgisnacht. (You may be familiar with the magnificent musical evocation of this, Mussorgsky’s Night on Bald Mountain.)
Although probably not yet in a position to shape mainstream American Hallowe’en traditions, Mexican Día de los Muertos observances have started to contribute some delightful and whimsical iconography to our encounter with the eerie and unearthly as well. As this article in The Smithsonian reviews, ‘In the United States, Halloween is mostly about candy, but elsewhere in the world celebrations honoring the departed have a spiritual meaning…’
Reportedly, more than 80% of American families decorate their homes, at least minimally, for Hallowe’en. What was the holiday like forty or fifty years ago in the U.S. when, bastardized as it has now become with respect to its pagan origins, it retained a much more traditional flair? Before the era of the pay-per-view ‘spooky-world’-type haunted attractions and its Martha Stewart yuppification with, as this irreverent Salon article from several years ago [via walker] put it, monogrammed jack-o’-lanterns and the like? One issue may be that, as NPR observed,
‘“Adults have hijacked Halloween… Two in three adults feel Halloween is a holiday for them and not just kids,” Forbes opined in 2012, citing a public relations survey. True that when the holiday was imported from Celtic nations in the mid-19th century — along with a wave of immigrants fleeing Ireland’s potato famine — it was essentially a younger persons’ game. But a little research reveals that adults have long enjoyed Halloween — right alongside young spooks and spirits.’
Is that necessarily a bad thing? A 1984 essay by Richard Seltzer, frequently referenced in other sources, entitled “Why Bother to Save Hallowe’en?”, argues as I do that reverence for Hallowe’en is good for the soul, young or old.
“Maybe at one time Hallowe’en helped exorcise fears of death and ghosts and goblins by making fun of them. Maybe, too, in a time of rigidly prescribed social behavior, Hallowe’en was the occasion for socially condoned mischief — a time for misrule and letting loose. Although such elements still remain, the emphasis has shifted and the importance of the day and its rituals has actually grown.…(D)on’t just abandon a tradition that you yourself loved as a child, that your own children look forward to months in advance, and that helps preserve our sense of fellowship and community with our neighbors in the midst of all this madness.”
That would be anathema to certain segments of society, however. Hallowe’en certainly inspires a backlash by fundamentalists who consider it a blasphemous abomination. ‘Amateur scholar’ Isaac Bonewits gives an academic accounting of the Hallowe’en errors and lies he feels contribute to its being reviled. Some of the panic over Hallowe’en is akin to the hysteria, fortunately now debunked, over the supposed epidemic of ‘ritual Satanic abuse’ that swept the Western world in the ’90’s.
The horror film has become inextricably linked to Hallowe’en tradition, although the holiday itself did not figure in the movies until John Carpenter single-handedly took the slasher genre by storm. Googling “scariest films”, you will, grimly, reap a mother lode of opinions about how to pierce the veil to journey to the netherworld and reconnect with that magical, eerie creepiness in the dark (if not the over-the-top blood and gore that has largely replaced the subtlety of earlier horror films).
The Carfax Abbey Horror Films and Movies Database includes best-ever-horror-films lists from Entertainment Weekly, Mr. Showbiz and Hollywood.com. I’ve seen most of these; some of their choices are not that scary, some are just plain silly, and they give extremely short shrift to my real favorites, the evocative classics of the ’30’s and ’40’s when most eeriness was allusive and not explicit. And here’s what claims to be a compilation of links to the darkest and most gruesome sites on the web. “Hours and hours of fun for morbidity lovers.”
Boing Boing pays homage to a morbid masterpiece of wretched existential horror, two of the tensest, scariest hours of my life, relived every time I watch it:
‘…The Thing starts. It had been 9 years since The Exorcist scared the living shit out of audiences in New York and sent people fleeing into the street. Really … up the aisle and out the door at full gallop. You would think that people had calmed down a bit since then. No…’
Meanwhile, what could be creepier in the movies than the phenomenon of evil children? Gawker knows what shadows lurk in the hearts of the cinematic young:
‘In celebration of Halloween, we took a shallow dive into the horror subgenre of evil-child horror movies. Weird-kid cinema stretches back at least to 1956’s The Bad Seed, and has experienced a resurgence recently via movies like The Babadook, Goodnight Mommy, and Cooties. You could look at this trend as a natural extension of the focus on domesticity seen in horror via the wave of haunted-house movies that 2009’s Paranormal Activity helped usher in. Or maybe we’re just wizening up as a culture and realizing that children are evil and that film is a great way to warn people of this truth. Happy Halloween. Hope you don’t get killed by trick-or-treaters.’
In any case: trick or treat! …And may your Hallowe’en soothe your soul.
‘For the vast majority of our species’ history, those were the two principal categories of human relations: kin and gods. Those we know who know us, grounded in mutual social interaction, and those we know who don’t know us, grounded in our imaginative powers.
But now consider a third category: people we don’t know and who somehow know us. They pop up in mentions, comments, and replies; on subreddits, message boards, or dating apps. Most times, it doesn’t even seem noteworthy: you look down at your phone and there’s a notification that someone you don’t know has liked a post. You might feel a little squirt of endorphin in the brain, an extremely faint sense of achievement. Yet each instance of it represents something new as a common human experience, for their attention renders us tiny gods. The Era of Mass Fame is upon us…
With the possibility of this level of exposure so proximate, it’s not surprising that poll after poll over the past decade indicates that fame is increasingly a prime objective of people twenty-five and younger. Fame itself, in the older, more enduring sense of the term, is still elusive, but the possibility of a brush with it functions as a kind of pyramid scheme.
This, perhaps, is the most obviously pernicious part of the expansion of celebrity: ever since there have been famous people, there have been people driven mad by fame. In the modern era, it’s a cliché: the rock star, comedian, or starlet who succumbs to addiction, alienation, depression, and self-destruction under the glare of the spotlight. Being known by strangers, and, even more dangerously, seeking their approval, is an existential trap. And right now, the condition of contemporary life is to shepherd entire generations into this spiritual quicksand…
I’ve come to believe that, in the Internet age, the psychologically destabilizing experience of fame is coming for everyone. Everyone is losing their minds online because the combination of mass fame and mass surveillance increasingly channels our most basic impulses—toward loving and being loved, caring for and being cared for, getting the people we know to laugh at our jokes—into the project of impressing strangers, a project that cannot, by definition, sate our desires but feels close enough to real human connection that we cannot but pursue it in ever more compulsive ways…’
— Chris Hayes, host of “All In with Chris Hayes,” on MSNBC, and the podcast “Why Is This Happening?”, via The New Yorker
This ambitious and earthshaking new history of humanity, by the anthropologist David Graeber and the archaeologist David Wengrow, finds that established narratives of modernity have been based on constricting and false premises which make it impossible to believe in our inherent cooperativeness. The errors arise from the gospel that
‘…for most of human history our ancestors lived an egalitarian and leisure-filled life in small bands of hunter-gatherers. Then, as [Jared] Diamond put it, we made the “worst mistake in human history”, which was to increase population numbers through agricultural production. This, so the story goes, led to hierarchies, subordination, wars, disease, famines and just about every other social ill – thus did we plunge from Rousseau’s heaven into Hobbes’s hell.’
— Andrew Anthony via The Guardian
Synthesizing a wealth of recent archeological data, they replace the idea that humanity was forced along through preordained evolutionary stages, in which humans are passive objects of material forces, with a picture of prehistoric communities shaping their own political organization and social realities. The upshot is that it is not inevitable that we are stuck in the modern system of hierarchies and conspicuous inequalities of wealth and consumption. In fact, I might add, the accepted narrative comes to appear as dictated by the prevailing ideology in a self-serving justification of the status quo.
A more detailed account of the thesis can be found in this piece in The Atlantic by William Deresiewicz, “Human History Gets A Rewrite.” Graeber and Wengrow show that hunter-gatherer societies were more complex and varied than we knew and that these people made deliberate and collective decisions about how to organize themselves, in other words “practicing politics.”
When you think about it for a moment, since these were essentially human beings like ourselves, how could we have imagined it would be otherwise? From my own anthropological studies before I became a psychiatrist, I do know, however, that they are not the first to dispute the commonplace notion of simple savages unself-consciously dwelling in “a kind of eternal present or cyclical dreamtime, waiting for the Western hand to wake them up and fling them into history.” A counterpoint to that can be found as far back as the work of Claude Lévi-Strauss, whose eye-opening The Savage Mind (1962) established how prehistoric knowledge organization was based on a sophisticated “scientific” approach.
Carrying their narrative forward to the development of city-states, Graeber and Wengrow show that they were not an inevitable consequence of agriculture, as we thought, but in many instances preceded it. A related assumption they turn on their head is that large populations inherently need layers of bureaucracy to govern them and that scale leads inevitably to hierarchy and political inequality. They describe data showing that many early cities with populations of thousands show no signs of centralized administration. If anything, they claim, aristocracy emerged in the smaller settlements of warrior societies which were in tension with the agricultural states. So ‘the state’ is not the inevitable apex form of human social organization but one of
‘…a shifting combination of, as they enumerate them, the three elementary forms of domination: control of violence (sovereignty), control of information (bureaucracy), and personal charisma (manifested, for example, in electoral politics). Some states have displayed just two, some only one—which means the union of all three, as in the modern state, is not inevitable (and may indeed, with the rise of planetary bureaucracies like the World Trade Organization, be already decomposing). More to the point, the state itself may not be inevitable. For most of the past 5,000 years, the authors write, kingdoms and empires were “exceptional islands of political hierarchy, surrounded by much larger territories whose inhabitants … systematically avoided fixed, overarching systems of authority.” …’
They are suggesting that civilization could be organized around mutual aid and cooperation without the loss of basic freedoms as seen in modern bureaucratic capitalism enforced by state violence. No surprise that Graeber is a committed anarchist. But —
‘…The Dawn of Everything is not a brief for anarchism, though anarchist values—antiauthoritarianism, participatory democracy, small-c communism—are everywhere implicit in it. Above all, it is a brief for possibility, which was, for Graeber, perhaps the highest value of all…
“How did we get stuck?” the authors ask—stuck, that is, in a world of “war, greed, exploitation [and] systematic indifference to others’ suffering”? It’s a pretty good question. “If something did go terribly wrong in human history,” they write, “then perhaps it began to go wrong precisely when people started losing that freedom to imagine and enact other forms of social existence.” It isn’t clear to me how many possibilities are left us now, in a world of polities whose populations number in the tens or hundreds of millions. But stuck we certainly are…’
‘Once upon a time, Republicans portrayed themselves as the party of small government and family values. Recently, though, GOP leaders have been cobbling together a new coalition, welcoming insurrectionists, white-nationalist tiki-torchers and people who think Bill Gates is trying to microchip them.
The latest recruit to the Big Tent? Tax cheats.
Here’s the backstory. Each year, about $600 billion in taxes legally owed are not paid. For scale, that’s roughly equal to all federal income taxes paid by the lowest-earning 90 percent of taxpayers, according to Treasury Department data.
These unpaid taxes — often called the “tax gap” — are predominantly owed by wealthy individuals. The richest 1 percent alone duck an estimated $163 billion in income taxes each year.
To be clear, rank-and-file wage-earners are not necessarily more honest or patriotic. It’s just much harder for them to shortchange Uncle Sam….’
— Catherine Rampell via Washington Post
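A back-of-the-envelope check of the two figures quoted above (both numbers come from the column; the division is the only thing added here):

```python
total_tax_gap = 600e9   # ~$600 billion in legally owed taxes unpaid each year
top_1pct_gap = 163e9    # ~$163 billion ducked by the richest 1% alone

print(f"Richest 1% account for ~{top_1pct_gap / total_tax_gap:.0%} of the tax gap")
# -> Richest 1% account for ~27% of the tax gap
```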
‘True, we’re hearing a lot about Covid-19 and QAnon-related conspiracies. But just because they are more visible does not mean that belief in them has gone up.
I’ve been doing work with Joseph Uscinski of the University of Miami—the leading specialist studying conspiracy theories—and we’ve carried out a number of studies, assessing whether Covid-19 conspiracy theories have proliferated over the course of the pandemic and whether we’ve seen a general increase in belief in conspiracy theories in the last fifty years. Our paper is currently under review, but our findings may surprise you: Belief in conspiracy theories has, if anything, decreased over the pandemic….’
— Hugo Drochon via Persuasion
‘As millions around the world have settled in to working from home, it’s hard to imagine the office tower ever being a viable proposition again. Planning applications for tall buildings in London plummeted by a third last year, while New London Architecture’s 2021 tall buildings survey found that work started on just 24 buildings of 20 storeys or more – down by almost half from 44 in 2019. Has the age of piling people into great glass shafts, of cities competing for ever higher spires, finally come to an end?…’
— via The Guardian
‘…over the past two decades, a small group of theorists mostly based in Oxford have been busy working out the details of a new moral worldview called longtermism, which emphasizes how our actions affect the very long-term future of the universe – thousands, millions, billions, and even trillions of years from now. This has roots in the work of Nick Bostrom, who founded the grandiosely named Future of Humanity Institute (FHI) in 2005, and Nick Beckstead, a research associate at FHI and a programme officer at Open Philanthropy. …
…humanity has a ‘potential’ of its own, one that transcends the potentials of each individual person, and failing to realise this potential would be extremely bad – indeed, as we will see, a moral catastrophe of literally cosmic proportions. This is the central dogma of longtermism: nothing matters more, ethically speaking, than fulfilling our potential as a species of ‘Earth-originating intelligent life’. It matters so much that longtermists have even coined the scary-sounding term ‘existential risk’ for any possibility of our potential being destroyed, and ‘existential catastrophe’ for any event that actually destroys this potential.
Why do I think this ideology is so dangerous? The short answer is that elevating the fulfilment of humanity’s supposed potential above all else could nontrivially increase the probability that actual people – those alive today and in the near future – suffer extreme harms, even death. Consider that, as I noted elsewhere, the longtermist ideology inclines its adherents to take an insouciant attitude towards climate change. Why? Because even if climate change causes island nations to disappear, triggers mass migrations and kills millions of people, it probably isn’t going to compromise our longterm potential over the coming trillions of years. If one takes a cosmic view of the situation, even a climate catastrophe that cuts the human population by 75 per cent for the next two millennia will, in the grand scheme of things, be nothing more than a small blip – the equivalent of a 90-year-old man having stubbed his toe when he was two.
Bostrom’s argument is that ‘a non-existential disaster causing the breakdown of global civilisation is, from the perspective of humanity as a whole, a potentially recoverable setback.’ It might be ‘a giant massacre for man’, he adds, but so long as humanity bounces back to fulfil its potential, it will ultimately register as little more than ‘a small misstep for mankind’.
Elsewhere, he writes that the worst natural disasters and devastating atrocities in history become almost imperceptible trivialities when seen from this grand perspective. Referring to the two world wars, AIDS and the Chernobyl nuclear accident, he declares that ‘tragic as such events are to the people immediately affected, in the big picture of things … even the worst of these catastrophes are mere ripples on the surface of the great sea of life.’…’
— Phil Torres, philosopher at Leibniz University in Hanover, Germany, via Aeon Essays
‘For an infectious disease to be classed in the endemic phase, the rate of infections has to more or less stabilize across years (though occasional increases, say, in the winter, are expected).
“A disease is endemic if the reproductive number is stably at one. That means one infected person, on average, infects one other person,” explained Boston University epidemiologist Eleanor Murray.
“Right now, we are nowhere near that. Each person who’s infected is infecting more than one person.” That’s largely due to the hyper-contagious delta variant and the fact that most of the global population doesn’t yet have immunity — whether through vaccination or infection — so susceptibility is still high.
(For a while, there had been hope that the arrival of vaccines would mean we could reach herd immunity — that is, when enough of a population has gained immunity to confer protection to everyone. But those hopes have been dashed as we’ve failed to vaccinate enough people and more contagious variants have circulated widely.)
But getting the virus’s reproductive number down to one is just “the bare minimum” for earning the endemic classification, Murray said. There are other factors that come into play, too — and assessing these factors is a more subjective business…’
— via Vox
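Murray's criterion is easy to see in a toy generation-by-generation calculation. A minimal sketch, with illustrative numbers rather than anything from the article:

```python
# Each generation of infections is the previous one multiplied by the
# effective reproductive number R_eff; R_eff = 1 is the endemic threshold.
def case_generations(initial_cases, r_eff, n_generations):
    cases = [initial_cases]
    for _ in range(n_generations):
        cases.append(cases[-1] * r_eff)
    return cases

print(case_generations(1000, 1.0, 4))  # stable (endemic-like): 1000 each generation
print(case_generations(1000, 1.3, 4))  # epidemic growth: 1000, 1300, 1690, ...
print(case_generations(1000, 0.8, 4))  # declining outbreak: 1000, 800, 640, ...
```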
‘Thus far, the justices have not given much comfort to anti-vaxxers. That could change soon…’ — via Vox
‘A Harvard scientist has an interesting theory as to how our universe was formed: in a laboratory by a higher “class” of lifeform.
Avi Loeb, bestselling author and the former chair of Harvard’s astronomy department, penned an op-ed in Scientific American this week positing that the universe could have been formed in a lab by an “advanced technological civilization.” If true, he said the origin story would unify the religious idea of a creator with the secular idea of quantum gravity.
“Since our universe has a flat geometry with a zero net energy, an advanced civilization could have developed a technology that created a baby universe out of nothing through quantum tunneling,” Loeb wrote….’
— via futurism.com
‘A report from Financial Times‘ Demetri Sevastopulo and Kathrin Hille states that China has tested a nuclear-capable hypersonic glide vehicle that goes into space and traverses the globe in an orbital-like fashion before making its run through the atmosphere toward its target. There would be huge implications if such a system were to be operationalized, and according to this story, which says it talked to five officials confirming the test, the U.S. government was caught totally off-guard by it.
… The foundation of this Cold War-era concept is commonly referred to as a Fractional Orbital Bombardment System, or FOBS, but instead of carrying a traditional nuclear-armed reentry vehicle, this Chinese system would carry a hypersonic glide vehicle that would possess immense kinetic energy upon reentry. As such, it could make a very long maneuvering flight through the atmosphere at very high speeds to its target.
The FOBS concept has long been a concern because of its potential to bypass not just missile defenses, but even many early warning capabilities. Compared to a traditional intercontinental ballistic missile (ICBM), a FOBS can execute the same strikes but from highly unpredictable vectors. Range limitations also become a non-factor and the timing of an inbound strike is also far less predictable….’
— via The Drive
‘Back in 2016, Adam J. Calhoun wrote a fascinating Medium post in which he showed off something quite cool: What novels look like if you strip away the words, and show just the punctuation.
He’d written some Python code to do this, then processed several famous books. As Calhoun pointed out, it gives you a weird new form of literary x-ray vision.
This image [above]? On the left, it’s Calhoun’s analysis of Blood Meridian by Cormac McCarthy, compared to Absalom, Absalom! by William Faulkner, on the right…’
— Clive Thompson via Creators Hub, Medium
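Calhoun's own code isn't reproduced in the post, but the core trick fits in a few lines. A minimal sketch of the same idea (the function name and sample sentence are mine, not his):

```python
import string

def punctuation_skeleton(text):
    # Keep only punctuation characters; drop letters, digits and whitespace.
    return "".join(ch for ch in text if ch in string.punctuation)

sample = "He rode on. The night was cold, and clear; stars fell -- did they?"
print(punctuation_skeleton(sample))  # -> .,;--?
```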
‘The US Fish and Wildlife Service is finally recommending these creatures come off the endangered species list….’
— via Popular Science
‘Numerous theories exist around the origins of the SARS-CoV-2 virus, which causes COVID-19, but none has yet been proven.
Parts of the genome of the SARS-CoV-2 virus are so unusual that they have given rise to conspiracy theories that the virus must have been developed in a lab.
Researchers have now discovered, in bats living in caves in Laos, strains of viruses so similar to SARS-CoV-2 that they believe they could infect humans.
This discovery could prove the natural origins of the COVID-19 pandemic and that direct bat-to-human transmission of the virus is a possible cause of the pandemic….’
— via Medical News Today
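"So similar" is conventionally quantified as percent identity between aligned genome sequences. A minimal sketch of that calculation, on toy sequences rather than real ~30,000-base coronavirus genomes:

```python
def percent_identity(seq_a, seq_b):
    """Fraction of matching positions between two already-aligned sequences."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    return sum(a == b for a, b in zip(seq_a, seq_b)) / len(seq_a)

# Toy 10-base example; 9 of 10 positions match.
print(f"{percent_identity('ATGGTACCAG', 'ATGGTTCCAG'):.0%}")  # -> 90%
```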