How many of us lie below the stupidity line? How many runners exceed the fast line? How many Oxford undergraduates lie above the first-class line? Yes, we in universities do it too. Examination performance, like most measures of human ability or achievement, is a continuous variable whose frequency distribution is bell-shaped. Yet British universities insist on publishing a class list, in which a minority of students receive first-class degrees, rather a lot obtain seconds (nowadays subdivided into upper and lower seconds), and a few get thirds. That might make sense if the distribution had several peaks with more-or-less shallow valleys in between, but it doesn’t. Anybody who has ever marked an exam knows that the distribution is unimodal. And the bottom of one class is separated from the top of the class below by a small fraction of the distance that separates it from the top of its own class. This fact alone points to a deep unfairness in the system of discontinuous classification.
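The arithmetic behind this complaint is easy to check with a quick simulation. The sketch below is illustrative only: the bell-curve parameters (mean 62, s.d. 8) and the 70/60/50/40 class boundaries are assumptions for the example, not real marking rules. It shows that the gap across a class boundary is tiny compared with the spread inside a class:

```python
import random

random.seed(42)

# Simulate 10,000 exam marks from a bell-shaped (normal) distribution.
# Mean, spread, and grade boundaries are illustrative assumptions.
marks = [random.gauss(62, 8) for _ in range(10_000)]

def classify(mark):
    """Assign a degree class using arbitrary cut-off marks."""
    if mark >= 70:
        return "First"
    if mark >= 60:
        return "2:1"
    if mark >= 50:
        return "2:2"
    if mark >= 40:
        return "Third"
    return "Fail"

firsts = sorted(m for m in marks if classify(m) == "First")
upper_seconds = sorted(m for m in marks if classify(m) == "2:1")

# Distance between the weakest First and the strongest 2:1 ...
gap_across_classes = firsts[0] - upper_seconds[-1]
# ... versus the distance between the weakest and strongest First.
spread_within_firsts = firsts[-1] - firsts[0]

print(f"gap across the First/2:1 boundary: {gap_across_classes:.3f} marks")
print(f"spread within the First class:     {spread_within_firsts:.3f} marks")
```

With any unimodal distribution and enough candidates, the students on either side of the boundary are separated by a fraction of a mark, while members of the same class can differ by twenty or more.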
These examples illustrate the ubiquity of what I am calling the discontinuous mind. It can probably be traced to the ‘essentialism’ of Plato – one of the more pernicious ideas in all history. At what precise moment during development does an embryo become a ‘person’? Only a mind infected with essentialism would ask such a question. An embryo develops gradually from single-celled zygote to newborn baby, and there’s no instant when ‘personhood’ should be deemed to have burst on the scene. The world is divided into those who get this truth, and those who wail: ‘But there has to be some moment when the fetus becomes human. Doesn’t there?’ No, there really doesn’t, any more than there has to be a day when a middle-aged person becomes old. The discontinuous mind can lead people to describe abortion as murder, even when the embryo has no more brain than a worm. And they may therefore feel righteously justified in committing real murder against a doctor – a thinking, feeling, sentient adult, with a loving family to mourn her.
Paleontologists may argue passionately about whether a particular fossil is, say, Australopithecus or Homo. But, given that the second evolved gradually from the first, there must have existed individuals who were intermediate. It is essentialist folly to insist on shoehorning your fossil into one genus or the other. There never was an Australopithecus mother who gave birth to a Homo child. Quarrelling fiercely about whether a fossil is ‘really’ Australopithecus or Homo is like having a heated argument over whether George is ‘tall’. He’s five foot ten, doesn’t that tell you everything you need to know?
Every creature who ever lived belonged to the same species as its mother. If a time machine could serve up your 200 million greats-grandfather, you would eat him with sauce tartare and a slice of lemon. He was a fish. Yet you are connected to him by an unbroken line of intermediate ancestors, every one of whom belonged to the same species as its parents and its children. ‘I’ve danced with a man who’s danced with a girl who’s danced with the Prince of Wales,’ as the song goes. I could mate with a woman, who could mate with a man, who could mate with a woman who . . . after a sufficient number of steps . . . could mate with an ancestral fish, and produce fertile offspring. It is only the discontinuous mind that insists on drawing a line between a species and the ancestral species that birthed it. Evolutionary change is gradual: there never was a line between any species and its evolutionary precursor.
Monday, March 31, 2025
The Discontinuous Mind
Friday, March 28, 2025
Meta Values - 39
Those who can make you believe absurdities can make you commit atrocities.
- Voltaire
Meta Values - 38
Meta Values - 9 - There shouldn't be confusion about courage.
Men and women who perceive the world as us vs. them and have nothing better to do than fight have nothing to do with courage.
Their boredom and need to perpetually fight sometimes masquerade as patriotism, or other bullshit.
These men and women cannot tolerate peaceful moments, let alone a peaceful world.
Standing up for truth needs courage. Violence and fighting are just one part of courage.
Courage means changing one's mind. Courage means civil disobedience to tackle wrongs. Courage means being silent when the time calls for it and not making things worse. Courage means patience. Courage has a multitude of faces.
Tuesday, March 25, 2025
Theories To Old Truth - Roots of Cancer
“Strictly speaking, genetics do not play a known role in human cancer,” says Carlos Sonnenschein, MD, a professor of integrative physiology and immunology at Tufts University School of Medicine. “Most, if not all, cancers are due to environmental factors.”
Those factors, Sonnenschein explained by email, include things we have some control over and things we don’t, from what we eat and drink to whether we smoke, where and how we live, how much physical activity we get, plus societal factors such as pollution and exposure to hormone-disrupting chemicals found in pesticides, plastics and processed foods.
- More here and I told you so.
I lost Max because of this.
Humans never take responsibility for their actions. Some "magic" caused x and some "magic" y will fix x while I sip my beer, play golf, and go on vacation in sunny weather (hey, I work hard, you know).
In the years since Max left me, I don't feel any emotion for someone who is willfully ignorant and gets a deadly disease. I just say "sorry to hear that," and they are out of my mind.
On the other hand, I will do everything I can for someone who regrets the choices they made and is paying the price. I am yet to meet one in person.
Monday, March 24, 2025
Cat Owners Asked To Share Pets’ Quirks For Genetic Study
Cat owners are being asked to share their pet’s quirky traits and even post researchers their fur in an effort to shed light on how cats’ health and behaviour are influenced by their genetics.
The scientists behind the project, Darwin’s Cats, are hoping to enrol 100,000 felines, from pedigrees to moggies, with the DNA of 5,000 cats expected to be sequenced in the next year.
The team say the goal is to produce the world’s largest feline genetic database.
“Unlike most existing databases, which tend to focus on specific breeds or veterinary applications, Darwin’s Cats is building a diverse, large-scale dataset that includes pet cats, strays and mixed breeds from all walks of life,” said Dr Elinor Karlsson, the chief scientist at the US nonprofit organisation Darwin’s Ark, director of the vertebrate genomics group at the Broad Institute of MIT and Harvard and associate professor at the UMass Chan medical school.
“It’s important to note, this is an open data project, so we will share the data with other scientists as the dataset grows,” she added.
The project follows on the heels of Darwin’s Dogs, a similar endeavour that has shed light on aspects of canine behaviour, disease and the genetic origins of modern breeds.
Darwin’s Cats was launched in mid-2024 and already has more than 3,000 cats enrolled, although not all have submitted fur samples.
Participants from all parts of the world are asked to complete a number of free surveys about their pet’s physical traits, behaviour, environment, and health.
However, at present, DNA kits – for owners to submit fur samples – can be sent only to US residents, and a donation of $150 (£120) for one cat is requested to cover the cost of sequencing and help fund the research.
Karlsson added the team had developed a method to obtain high-quality DNA from loose fur without needing its roots – meaning samples can simply be collected by brushing.
The researchers hope that by combining insights from cats’ DNA with the survey results they can shed light on how feline genetics influences what cats look like, how they act and the diseases they experience.
“Understanding the genetics behind personality traits could even shed light on human neurodevelopmental conditions,” said Karlsson.
The team also hopes to learn more about the genetic diversity of different breeds and unpick the ancestry of modern cats, with Karlsson adding she is particularly interested in many-toed cats.
- More Here
Sunday, March 23, 2025
We Live Like Royalty & We Don't Know It
And hence, I not only have immense gratitude in theory for what I have, but I give thanks every day, every moment, for this utterly comfortable life we all have.
Most importantly, I act mindfully with this gift of riches I am endowed with, and I don't have any wants or desires in life.
Wants are the road to hell when I, and most of humankind, have our needs fulfilled.
We live like royalty and we don't know it (read the entire series on how the system works):
But when I mentioned how remarkable it was that a hundred-plus people could parachute into a remote, unfamiliar place and eat a gourmet meal untroubled by fears for their health and comfort, they were surprised. The heroic systems required to bring all the elements of their dinner to these tables by the sea were invisible to them. Despite their fine education, they knew little about the mechanisms of today’s food, water, energy, and public-health systems. They wanted a better world, but they didn’t know how this one worked.
This is not a statement about Kids These Days so much as about Most People These Days. Too many of us know next to nothing about the systems that undergird our lives. Which is what put me in mind of Thomas Jefferson and his ink.
Jefferson was one of the richest men in the new United States. He had a 5,000-acre plantation worked by hundreds of slaves, a splendid mansion in Virginia that he had designed himself, one of the biggest wine collections in America, and one of the greatest private libraries in the world — it became the foundation of the Library of Congress. But despite his wealth and status his home was so cold in winter that the ink in his pen sometimes froze, making it difficult for him to write to complain about the chill.
Jefferson was rich and sophisticated, but his life was closer to the lives of people in the Iron Age than it was to ours. This is true literally, in that modern forms of steel and other metal alloys hadn’t been invented. But it is most true in the staggering fact that everyone at the rehearsal dinner was born and raised in luxury unimaginable in Jefferson’s time.
The young people at my table were anxious about money: starter-job salaries, high rents, student loans. But they never worried about freezing in their home. They could go to the sink and get a glass of clean water without fear of getting sick. Most of all, they were alive. In 1800, when Jefferson was elected president, more than one out of four children died before the age of five. Today, it is a shocking tragedy if a child dies. To Jefferson, these circumstances would have represented wealth and power beyond the dreams of avarice. The young people at my table had debts, but they were the debts of kings.
Jefferson lived in a world of horse-drawn carriages, blazing fireplaces, and yellow fever. But what most separates our day from his is not our automobiles, airplanes, and high-rise apartments — it is that today vast systems provide abundant food, water, energy, and health to most people, including everyone at the rehearsal dinner. In Jefferson’s time, not even the president of the United States had what we have. But few of us are aware of that, or of what it means.
The privilege of ignorance was not available to Jefferson. Monticello’s water supply was a well, which frequently ran dry. The ex-president had to solve the problem on his own. Even if he had had a telephone, there was nobody to call — water utilities did not exist. To make his water supply more reliable, he decided to create a backup system: four cisterns, each eight feet long, wide, and deep, that would store rainwater. His original designs leaked and were vulnerable to contamination. Jefferson, aided by hired architects and slave labor, spent a decade working out how to improve them. He was immersed in his own infrastructure.
We, too, do not have the luxury of ignorance. Our systems serve us well for the most part. But they will need to be revamped for and by the next generation — the generation of the young people at the rehearsal dinner — to accommodate our rising population, technological progress, increasing affluence, and climate change.
The great European cathedrals were built over generations by thousands of people and sustained entire communities. Similarly, the electric grid, the public-water supply, the food-distribution network, and the public-health system took the collective labor of thousands of people over many decades. They are the cathedrals of our secular era. They are high among the great accomplishments of our civilization. But they don’t inspire bestselling novels or blockbuster films. No poets celebrate the sewage treatment plants that prevent them from dying of dysentery. Like almost everyone else, they rarely note the existence of the systems around them, let alone understand how they work.
Saturday, March 22, 2025
Kevin Kelly's Words Of Wisdom On AI, Simulation, et al.
Thinking (intelligence) is only part of science; maybe even a small part. As one example, we don’t have enough proper data to come close to solving the death problem. In the case of working with living organisms, most of these experiments take calendar time. The slow metabolism of a cell cannot be sped up. They take years, or months, or at least days, to get results. If we want to know what happens to subatomic particles, we can’t just think about them. We have to build very large, very complex, very tricky physical structures to find out. Even if the smartest physicists were 1,000 times smarter than they are now, without a Collider, they will know nothing new.
[---]
There is no doubt that a super AI can accelerate the process of science. We can make computer simulations of atoms or cells and we can keep speeding them up by many factors, but two issues limit the usefulness of simulations in obtaining instant progress. First, simulations and models can only be faster than their subjects because they leave something out. That is the nature of a model or simulation. Also worth noting: The testing, vetting and proving of those models also has to take place in calendar time to match the rate of their subjects. The testing of ground truth can’t be sped up.
- More Here
Friday, March 21, 2025
Happy Birthday Max!
Max would have been 19 today!
It's been 18 years and 10 months since we met.
Happy Birthday my love. I miss you every moment.
Thank you for the life you gave me.
Thank you, thank you da.
Wednesday, March 19, 2025
Former Golf Courses Are Going Wild
Gallons and gallons of water are wasted every day and, worse, zillions of acres of ecosystems are destroyed every day in every nation for the stupidest of fads.
For what?
One simple reason - rich freaking humans want to hit a small round thing with an iron rod because there is nothing better to do. And the not-so-rich want to emulate them and signal their richness and coolness.
I do wonder how we got this far as a species.
Thank goodness for this good news and thank you Exploration Green!
And although residents were happy to see their flooding problems vanish, they wanted more than just detention ponds: They wanted green space, walking trails and a place for nature to flourish. It took a while, but at last, in the fall of 2023 the engineering and water retention part of the project was complete, and other than some ongoing planting of native plant species, Exploration Green is a reality. The former golf course is now part of a 200-acre nature reserve, with a centerpiece of five interconnected lakes attached to the area’s stormwater infrastructure.
A bird habitat island on one of the artificial lakes provides a place for migrating birds to rest and feel protected from predators. Walking trails circumnavigate the lakes, and over 1,000 native plants grow with abandon on what were once perfectly manicured fairways and putting greens. The reserve is a community gathering place not just for recreation but for education, too. During Houston Bird Week in September, residents can register for guided bird walks to learn more about the many species that frequent the reserve. It is exactly what residents hoped for — including having dry homes.
Exploration Green is among the many golf courses that have been re-envisioned as places for people and nature to thrive in recent years.
In 2017, Hurricane Harvey dumped approximately 50 inches of rain on the Houston area. “The first lake was 90 percent complete when Harvey hit,” explains David Sharp, chairman of the board of directors of the Exploration Green Conservancy, the nonprofit created to manage the site’s ecological restoration and sustainability. “There were 200 houses in the immediate area that would regularly get flooded with any kind of heavy rain. Not one house flooded,” he recalls.
As golf’s popularity has waned in recent years, other courses have also been re-envisioned as places for people and nature to thrive.
According to the National Golf Foundation, there were almost four million fewer golfers in 2024 than in 2003. The cost to operate a private golf club can be as much as $1 million annually, and with fewer golfers hitting the links, owners are not able to meet operating budgets, and courses have been sold. In 2022 alone, more than 100 golf courses shuttered across the U.S., leaving many acres of unused land ripe for reimagining. Couple this with a 2023 study which found that 97 percent of all metropolitan areas in the United States have insufficient open space, and unused golf courses become an invaluable resource.
The benefits of preserving open spaces, as the authors of the report note, are numerous. They provide opportunities for people to experience nature, socialize and participate in healthy recreational activities — something the residents of the municipalities of Churchill and Penn Hills outside of Pittsburgh are passionate about.
Bubba Becomes First Fish To Survive Chemotherapy
38 years ago, an anonymous donor dragged a large, sloshing bucket to the Shedd Aquarium in Chicago, USA, dropped it at the reception desk, and disappeared. When staff pried open the lid, they discovered Bubba – a giant grouper fish, presumably caught and determined too big to take care of. A note attached to the lid asked for the fish to get to a good home.
Upon deeper examination, doctors learned more about the Epinephelus lanceolatus. At the time, she was only 10 inches long, and was a Queensland grouper – a species fast disappearing in nature. The "super grouper" needed treatment, so they nursed Bubba back to health and found her a new home in a tank in the coral fish exhibit, where the predator happily swelled to 4.5 ft (1.37 m) and a whopping 69.3 kilos (150 lbs).
While there, she became a popular attraction, as visitors marvelled at her mysterious origin story and compassionate change in circumstances. And when she was briefly removed from exhibition in 1998, fans were distraught.
"That's when we found out how popular [s]he was," said Shedd spokesman Roger Germann, to the Washington Post, "because we started getting letters from people saying they couldn't find Bubba on their last visit and wanted to know what had happened."
Midway through the 1990s, Bubba underwent her second big life change as she transitioned to male, as groupers often do. This is a common reproductive strategy in fish species, whereby the larger female fish in a tank change sex to male, while the smaller fish remain female – and since Bubba was so big, scientists weren’t exactly surprised!
But scientists were shocked to find in 2001 that Bubba, their beloved grouper, had cancer.
While this is usually a sure sign of a fish’s demise, because of Bubba’s size, scientists decided to take the unprecedented step of treating him with chemotherapy. It had never before been attempted on a fish, but groupers can live 30 to 50 years, so if successful, they would be making advances in cancer treatment while giving Bubba years of his life back.
Luckily, Bubba responded well to the treatment, and he became the first fish to survive chemotherapy – and cancer!
After his treatments, he spent many happy years entertaining visitors and serving as an inspiration for human cancer survivors. The Shedd Aquarium reported receiving many calls from people affected by the disease, especially children, asking how Bubba was and gaining strength and courage from the knowledge that he had survived his own ordeal and that chemotherapy had extended his life. And beyond that, he was a personal favourite for many at the aquarium.
"Bubba overcame some incredible odds over the years, and that's what made him so special to us," said George Parsons, director of the Shedd's Fish department, to the Underwater Times. "Every once in a while for the last three years we have been getting phone calls from kids with cancer or from their parents, wondering how he is doing."
After regaining his health, Bubba was moved to a new home in the 400,000-gallon main pool of the Shedd's new $43 million Wild Reef gallery, so his fans could properly appreciate his beauty. He even got a new 5-inch friend – a golden trevally fish, which swims around him and eats his scraps.
"He is such a character," said Rachel Wilborn, one of his keepers, to the Washington Post. "He is so curious, always coming around to see what you are doing. If you give him a food item that he doesn't like, he spits it right back at you, then looks you right in the eye, waiting to see what else you can come up with."
After many happy years in his new home, the magnificent fish passed away in August 2006 from age-related issues. A Shedd official said his autopsy shows only “evidence of multiple organ system failure consistent with [Bubba’s] age.”
"It's going to be tough now, if I have to tell people he's no longer with us," said Parsons.
Even though Bubba has passed, his story lives on as a testament to the compassion of his healthcare providers and all who loved him. His body was even donated to Chicago’s Field Museum across the street, which will keep Bubba’s skeleton as part of its enormous fish collection and cryogenically freeze his tissue samples, preserving them for study by future generations of scientists.
"If you want to know why we went to all this effort for a fish," Wilborn said, "all you have to do is look into his adorable face. We did it for Bubs because he is such a cool fish."
- More Here
Tuesday, March 18, 2025
Daniel Kahneman Chose To End His Own Life
The report, published on Friday, said that shortly before Kahneman died in March last year, he sent an email to his friends saying that he was choosing to end his own life in Switzerland.
“I have believed since I was a teenager that the miseries and indignities of the last years of life are superfluous, and I am acting on that belief. Most people hate changing their minds,” he said, “but I like to change my mind. It means I’ve learned something…” read the email Kahneman wrote to his friends before flying to Switzerland.
[---]
“Some of Kahneman’s friends think what he did was consistent with his own research. ‘Right to the end, he was a lot smarter than most of us,’ says Philip Tetlock, a psychologist at the University of Pennsylvania. ‘But I am no mind reader. My best guess is he felt he was falling apart, cognitively and physically. And he really wanted to enjoy life and expected life to become decreasingly enjoyable. I suspect he worked out a hedonic calculus of when the burdens of life would begin to outweigh the benefits—and he probably foresaw a very steep decline in his early 90s. I have never seen a better-planned death than the one Danny designed.’”
His friends and family say that Kahneman’s choice was purely personal; he didn’t endorse assisted suicide for anyone else and never wished to be viewed as advocating it for others.
Some of his friends knew about his plans before he went to Switzerland. Despite their efforts to talk him into deferring his decision, he wouldn't budge. In fact, he had to ask a friend to stop after they relentlessly pleaded with him.
“Life was certainly precious to him. Kahneman and his Jewish family had spent much of his childhood hiding from the Nazis in southern France during the Holocaust.”
His final words in his final email were: “I discovered after making the decision that I am not afraid of not existing, and that I think of death as going to sleep and not waking up. The last period has truly not been hard, except for witnessing the pain I caused others. So if you were inclined to be sorry for me, don’t be,” the report said.
“Thank you for helping make my life a good one.”
- More Here
Sunday, March 16, 2025
Let There Be More Biographies Of Failures
Let there be more biographies of failures, people who were ignored by the world, whose ideas came before their time, whose great work was left in ruins.
The point of biography is to set an example, to teach us how other people did the things we want to do. That might be something grand like live a good life, or it might be something more mundane like manage a small company. Whatever it is, the genre suffers from selection bias. Only the successful get biographies.
But we will not all be successful, and if that is our main criterion we won’t learn as much from biography as we could. There’s a lot of fascinating information in Cataloging the World: Paul Otlet and the Birth of the Information Age, by Alex Wright, but some of the most interesting is about how Otlet was repeatedly let down by the world.
Otlet was something of a genius. After an unusual education (tutors until he was 11 as his father thought school was stifling, then a Jesuit school) Otlet did what a lot of interesting people do when they are young. He made the mistake of going to law school. The real benefit that reluctant young lawyers like Otlet get from their career is boredom. Their minds wander.
His passion was bibliography, the organising of information, and he devised a system based on organising and cataloging chunks of information. Books imprison ideas in structures that authors arbitrarily impose on them. Otlet wanted to break the ideas down to chunks and make them retrievable by anyone. He was thinking his way towards an analogue version of the internet. This was in 1892.
[---]
In short, he was having ideas that sound remarkably like a prototype internet. And yet he was obscure, unknown even, to the people who did eventually create the internet and the world wide web. Unlike Otlet, who favoured a massive, systematic, centralised categorisation of knowledge, the internet was built on ideals of distribution, flat hierarchy, and emergent order. As Alex Wright says, modern internet ideals make ‘the notion of “universal classification” seem like an enormous act of cultural hubris.’ Wikipedia would be foreign to Otlet.
Right to the end, Otlet’s vision was frustrated.
So what are the lessons we can learn? It doesn’t always help to be right. Ideas aren’t easy to implement without the right combination of technology, attitudes, and luck. The work is what’s important, not the result. Maybe the cranks who fill their houses with cart loads of ephemera aren’t so crazy. Don’t make political trouble. Get a PR department. Have a partner who can do these things if you can’t. Be in the right place at the right time. Don’t get cynical, or as Churchill said, don’t let the bastards grind you down. Keep working. Philosophical and ethical beliefs matter a lot to what work you do and how you do it. Don’t be so pragmatic you end up being a conformist. Conventional schooling isn’t always the best approach for your children. Worry less about imaginative young people becoming lawyers. Being bored might give them the opportunity they need to have their big idea.
- More Here
Monday, March 10, 2025
Mice Seen Giving First Aid To Unconscious Companions
When they find another mouse unconscious, some mice seemingly try to revive their companion by pawing at them, biting and even pulling their tongue aside to clear their airways. The finding hints that caregiving behaviour might be more common in the animal kingdom than we thought.
There are rare reports of large, social mammals trying to help incapacitated members of their species, such as wild chimpanzees touching and licking wounded peers, dolphins attempting to push a distressed pod mate to the surface so it can breathe and elephants rendering assistance to ailing relatives.
Over a series of tests, on average the animals devoted about 47 per cent of a 13-minute observation window to interacting with the unconscious partner, showing three sorts of behaviour.
“They start with sniffing, and then grooming, and then with a very intensive or physical interaction,” says Zhang. “They really open the mouth of this animal and pull out its tongue.”
These more physical interactions also involved licking the eyes and biting the mouth area. After focusing on the mouth, the mice pulled on the tongue of their unresponsive partner in more than 50 per cent of cases.
In a separate test, researchers gently placed a non-toxic plastic ball in the mouth of the unconscious mouse. In 80 per cent of cases, the helping mice successfully removed the object.
- More Here
Sunday, March 9, 2025
Addiction to Beliefs
This phenomenon is grossly underrated and under-researched. We laugh at cults and the mass suicides they drive.
But belief itself is omnipresent: a slow-motion version of the cult, leaving addicted folks unable to shake their beliefs even at the cost of self-destruction.
Personal Identity and Willful Ignorance
Ada sits alone at a table contemplating whether she should drink the liquid from the glass in front of her. She’s been promised that the result of doing so will be an immediate revision to her set of beliefs. If she drinks from the glass, she will believe only things that are true. She won’t become omniscient; she won’t know everything. The liquid will simply replace all false beliefs she has with corresponding true ones. Ada likes to think that she is intellectually humble. She likes to believe that she generally acts in accordance with reliable processes for forming beliefs. Most importantly, Ada believes that she values truth. Nevertheless, she can’t shake the feeling that drinking from the glass would be a kind of suicide.
In The Sources of Normativity, philosopher Christine Korsgaard argues that reasons for action spring from what she calls our “practical identities.” These practical identities are ways of conceiving of ourselves that we value and hold dear. For example, I may view myself as a friend, a mother, a lover, etc., and the reasons I have for behaving in various ways are picked out by what those identities permit or forbid. The identities that provide us with overriding reasons are those we’d rather die than give up. As Korsgaard says, “The only thing that could be as bad or worse than death is something that for us amounts to death—not being ourselves anymore.”
[---]
Ada is a volunteer for a local charitable organization. Her contributions to the organization provide a great sense of meaning to her life. She met most of her friends in this capacity and they’ve put together a bowling league that meets on Wednesday nights. One person from this group has become her closest friend. They both have mothers battling cancer, and Ada and her friend are one another’s sources of support in difficult times. The work of the charitable organization is predicated on three fundamental premises. If any of the premises turned out to be false, it would shatter her faith in the organization’s work. Where would her meaning come from then? Her friends? Her support?
Ada is married to a man with many opinions about which he seems unshakably assured. She and her husband have different interests. Because he is passionate about what he cares about, she trusts that he has good evidence for the things that he believes. Nevertheless, she is worried that, if she were to learn that the propositions he so boldly asserts were mostly false, she might come to disrespect him for his many flagrant displays of unearned confidence. What would happen to her love? Who would be her companion?
If we’re being honest with ourselves, we must acknowledge that some of our identities not only involve false beliefs, but actually depend upon them. We may not know which identities fall into this category, but it is probable that some, perhaps even many of them, do.
It isn’t uncommon to be mystified by the extent to which people seem unwilling to become better informed about social issues. We wonder why they won’t critically reflect on the coherence or consistency of their positions, especially when widely known and compelling evidence provides good reason to be skeptical. We wonder why they refuse to engage with sources that support any position other than those they were already inclined to believe anyway. Why, we ask, do people often seem so willfully ignorant?
It’s hard work crafting oneself into a fully formed person. We adopt certain aesthetics or roles because they feel authentic. Others are imposed upon us by our environment. Still more arise out of trauma and grief. At a certain point, for better or for worse, we’ve invested so much time and effort into our identities that we feel there’s too much at stake to change them. We don’t want our social lives to change. We don’t want to feel differently about who we are and what we’ve done. We don’t want different kinds of reasons to motivate our actions. We’d rather have stability than truth.
Ada knows she doesn’t have the best possible life, but it’s hers. She’s comfortable. If she is the source of the suffering of someone else, she’s not aware of it. If her decisions prevent someone from achieving full liberation, she can’t be blamed. If her choices put our most cherished institutions at risk, it surely couldn’t be her fault alone, or perhaps even at all. She doesn’t want her identity as she knows it to be shattered. She wants to go on being the person she recognizes.
She stands up, walks to the sink, and pours the liquid down the drain.
Monday, March 3, 2025
Finding Awe!
I am blessed.
I lived for over 13 years in a state of Max's awe, and I still do. It has become our awe paradise.
Thank you my love. "I" became irrelevant living and one day soon dying with you.
Some attribute the beginning of the study of awe to the Apollo 8 mission. In December 1968, three astronauts entered a small capsule—the vehicle for mankind’s first trip to the moon. (They orbited ten times but didn’t land.) Major William Anders glanced out the window in time to see his blue home planet rising above the stark lunar horizon. “Oh, my God,” he said. Then he took a photo.
Later called Earthrise, the image became one of the most famous photographs ever taken. Fifty years after Anders captured it, he said that the view of Earth changed his life, shaking his religious faith and underscoring his concern for the planet. “We set out to explore the moon,” he wrote about the experience, “and instead discovered the Earth.”
Dubbed the overview effect, the profound experiences shared by Anders and many astronauts helped usher in a wave of academic interest in transcendent events and their attendant emotion—notably, awe. Experimental psychologists tried to induce the emotion in laboratories, showing people pictures of earth taken from space, as well as videos of a flash mob performing the “Ode to Joy” movement of Beethoven’s Ninth Symphony, or Susan Boyle wowing the world when she sang on Britain’s Got Talent. (If you haven’t seen Boyle doing her thing, look it up; I dare you not to feel some tingles.)
For research purposes, subjects let scientists measure their goose bumps, supplied cortisol samples before and after whitewater rafting, performed tedious cognitive tasks, and were fitted with suction probes to measure something that’s called “awe face.”
Researchers pondered many aspects of awe, including why experiencing it caused some people to feel greater belonging or generosity. They speculated that awe may be the primary pathway through which therapeutic psychedelics help so many patients suffering from trauma, depression, anxiety, and addiction. They even asserted that experiencing awe may be the defining feature of our species.
For an emotion with so much riding on it, what seems surprising is that it took the academic world so long to take awe seriously.
“Science got into the awe game really late,” says Dacher Keltner, a psychology professor at the University of California at Berkeley, and the author of the new book Awe: The New Science of Everyday Wonder and How It Can Transform Your Life.
Keltner grew up in 1960s California, raised by progressive parents. All around him people were exploring Buddhism, experimenting with mind-altering drugs, and communing with nature. It was also the golden age of spaceflight. “I was raised in a historical period that was in some sense devoted to awe,” he says. “But it was a neuroscientific and cognitive mystery.”
In 2003, Keltner and the psychologist Jonathan Haidt published one of the first academic papers on the experience. In “Approaching Awe, a Moral, Spiritual, and Aesthetic Emotion,” the two scientists tried to pinpoint what exactly awe is. They combed through historical accounts by philosophers and mystics; what they arrived at was both eloquent and expansive.
“We said that awe is really an emotion you feel when you encounter something vast and mysterious that transcends your understanding of the world,” he says. The vastness part, he explains, doesn’t have to be literally vast, like a view from a mountaintop. It can be conceptually vast, like the anatomy of a bee or string theory or a late-night stoner realization that every mammal on earth must have a belly button.
In the two decades of research that followed, an even more remarkable conclusion emerged: that this state of mind could potentially alter us by unleashing feelings like humility, generosity, and a desire to reassess our lives. And sometimes even existential terror. Whether it’s cataclysmic or gentle, an awe experience could be an effective antidote to burnout, post-traumatic stress, heartbreak, and loneliness.
[---]
I had to admit, I hadn’t really been thinking of this spectacle from the plant’s perspective. It suddenly seemed a totally reasonable thing to do. Most of these plants have been around a lot longer than humans have. The seeds that created this bloom were made in the past. They finally germinated during this precious wet year, but the whole thrust of the extravagant effort was to make seeds for a future bloom in an outrageous cycle of hope. Godoy and I were standing, accidentally, in the middle of a space-time continuum that had absolutely nothing to do with us. We humans just need to not screw it up.
Then it hit me: the risk of chasing awe, of making it about personal growth, is that you dilute its strongest power. Because improving ourselves really isn’t the point of awe at all. I’d been doing it wrong, and it had taken a 27-year-old human and a cluster of yellow tickseeds to help me realize it. The point is this: by listening, we find a small seam in the universe through which to feel ourselves entirely irrelevant.
Sunday, March 2, 2025
Yes, Shrimp Matter
I left private equity to work on shrimp welfare. When I tell anyone this, they usually think I've lost my mind. I know the feeling — I’ve been there. When I first read Charity Entrepreneurship's proposal for a shrimp welfare charity, I thought: “Effective altruists have gone mad — who cares about shrimp?”
The transition from analyzing real estate deals to advocating for some of the smallest animals in our food system feels counterintuitive, to say the least. But it was the same muscle I used converting derelict office buildings into luxury hotels that allowed me to appreciate an enormous opportunity overlooked by almost everyone, including those in the animal welfare space. I still spend my days analyzing returns (though they’re now measured in suffering averted). I still work to identify mutual opportunities with industry partners. Perhaps most importantly, I still view it as paramount to build trust with people who — initially — sit on opposite sides of the table.
After years of practicing my response to the inevitable raised eyebrows, I now sum it up simply: ignoring shrimp welfare would have been both negligent and reckless.
This may seem like an extreme stance. Shrimp aren't high on the list of animals most people think about when they consider the harms of industrial agriculture. For a long time — up until the last few years — most researchers assumed shrimp couldn't even feel pain. Yet as philosopher Jonathan Birch explains in The Edge of Sentience, whenever a creature is a sentience candidate and we cannot rule out its capacity for conscious experience, we have a responsibility to take its potential for suffering seriously.
We don’t know what it is like to be a shrimp. We do know that if shrimp can suffer, they are doing so in the hundreds of billions.
Why worry about shrimp in a world where so many mammals and birds live in torturous conditions due to industrial agriculture? The answer is that shrimp farming dwarfs other forms of animal agriculture by sheer numbers. An estimated 230 billion shrimp of various species are alive in farms at any given moment — compared to 779 million pigs, 1.55 billion cattle, 33 billion chickens, and 125 billion farmed fish.
Shrimp are harvested at around 6 months of age, which puts the estimated number slaughtered annually for human consumption at 440 billion. For perspective: that’s more than four times the number of humans who have ever walked the earth. At sea, the numbers are even more staggeringly shrimpy. Globally, 27 trillion shrimp are caught in the wild every year, compared to 1.5 trillion fish.
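The turnover arithmetic behind those figures can be sketched as a back-of-the-envelope check. All inputs below are the article's own estimates except the "humans ever lived" figure, which is a common demographic estimate (roughly 109 billion) that I am supplying as an assumption:

```python
# Back-of-the-envelope check of the article's shrimp figures.
# Inputs are the article's estimates; humans_ever_lived (~109 billion)
# is a widely cited demographic estimate, not from the piece.

standing_population = 230e9   # farmed shrimp alive at any given moment
lifespan_years = 0.5          # harvested at around 6 months

# If the standing population turns over every ~6 months, roughly two
# "generations" are slaughtered per year.
annual_slaughter = standing_population / lifespan_years  # 460 billion

humans_ever_lived = 109e9
ratio = annual_slaughter / humans_ever_lived

print(f"annual slaughter (turnover estimate): {annual_slaughter/1e9:.0f} billion")
print(f"multiple of all humans ever born: {ratio:.1f}x")
```

Simple turnover math lands at about 460 billion per year, in the same ballpark as the article's 440 billion (which presumably rests on finer-grained species data), and either figure is indeed roughly four times the number of humans who have ever lived.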
Despite their size, shrimp are the proverbial “elephant in the room” when discussing animal welfare in food systems.
[---]
The future of shrimp welfare is one of the most underexplored areas in modern animal rights, but its potential for impact is immense. We are only at the beginning of a movement that could fundamentally shift the way we treat aquatic animals — both on farms and for those caught in the ocean. While challenges remain, including entrenched industry practices and global trade complexities, the path forward is becoming clearer with each step taken by animal NGOs and progressive food companies.
For the first time ever, shrimp welfare is becoming a relevant topic within the broader animal welfare movement, one that has traditionally focused on larger animals and more familiar causes. But the staggering number of shrimp affected, their capacity to suffer, and the emerging solutions make this a moral issue we can no longer ignore. Addressing shrimp welfare isn’t just about reducing suffering for billions of animals — it’s about redefining our relationship with the natural world, expanding our circle of compassion, and challenging the limits of our ethical responsibilities.
- More Here
Wednesday, February 26, 2025
Tuesday, February 25, 2025
Gastronomical Conversations Can Reflect Who We Are, & Who We Are Not
KR: One of the first settings where food and language converge is during family meals. How does this differ from country to country?
MSK: Research shows that in the United States, families talk about whether the food is healthy, whereas in Italy, they talk about whether it’s tasty, which is ironic since there are so many health problems in the US with obesity.
KR: Eating together is not the norm in all cultures. Those who do have family meals often don’t talk while eating — it’s considered distracting. What they want to represent to children is an attentiveness to their food and gratitude for it. In the Marquesas, I found that talking happens while procuring and preparing food, not at meals.
JC: We think of the family meal as something everyone does, but it is closely related to class and race. Those who can afford to, and people who work 9 to 5, can have regular family meals. But not shift workers, those working two or three jobs, or those who come from different cultural traditions. It’s become a moral issue too — the message is that if you don’t do it, you’re missing a really important socializing moment with your children. People are made to feel like they’re failing.
MSK: It’s put up as an ideal today but, at some time in history, children weren’t supposed to eat with parents or talk at the table, so this idea of the family meal as an eternal institution that’s crumbling is wrong.
- More Here
Tuesday, February 18, 2025
Against Optimization
For most of the big decisions we make—about how to govern our societies or how to structure our individual lives—there is a better, wiser strategy for us to follow. Topple the churches to the god of Optimization. Replace them with shrines to a wiser, more caring deity: Resilience.
To see why, we need to draw on lessons from unexpected places: the shells of molluscs, the carefully engineered robustness of ant colonies, and the debunking of mistaken interpretations of evolutionary biology that have infected the dominant — but incorrect — view of how our world works.
The popular reduction of evolutionary principles to “survival of the fittest”—with overtones of relentless, flawless optimization—is a tragic mistake. (Many incorrectly attribute the phrase to Charles Darwin, but it was first coined by Herbert Spencer). While it is true that evolution does often fine-tune species to greater fitness over time through natural selection, the ultimate engine of evolution is survival and reproduction—which often requires robustness and the ability to adapt to uncertainty.
A hyper-optimized species that can only survive in one environment will get wiped out if that environment changes. That’s one reason why evolution routinely works in unexpected ways, through what the brilliant evolutionary biologist Zachary Blount calls “the genomic junk drawer.” The specific evolutionary path that a species took—along with plenty of accidental, contingent events along the way—leaves extra stuff in the genome that might at first appear to be junk.
The awe-inspiring genius of our natural world is that evolution provides a mechanism to repurpose that genomic “slack” into something more useful when the environment changes. It’s the evolutionary wizardry of resilient adaptation. That’s why, as Daniel Milo argues, a huge range of lasting species are defined not by optimal solutions, but by “good enough” ones. It’s not survival of the perfectly optimized, but survival of the resilient, as only the most robust inherit the Earth.
For example, nacre, or “mother of pearl,” is one of the oldest and most unchanged biomechanical structures on Earth. With a stunningly beautiful lustre, it gives pearls their sheen and adorns the inner shell of some molluscs. It is largely the same structure from when it first emerged roughly 530 million years ago. (Modern humans have been around for only about 250,000 years, so we might have something to learn from this longstanding byproduct of evolutionary pressure).
Nacre persists because nature is an engineering marvel, producing an ingenious structure that offers a parable for us. The short version is this: at the nano-level, the nacre on mollusc shells has a series of flawed, interconnecting parts that are decidedly un-optimized. The flaws lock together in an irregular brick and mortar pattern, where the “mortar” is organic material that, if needed, can be squeezed out when the material is put under strain.
Moreover, the unique structure creates discontinuities, so if one part cracks, the damage is contained, isolated, decoupled from the rest of the material. To an untrained eye, the structure looks woefully inefficient, wasteful, badly designed. Instead, it’s one of the strongest substances in the world.
This structure provides two initial key lessons for humans—both in our social systems and in our lives. Resilience can often be produced by systems that feature:
Diversity (lots of different kinds of components that work together are more robust than a uniform single structure, just as the Estonian power supply was augmented by a wide array of other electricity sources when one cable was severed);
Redundancy (systems that are designed to work even after an unexpected failure or setback are more robust, illustrated by the Suez Canal, which had no backup option when the route became blocked).
The third lesson comes not from molluscs but from ants. It’s resilience from what I call decoupled connectivity, the idea that robustness comes from interconnected support networks — but also that one needs to be able to sever a destructive node when it becomes toxic. Connectivity allows a system to repair itself when under strain, while decoupling allows isolation to contain a devastating cascade.
When ant colonies face a disease outbreak, for example, they exhibit ingenious behaviors. If the outbreak is merely of a mild fungal infection, then connectivity saves the colony, as “nurse” ants are swiftly deployed to administer “a formic acid antimicrobial poison to their patients whilst grooming them.” Without the connected network, a mild outbreak could become a devastating epidemic.
However, if the outbreak does become more severe, maintaining connectivity could prove fatal. Then, the colony will pursue more extreme strategies, either of isolation — keeping infected ants away from the healthy ones — or of killing diseased individuals, pruning that node off from the colony altogether. Since outbreaks don’t happen all the time, keeping these mechanisms in place could be thought of as a form of inefficient slack. But when disease strikes, it’s the slack that saves the colony from existential risk.
One of nature’s overarching lessons is this: what may look to a naive human eye as waste, or inefficiency, or under-optimized slack is often evolution’s secret weapon, providing the adaptive resilience to survive in an ever-changing world.
- More Here
Friday, February 14, 2025
The Languages Lost To Climate Change
Scientists and linguists have discovered a striking connection between the world’s biodiversity and its languages. Areas rich in biological diversity also tend to be rich in linguistic diversity (a high concentration of languages). While this co-occurrence is not yet fully understood, a strong geographic correlation suggests multiple factors (ecological, social, cultural) influence both forms of diversity, which are also declining at alarming rates. These high-diversity areas are also often at the front lines of the climate crisis. Where plant and animal species are disappearing, languages, dialects and unique expressions often follow a similar pattern of decline.
The Arctic may not be an obvious biodiversity hotspot, like the Brazilian Amazon or Tanzania’s coastal forests, but it plays a critical role in regulating and stabilizing the Earth’s climate and supporting life on our planet. Scientists often say that “what happens in the Arctic does not stay in the Arctic,” and any disruption to its habitat has far-reaching consequences for humanity.
Indigenous communities have deep relationships with the land they have occupied for generations, and this close relationship is reflected in the languages they speak — how they talk about the landscape, and how they express the beliefs and customs in which those languages developed. When their relationships with the land suffer, so can their languages.
For example, Vanuatu, a South Pacific island nation with the highest density of languages on the planet (110 languages across 4,707 square miles), is home to 138 threatened plant and animal species. It is also one of the countries that is particularly vulnerable to sea level rise and climate-related natural disasters. Scientists warn that the climate crisis has become the “final nail in the coffin” for many Indigenous languages, as coastal communities are forced to relocate.
When they can no longer depend on the land, communities may be forced to emigrate to other areas where their languages aren’t spoken, leaving behind not just their mother tongue, but all the wisdom contained in it. There is also evidence to suggest that in cases where a language begins to decline — due to economic or social factors, for example — people may gradually stop caring for the land. When languages are abandoned, the traditional ecological knowledge they carry is also left behind.
“Our language and traditional practices are closely tied to the land,” a community leader from Dishchii’bikoh, a tribally controlled school in Cibecue, Arizona, told researchers in a 2016 study. “In many ways, it is used in describing objects, teaching moral lessons, and expressing our purpose on this land. Since the loss of our traditional language … our traditional ecological knowledge has become more and more threatened.”
Increasingly, Indigenous communities are pointing to the inextricable link between language and biodiversity as evidence that humans are not separate from nature, but very much a part of it.
[---]
Linguistic diversity can be seen as an indicator of cultural diversity more broadly, Gorenflo says, which has traditionally been more difficult to define. “For a long time, anthropology was considered to be the social science that studied culture. But nobody could come to an agreement about what culture was,” he says. “Linguistic diversity is really what we’re using as a proxy for cultural diversity.”
The exact reasons behind the connections between languages and nature are not entirely clear, Gorenflo told me. Previous studies have suggested that areas with a high number of resources create linguistic diversity because people must adapt to more complex environments. But others have argued that it’s because more plentiful resources reduce the likelihood of having to share them and communicate with neighboring groups in times of need. Meanwhile, some research has suggested that the reasons behind this co-occurrence are far more complex and differ from one area to another. Gorenflo emphasized the need for more research. “Understanding this connection is important because it would change how we manage the relationship between Indigenous people and biological diversity — and nature.”
[---]
For Gorenflo, the factors driving the co-occurrence of linguistic and biological diversity, which were initially puzzling, are now becoming even more evident. “I see languages as an extension of the cultural system, which itself is part of the broader ecology of the world,” he told me. “So, it’s less and less of a mystery to me, and more about exploring what this ecology looks like.”
The preservation of endangered languages is about more than saving words — it could be vital to safeguarding centuries of human knowledge and understanding the systems that sustain us.
- More Here
Wednesday, February 12, 2025
How We Treat Non-Human Animals Is Darwin's Greatest Contribution
For over 150 years, Charles Darwin and his work have influenced the fields of science, religion, politics, gender, literature, philosophy, and medicine. Looking in 2013 at the innumerable changes he has sparked across so many disciplines, what should be considered Darwin’s most important contribution?
Darwin showed us that we’re animals. He showed us that there’s no fundamental distinction between us and any other critter on the planet. The most important implication of this Gestalt shift may be ethical. As soon as we accept that the human-animal distinction is not fundamental in nature, it becomes difficult to accept a moral code that privileges the wellbeing of human beings but is indifferent to the wellbeing of any other animal. It becomes hard to resist extending our moral concern to any creature capable of suffering, human or not. If present trends continue, the main beneficiaries of Darwin’s great idea may not be human beings. Ultimately, the main beneficiaries may be the other animals we share the planet with.
- More here from Steve Stewart-Williams
Monday, February 10, 2025
Where Did Trees Come From?
Trees are considered to be evolutionary descendants of ferns, one of the oldest types of plants still around today. These early trees were much shorter than the average tree today and reproduced with spores rather than seeds. The first known tree fossil dates back to about 385 million years ago, during the Middle Devonian. During this time period, no plant grew higher than roughly waist height. To grow higher, plants would need to develop a stronger form of tissue.
The development of wood was a big evolutionary leap that took millions of years to accomplish. Wood is useful for several reasons. The most obvious is structural support, but wood also allows more efficient transport of water. With this new development, early trees could out-compete their neighbors for precious sunlight and store more water to survive droughts.
These early trees formed the backbone of Devonian and Carboniferous forests. This includes varieties like Wattieza and their relatives, the Lepidodendrales. The forests that grew and died during this period are the primary source material for all modern coal deposits. Without trees evolving at this time, the Industrial Revolution may never have happened! Even hundreds of millions of years ago, trees were laying the groundwork for modern human advancement.
These trees also grew quite differently from modern trees. Instead of growing continuously throughout their lives, these plants stayed at a low height for a while. Once a tree had built up sufficient resources, it would “rapidly” shoot up in height to rise above its neighbors and expose itself to lots of sunlight. Rapidly here means faster than a modern tree, but still slow to our eyes. Another difference is the quality of wood: ancient trees used a variety of wood that was easier to create but much less structurally sound. As a result, these trees could not grow very tall and did not have branches.
Trees were due for another evolutionary shift during the Triassic Period. Their method of reproduction shifted from spores to seeds. This is where we see the first example of a gymnosperm. Gymnosperm is Greek for ‘revealed seed’ or ‘naked seed.’ This class consists of many trees that we would recognize today, including any tree that has cones. Gymnosperms include cedars, Douglas-firs, cypresses, firs, junipers, kauri, larches, pines, hemlocks, redwoods, spruces, and yews. While gymnosperms became dominant in the Triassic Period, they first appeared sometime during the Carboniferous Period.
- More Here
Thursday, February 6, 2025
Psychodynamic Nonsense
The art of ‘being for another’ – following, listening to and making sense of another person’s world – has been practised for millennia. Humans have always discussed their lives, their values and their problems, trying to find meaning, solace and joy. Experts at this sort of discussion have been called wise women, shamans, priests – and now therapists. Then, starting with Sigmund Freud, came a series of attempts to create a science of psychotherapy out of it.
But there is very little science to it.
[---]
I became a psychotherapist and psychologist to maximise the good I could do in the world. It seemed obvious that helping people by engaging with the root of their suffering would be the most helpful thing to do. I also became a child psychotherapist to address the roots of suffering in childhood, where they seemed to originate. I experienced how deepening into a feeling could transform it, and learned about pre-natal trauma; I even wrote a doctorate on trauma. Now, two decades into my career, I practise, lecture, supervise and write about all of these things, but increasingly I reject everything that I learned. Instead, I practise the art of ‘being for another’, an idea that arose in conversation with my colleague Sophie de Vieuxpont. I’m a mentor, a friend in an asymmetrical friendship, and a sounding board and critical ally assisting people as they go through the complexities, absurdities, devastations and joys of life.
Along the way, over years of practice, and in a complex feedback loop of theory and practice, I lost faith that awareness was always curative, that resolving childhood trauma would liberate us all, that truly feeling the feelings would allow them to dissipate.
The effect of your family environment matters very little when it comes to your personality
It started with a return to an old interest in evolutionary biology, prompted by the release of Robert Plomin’s book Blueprint (2018). An account of twin studies, the book draws on decades of twin statistics from several countries, and the numbers are clear: childhood events and parenting rarely matter much in terms of how we turn out.
That caused me to re-read Judith Rich Harris’s book No Two Alike (2006), which also examined twin studies along with wide-ranging studies of other species. Harris proposed that the brain was a toolbox honed by evolution to deliver sets of skills, leaving each of us utterly unique.
These books are perhaps summed up best in the second law of behavioural genetics: the influence of genes on human behaviour is greater than that of the shared family environment. I noticed my defences popping up, desperately trying to find holes in the science. But at the end of the day, without cherry-picking data to conform to what I learned in my training, the simple fact was this: twin sisters with identical genes raised in totally different families developed very similar personalities, while adopted sisters with no genetic links raised in the same family had very different personalities.
That finding, from the journal Developmental Psychology, undermined years of learning in psychodynamic theory. It means that the effect of your family environment – whether you are raised by caring or distant parents, whether in a low-income or high-income family – matters very little when it comes to your personality. If you’ve ever had any training in therapy, this goes against everything you have been taught.
Yet the tenets of psychotherapy did not reflect my clients’ lived experience, or even my own. Instead, we see what we expect to see, and we make sense of our past based on how we feel now. If I am sad, I will recall deprivation and strife in my childhood, while my happier brother remembers a more positive situation; consider the memoirs Running with Scissors (2002), Be Different (2011) and The Long Journey Home (2011), each a radically different depiction of the same family.
In the few longitudinal studies that have been made, where we track children and their adverse childhood experiences (ACEs) from early years to adulthood, there is no link between ACEs and subsequent adult mental ill health. There is only a link between adult mental ill health and the ‘recollection’ of ACEs. This may seem wildly counterintuitive to a profession steeped in trauma theory. ACEs have not been shown to cause mental ill health; it is rather that, when we suffer as adults, we interpret our childhoods as having been bad. I’m convinced that there are rare exceptions to this, of truly horrendous childhood experiences that do leave a mark, but even that certainty falters when I consider the fact that events that supposedly traumatise one person in a group fail to traumatise the others.
If you are denying what I’ve just written out of hand, you may be doing what religious fundamentalists have been doing for millennia. What I say may feel heartless, cold or politically toxic, but feelings aren’t epistemically valid grounds for rejecting information.
Our treatments could be largely pointless and potentially harmful
Instead, consider this: it is possible to care about suffering while reassessing your analysis of how it is caused and how it can be addressed. Perhaps a vast majority of therapy trainings are wrong about why people suffer. People in other cultures with radically different worldviews about how suffering develops and how best to deal with it also care deeply about helping people – they simply have a different way of doing it.
We need to reconsider why people suffer to help them in a better way. Freud and more recent trauma proponents like Gabor Maté tell us that our personalities and sufferings stem from how we were treated as children. This may resonate with us, but it could actually be wrong. If it is wrong, our treatments could be largely pointless and potentially harmful, and we need to critically examine these theories more carefully before we, as a profession, do more harm.
Historically, in many cultures around the world, from Nigeria to Malaysia, or the West more than 50 years ago, childhood has been seen as just one of the stages we move through, with no sacred status. We learn all the time, but suffering stems from how we now, at this time, relate to the world and what our current circumstances are.
Isn’t it a bit arrogant that so many in the West assume that this new, unevidenced theory – that suffering stems from childhood – should be universally true, or even true for us? How does the psychodynamic therapist, faced with their suffering client, feel resolute that they should dredge up the past, when philosophical traditions from across the world say the answer lies in the here and now? The Buddha, Lao Tzu, Aristotle and Jesus didn’t mention a word about childhood’s irreversible stain on the human condition – they saw us as individuals living through choices in the now.
Tuesday, February 4, 2025
There Are No Pure Cultures
One of the most important pieces I've read in years - There are no pure cultures: All of our religions, stories, languages and norms were muddled and mixed through mobility and exchange throughout history.
We should recognize how nonsensical the nationalism of cultural pride is. It's all freaking intermingled.
Our present appears that way only because we have forgotten our common past. Globalisation didn’t begin in the 1990s, or even in the past millennium. Remembering this older shared history is a path to a different tale, which begins much, much earlier – long before the arrival of international supply chains, ocean-going sailing ships, and continent-spanning silk roads. The tale of globalisation is written across human history. So why do we keep getting the story so wrong?
[---]
You are strolling around a street market, the Grote Markt, in the Dutch city of Groningen, sometime in the 2020s. A lady operating a stall asks a customer if he wants his hummus ‘naturel’, by which she means ‘plain’. He looks baffled as she gestures to the orange, green and purple varieties of hummus on offer. It had taken him some time to try the original stuff – that pale paste that had him eating more chickpeas, sesame seeds and olive oil than all his ancestors combined – so purple hummus will have to wait for another day. He mutters: ‘The authentic one, please,’ and hurries to the opposite stall for the last item on his shopping list: potatoes, the most elementary ingredient of Dutch cuisine. Elsewhere in the market, other customers are searching for their favourite ingredients. Some are seeking whole wheat for a French-style sourdough loaf or Basmati rice for an Iraqi recipe; others are shopping for maize (corn) flour for a Nigerian pudding, tomatoes for a fresh Italian pasta sauce, or olives for a Greek salad.
Marketplaces like this one are perfect sites to observe the flux and mixing of peoples, goods, ideas and mores that we now call globalisation. They are also places where we can begin imagining the longer history of this process.
Many historical markets were established well before our global age. When the Grote Markt started operating in the late medieval era, little of the produce now available to Groningen’s current international community would have been on display. Back then, the people visiting the market would also have hailed from fewer and closer territories, most of them still speaking their regional dialects. In 1493, however, the imaginative horizons of everyday life at this and other European marketplaces suddenly expanded as news of an extraordinary discovery began to circulate: a previously unknown human world existed beyond Europe’s shores. It was a world so unexpected and seemingly so different that it shook Europeans’ consciousness to the core.
Our philosophical notions of ‘the self’ were born from the shock of Europeans discovering ‘otherness’
[---]
For many historians, this ‘early modern era’, spanning from around 1500 to 1800, marks the first stage of globalisation. According to them, this period birthed the first global capitalist economy and integrated world market, began an unprecedented mixing of local cultures and ethnicities, and crystallised the first global consciousness of a shared world. It was so powerful that its effects still endure to this day in diets, languages, economies, social and legal regimes, international balances of political and military power, and scientific frameworks and institutions. The early modern era even shaped our philosophical notions of ‘the self’, born from the shock of Europeans discovering ‘otherness’.
But even this era was not the first global age in human history. It, too, was the product of earlier global movements, encounters and exchanges. In fact, early modern globalisation was merely one accelerated episode of a general process that has been ongoing for tens of thousands of years.
Collective human memory is a partial and imperfect repository of our encounters with one another through time. We are not good at remembering, let alone acknowledging, the ways that these encounters have shaped our present societies, cultures and economies. So, how did we forget?
Globalisation theorists following the sociologist Roland Robertson use the term ‘glocalisation’ to describe how local cultures digest the products of the global market and turn them into something seemingly new. Through this process, incoming goods – technologies, ideas, symbols, artistic styles, social practices or institutions – are assimilated, becoming hybrid recreations that take on new meanings. These recreations are then redeployed as new markers of cultural or class distinction, sedimenting borrowed cultural products in the collective consciousness to the point of misrecognition. And so the global becomes local, the foreign becomes familiar, and the other becomes us. Glocalisation is how and why we collectively forget. Such is the silent trick of every single globalisation in our history: our forgetfulness of it is the method and mark of its success.
Excavating the sources of our identities is made more difficult by our tendency to focus on the uniqueness of the present. By limiting ourselves to the minutiae of the current global moment, we overlook the most obvious manifestations of globalisation’s deeper past. Consider these broad, defining characteristics of human civilisation: our few world religions, our dominant paradigm of written communication, and our widely shared ethical norms of societal conduct. Consider our (quasi-)universal agrarian mode of subsistence, and our single nutritional and psychotropic order, which is based on an incredibly small number of starchy crops (including wheat, maize, rice), domesticated animals (cows, chickens) and stimulants (coffee, sugar) uniformly consumed across the planet. These characteristics predate our current ‘global age’ by millennia. And they are arguably more fundamental features of human culture, and more representative illustrations of globalisation, than either K-pop or the Birkenstock sandal – itself a recent reappropriation of identical or similar products that have been circulating for at least 10,000 years.
Such global phenomena follow a repeated pattern we can easily recognise throughout our history, in which cultural products travelled around the planet through increasingly elaborate connective technologies. Before the internet came aeroplanes and containerships. Before those, came the electric telegraph, railways, steamships, the printing press, newspapers, caravels, writing systems, chariots, and horses and camels. Before all of that came the earliest ideographic signs and the first sea-faring ships of the Palaeolithic Age.
Each new connective technology has opened or expanded pathways of mobility and exchange, creating eras of globalisation that have left lasting imprints in human consciousness. Along these pathways, social intercourse turned local languages into global languages and lingua francas – French, Arabic, classical Chinese, Nahuatl, Maya, Greek or Akkadian – which facilitated and intensified cross-cultural relations. As a result, material culture, ideas and innovations were able to circulate more easily during each historical period of exchange. This is how both ‘prehistoric’ jewellery and T-shirts spread across the globe. It is why monotheism and the story of the flood have appeared in so many different places. And it explains why certain ideas, like the theory of humours or quantum mechanics, have become shared ways of understanding the world.
No cultural system of any significance to our existence escapes this pattern of global becoming. Consider the food systems that sustain our existence and culinary practices. When we associate the potato with ‘traditional’ European cuisines or the Irish famine, we forget its Andean origin and the global journeys that eventually made it ubiquitous in family kitchens and fast-food restaurants all around the world. Similar forgotten stories can be told of other globalised staple foods, including the tomatoes and maize that originated from America, rice from East Asia and Africa, and the wheat, barley and olives of Southwest Asia. This forgetting is why many local culinary emblems, such as French wine or American hamburgers, are easily turned into totems and mythologies of national identity. The ‘local’ wine grapes and cattle that flood the world market today are the end-products of global migrations that began as early as the Neolithic Age.
The cultural markers of identity we cherish most jealously – our cuisines, religions, languages and social mores – are products of past globalisations. When we celebrate such cultural markers as ‘authentic’ elements of our identities, we are effectively celebrating our shared human culture, born of a long chain of encounters and exchanges.
Every generation appropriates the inheritances of global exchanges and refashions them as its own. Excavating the sediments our predecessors left in our collective consciousness is not a task that we are naturally disposed to perform. It is an act of remembrance and self-understanding that can destabilise our identities because it counters the processes that endow them with authenticity.
Cultural products travelled around the planet through increasingly elaborate connective technologies
Culture is how we have adapted to our changing environment to sustain ourselves and flourish. Cultures, plural, are the specific manifestations of human culture in different times and places. These two categories – human culture and cultures – are roughly equivalent to the biological idea of the ‘genotype’ (our core code) and the ‘phenotype’ (its variable expressions). The history of our globalisations is the history of how phenotypical variations in human culture have circulated and cumulatively transformed our cultural genotype.
Exclusionist and anti-globalist sentiments come from a confusion of these categories. National or regional cuisines, for example, which anchor feelings of pride in one’s identity and mediate feelings of disgust or contempt for the cuisines of others, are merely variations on a universal human behavioural trait, cooking, that distinguishes us from all other species. Cooking is an extraordinary trait of true significance for our ‘identity’ as a species. Less significant is how different cultures use this or that ingredient.
The invention matters, but equally important is the circulation of those discoveries
The distinctiveness of local cultures is an illusion of scale. When viewed in the long term, their boundaries blur and melt into each other. But the consciousness of an individual or a generation is not capacious enough to span the deep temporality that human culture inhabits. And so, we forget.
The national histories we are taught also erase this long story of cultural movement. They tend to focus on tales of innovation that emphasise moments of creation. In reality, there are few stories of origin and genuine invention.
[---]
Our culture is cosmopolitan because we are a cosmopolitan species. We are citizens of the world, not nations, to paraphrase both Socrates and Thomas Paine. What has allowed us to thrive, physically and culturally, is not our rootedness but our mobility. Without it, we would already be extinct.
Mobility requires freedom of movement. This is a fundamental right we often overlook as we focus our attention on the valuable freedoms that we gained more recently – freedom of thought, belief and expression. Free movement secured our survival and allowed us to flourish on a planet we were not originally adapted to inhabit so widely. Forgetting this precious right makes it easier to succumb to the dominant ideology of rooted difference.
Sunday, February 2, 2025
Curbing Animal Testing
I hope this happens soon, as in within a few months:
Animal testing is a relic of a bygone era, yet it is still promoted fervently by interest groups and government agencies as the “gold standard” in the experimental sciences for predicting human response. This is despite the fact that — in drug development, for instance — animals are notoriously poor predictors of drug safety and efficacy in humans. Exclusive reliance on animal testing thus translates into unrecoverable delays in the development of medicines, missed opportunities due to misguided regulatory principles, and exorbitant costs ultimately passed on to consumers.
A jarring 90-95% of experimental drugs fail in clinical trials after acceptable outcomes data in animals are used to justify their advancement to testing in humans. Moreover, scores of potentially life-saving drugs are prematurely abandoned once they confer no benefits to animals, exacerbating an already inefficient animal-centric drug discovery paradigm. Failed oncology trials alone are estimated to cost $50-$60 billion annually. Most new-generation therapies (e.g., cell therapy, immunotherapy) are by design human-specific, making animal testing a fool’s errand.
Through decisive action, DOGE could in principle curb unreliable testing on animals in favor of prioritizing technology-driven, human-relevant alternatives. By doing so, it would — in a single swoop — reduce waste across federal contracts and grants, promote modern drug development, lower healthcare and prescription drug costs, bolster national competitiveness, improve environmental health and safety testing, and modernize practices within all health and regulatory agencies.
Francis S. Collins, the longest-serving former director of the National Institutes of Health, wrote in the journal Nature a decade ago that “preclinical research, especially work that uses animal models, seems to be the area that is currently most susceptible to reproducibility issues.” Consistently, 89% of preclinical studies, most of which involve animals, cannot be reproduced!
Reducing dependency on animal models — the key variable most strongly associated with irreproducibility (i.e., the failure to translate results from the laboratory to the clinic) — is one sensible approach to limiting fiscal waste in medical research and, more broadly, healthcare.
The cost of developing a single new drug is a stupefying $2 billion, with an average development time of 10-15 years from target identification in the laboratory to market release, not factoring in withdrawals or recalls. The poor reliability of animal models in the drug discovery workflow compounds sky-high research and development costs, disincentivizing investment in many disease domains. Case in point: 95% of the 7,000-plus rare diseases that affect 25-30 million Americans do not have a single FDA-approved drug to treat them.
[---]
Yet to this day, inexplicable delays in implementing the FDA Modernization Act 2.0 (FDAMA 2.0) are causing significant regulatory confusion among drug sponsors. The failure to act on the part of the FDA, the regulatory agency chiefly responsible for implementing this policy mandate, is in turn a good example of government discordancy, if not malfeasance.
In 2023, a bipartisan group of Senators, led by Rand Paul, R-Ky., and Cory Booker, D-N.J., sent a letter to the FDA demanding an explanation for the delay and an implementation timeline for the enacted law. When no progress materialized, alarmed lawmakers introduced the FDA Modernization Act 3.0 (FDAMA 3.0) in February 2024 in the U.S. House of Representatives, H.R. 7248 (and later in the U.S. Senate, S. 5046), to assure FDAMA 2.0 implementation and accomplish further improvements.
Saturday, February 1, 2025
Thank You Inspector Hathiram Chaudhary
There are hardly any good Indian movies but there are some hidden gems in the form of Hindi series.
A colleague told me about the Paatal Lok series a couple of years ago and I was hooked.
Jaideep Ahlawat as Inspector Hathiram Chaudhary is just brilliant. In the midst of crappy actors, Jaideep is an actor showered with talent, probably from up above.
Jaideep Ahlawat is to Hindi cinema what Vijay Sethupathi is to Tamil cinema.
I haven't been to India for almost 2 decades now, but through Hathiram's eyes I am discovering that not much has changed - poverty, power, and pusillanimity seem persistent.
Thank you, sir, for making me lose myself in your art and making me think.
Hathiram Chaudhary: A Hero For Our Times
He is an Indian, Rohtak-born. His precinct is Outer Jamuna Paar in Delhi. His currency of operation is that tough, drain-pipe humanity, which he has to preserve in an increasingly murky world.
High-profile police cases that turn out to be zero-sum games are his to negotiate. Slouch-shouldered and pot-bellied, he goes through a series of spirals only to come upon dead ends.
To do this night after night is to earn those bleary, exhausted eyes that are his signature.
Those eyes have wonderful bags under them that touch us deeply.
[---]
We keep persisting with Paatal Lok's hardboiled cynicism because we know that even if wiped out and shattered, we can still come home to Hathiram Chaudhary. We are sure he would let us in with a shrug.
He has a political stance; he most certainly does. But he never uses it as a tool to patronize, instruct, or elevate himself to a higher moral plane.
Does this explain his broad appeal, why he's equally beloved by right-wingers and lefties?
Here's Hathiram's version of liberalism, as unrehearsed as they come.
In the first season, while standing up for a Muslim colleague, he doesn't position himself as the progressive one battling a bunch of bigots.
On the contrary, his actions suggest that steering clear of bigotry is something we all can aspire to.
In Season 2, there's a wonderful scene involving the revelation of a close friend's sexuality, where he rebukes his personal brand of Haryanvi machismo as he lends his support to the slightly embarrassed friend.
"I'm a country bumpkin with no knowledge of gay parades. But if it feels right to you, then that's all that matters," so says the bumpkin, not emphatically but searchingly, and with a faint note of some swear-word bubbling up in his throat.
His inclusive attitude is unique: It may not possess the jingle of a placard slogan, but it surely has the warmth of a hardboiled embrace.
Monday, January 27, 2025
Much of the Cuisine We Now Know, and Think of as Ours, Came to Us by War
“Sicily became quite famous for its fruits and vegetables, and that can be traced back to the Muslim era, when the gardens probably began as pleasure gardens,” says Wright. Pleasure gardens were designed as places of repose, and for Muslims, a reminder of the paradise awaiting the virtuous. “They were eventually turned into ‘kitchen gardens,’” Wright continues, describing them as “experimental horticultural stations” to develop better propagation methods. But at the same time, they were places of beauty. “The gardens were lush with vegetable crops, flowering bushes, and fruit trees, and graced with water fountains and pavilions,” Wright explains in A Mediterranean Feast. During the 300 years that the Arabs ruled Sicily, its agriculture and economy grew, and institutions evolved. In fact, when the Normans seized power, they kept many practices of their predecessors, including the organization of the government and, in the upper classes, the wearing of flowing robes.
Humans are bound to food by necessity first, and then by choice. The types of food you eat distinguish your country from another country, your group from another group. When new influences come—whether from conquest or colonial exploration or the popularity of a TV cooking show—there is a period of adaptation, and then often the full incorporation of a new technique or ingredient into the country’s culinary lexicon. The potatoes and tomatoes that went from the New World to Europe in the Columbian Exchange of the 15th century were first scorned by Old World diners who feared they were poisonous, then in time became emblematic of their cuisines. In its original form, Sicilian caponata would never have been made with tomatoes, but today there are versions that include them, and they are considered perfectly Sicilian.
Food constantly evolves, as do taste buds. To the Western palate, Japanese food seems so distinctly Japanese, yet it went through many modifications once the country opened to the West in the 19th century, explains Katarzyna Cwiertka, the chair of modern Japanese Studies at Leiden University and a scholar of East Asian food. “New ingredients, new cooking techniques, and new flavorings were adapted to Japanese customs,” she says. “The changes were really tremendous.”
Military canteens played the role of first adopters. Once Japanese soldiers became accustomed to a food, they would eventually introduce it to the wider public when they returned to civilian life. Such was the case with curry, which started appearing in Japan in the late 19th century. It was a borrowing not directly from India, but from the British Empire. “The Japanese start to serve it as a Western food,” says Cwiertka. “It enters military menus and canteens and continues after [World War II] into school canteens. By the 1950s and 1960s it is a national dish. When you ask Japanese students abroad what they crave most, they would say ramen or curry. And ramen [of Chinese origin] is also not a Japanese food.”
What the Japanese have done—over and over again, Cwiertka points out—is move foreign foods into the category of washoku, the genuinely Japanese. They adapt and absorb foreign culinary influences this way. “It’s more like the invention of a tradition than a tradition,” she says.
- More Here
Saturday, January 25, 2025
What Are The Odds?
This week, on Wednesday, within a span of a few hours, I saw an owl, and then even Neo got startled when two bald eagles flew over Max's Walden.