Monday, March 3, 2025

Finding Awe!

I am blessed. 

I lived for over 13 years in a state of Max's Awe, and I still do. Max's Walden has become our awe paradise.

Thank you, my love. "I" became irrelevant living, and one day soon dying, with you.

Awe Is Good for Your Brain:

Some attribute the beginning of the study of awe to the Apollo 8 mission. In December 1968, three astronauts entered a small capsule—the vehicle for mankind’s first trip to the moon. (They orbited ten times but didn’t land.) Major William Anders glanced out the window in time to see his blue home planet rising above the stark lunar horizon. “Oh, my God,” he said. Then he took a photo.

Later called Earthrise, the image became one of the most famous photographs ever taken. Fifty years after Anders captured it, he said that the view of Earth changed his life, shaking his religious faith and underscoring his concern for the planet. “We set out to explore the moon,” he wrote about the experience, “and instead discovered the Earth.”

Dubbed the overview effect, the profound experiences shared by Anders and many astronauts helped usher in a wave of academic interest in transcendent events and their attendant emotion—notably, awe. Experimental psychologists tried to induce the emotion in laboratories, showing people pictures of earth taken from space, as well as videos of a flash mob performing the “Ode to Joy” movement of Beethoven’s Ninth Symphony, or Susan Boyle wowing the world when she sang on Britain’s Got Talent. (If you haven’t seen Boyle doing her thing, look it up; I dare you not to feel some tingles.)

For research purposes, subjects let scientists measure their goose bumps, supplied cortisol samples before and after whitewater rafting, performed tedious cognitive tasks, and were fitted with suction probes to measure something that’s called “awe face.”

Researchers pondered many aspects of awe, including why experiencing it caused some people to feel greater belonging or generosity. They speculated that awe may be the primary pathway through which therapeutic psychedelics help so many patients suffering from trauma, depression, anxiety, and addiction. They even asserted that experiencing awe may be the defining feature of our species.

For an emotion with so much riding on it, what seems surprising is that it took the academic world so long to take awe seriously.

“Science got into the awe game really late,” says Dacher Keltner, a psychology professor at the University of California at Berkeley, and the author of the new book Awe: The New Science of Everyday Wonder and How It Can Transform Your Life.

Keltner grew up in 1960s California, raised by progressive parents. All around him people were exploring Buddhism, experimenting with mind-altering drugs, and communing with nature. It was also the golden age of spaceflight. “I was raised in a historical period that was in some sense devoted to awe,” he says. “But it was a neuroscientific and cognitive mystery.”

In 2003, Keltner and the psychologist Jonathan Haidt published one of the first academic papers on the experience. In “Approaching Awe, a Moral, Spiritual, and Aesthetic Emotion,” the two scientists tried to pinpoint what exactly awe is. They combed through historical accounts by philosophers and mystics; what they arrived at was both eloquent and expansive.

“We said that awe is really an emotion you feel when you encounter something vast and mysterious that transcends your understanding of the world,” he says. The vastness part, he explains, doesn’t have to be literally vast, like a view from a mountaintop. It can be conceptually vast, like the anatomy of a bee or string theory or a late-night stoner realization that every mammal on earth must have a belly button.

In the two decades of research that followed, an even more remarkable conclusion emerged: that this state of mind could potentially alter us by unleashing feelings like humility, generosity, and a desire to reassess our lives. And sometimes even existential terror. Whether it’s cataclysmic or gentle, an awe experience could be an effective antidote to burnout, post-traumatic stress, heartbreak, and loneliness.

[---]

I had to admit, I hadn’t really been thinking of this spectacle from the plant’s perspective. It suddenly seemed a totally reasonable thing to do. Most of these plants have been around a lot longer than humans have. The seeds that created this bloom were made in the past. They finally germinated during this precious wet year, but the whole thrust of the extravagant effort was to make seeds for a future bloom in an outrageous cycle of hope. Godoy and I were standing, accidentally, in the middle of a space-time continuum that had absolutely nothing to do with us. We humans just need to not screw it up.

Then it hit me: the risk of chasing awe, of making it about personal growth, is that you dilute its strongest power. Because improving ourselves really isn’t the point of awe at all. I’d been doing it wrong, and it had taken a 27-year-old human and a cluster of yellow tickseeds to help me realize it. The point is this: by listening, we find a small seam in the universe through which to feel ourselves entirely irrelevant.

 

Sunday, March 2, 2025

Yes, Shrimp Matter

I left private equity to work on shrimp welfare. When I tell anyone this, they usually think I've lost my mind. I know the feeling — I’ve been there. When I first read Charity Entrepreneurship's proposal for a shrimp welfare charity, I thought: “Effective altruists have gone mad — who cares about shrimp?” 

The transition from analyzing real estate deals to advocating for some of the smallest animals in our food system feels counterintuitive, to say the least. But it was the same muscle I used converting derelict office buildings into luxury hotels that allowed me to appreciate an enormous opportunity overlooked by almost everyone, including those in the animal welfare space. I still spend my days analyzing returns (though they’re now measured in suffering averted). I still work to identify mutual opportunities with industry partners. Perhaps most importantly, I still view it as paramount to build trust with people who — initially — sit on opposite sides of the table.

After years of practicing my response to the inevitable raised eyebrows, I now sum it up simply: ignoring shrimp welfare would have been both negligent and reckless.

This may seem like an extreme stance. Shrimp aren't high on the list of animals most people think about when they consider the harms of industrial agriculture. For a long time — up until the last few years — most researchers assumed shrimp couldn't even feel pain. Yet as philosopher Jonathan Birch explains in The Edge of Sentience, whenever a creature is a sentience candidate and we cannot rule out its capacity for conscious experience, we have a responsibility to take its potential for suffering seriously.

We don’t know what it is like to be a shrimp. We do know that if shrimp can suffer, they are doing so in the hundreds of billions. 

Why worry about shrimp in a world where so many mammals and birds live in torturous conditions due to industrial agriculture? The answer is that shrimp farming dwarfs other forms of animal agriculture by sheer numbers. An estimated 230 billion shrimp of various species are alive in farms at any given moment — compared to the 779 million pigs, 1.55 billion cattle, 33 billion chickens, and 125 billion farmed fish.

Shrimp are harvested at around 6 months of age, which puts the estimated number slaughtered annually for human consumption at 440 billion. For perspective: that’s more than four times the number of humans who have ever walked the earth. At sea, the numbers are even more staggering. Globally, 27 trillion shrimp are caught in the wild every year, compared to 1.5 trillion fish.
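As a sanity check, the annual figure follows from the standing population with simple arithmetic. Here is a minimal back-of-the-envelope sketch in Python (my own illustration, using only the numbers quoted above; the reconciliation note at the end is an assumption, since the source doesn't show its derivation):

```python
# Rough consistency check: does a standing population of 230 billion
# farmed shrimp, harvested at about 6 months of age, imply roughly the
# 440 billion slaughtered per year quoted above?

standing_population = 230e9   # farmed shrimp alive at any given moment
grow_out_months = 6           # approximate age at harvest

cycles_per_year = 12 / grow_out_months            # ~2 harvests per year
implied_annual_slaughter = standing_population * cycles_per_year

print(f"Implied annual slaughter: {implied_annual_slaughter / 1e9:.0f} billion")
# Prints ~460 billion, the same ballpark as the published ~440 billion
# estimate, which presumably also adjusts for mortality and for
# species-specific grow-out times.
```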

Despite their size, shrimp are the proverbial “elephant in the room” when discussing animal welfare in food systems.

[---]

The future of shrimp welfare is one of the most underexplored areas in modern animal rights, but its potential for impact is immense. We are only at the beginning of a movement that could fundamentally shift the way we treat aquatic animals — both on farms and for those caught in the ocean. While challenges remain, including entrenched industry practices and global trade complexities, the path forward is becoming clearer with each step taken by animal NGOs and progressive food companies.

For the first time ever, shrimp welfare is becoming a relevant topic within the broader animal welfare movement, one that has traditionally focused on larger animals and more familiar causes. But the staggering number of shrimp affected, their capacity to suffer, and the emerging solutions make this a moral issue we can no longer ignore. Addressing shrimp welfare isn’t just about reducing suffering for billions of animals — it’s about redefining our relationship with the natural world, expanding our circle of compassion, and challenging the limits of our ethical responsibilities.

- More Here


Tuesday, February 25, 2025

Gastronomical Conversations Can Reflect Who We Are, & Who We Are Not

KR: One of the first settings where food and language converge is during family meals. How does this differ from country to country?

MSK: Research shows that in the United States, families talk about whether the food is healthy, whereas in Italy, they talk about whether it’s tasty, which is ironic since there are so many health problems in the US with obesity.

KR: Eating together is not the norm in all cultures. Those who do have family meals often don’t talk while eating — it’s considered distracting. What they want to represent to children is an attentiveness to their food and gratitude for it. In the Marquesas, I found that talking happens while procuring and preparing food, not at meals.

JC: We think of the family meal as something everyone does, but it is closely related to class and race. Those who can afford to, and people who work 9 to 5, can have regular family meals. But not shift workers, those working two or three jobs, or those who come from different cultural traditions. It’s become a moral issue too — the message is that if you don’t do it, you’re missing a really important socializing moment with your children. People are made to feel like they’re failing.

MSK: It’s put up as an ideal today but, at some time in history, children weren’t supposed to eat with parents or talk at the table, so this idea of the family meal as an eternal institution that’s crumbling is wrong.

- More Here


Tuesday, February 18, 2025

Against Optimization

For most of the big decisions we make—about how to govern our societies or how to structure our individual lives—there is a better, wiser strategy for us to follow. Topple the churches to the god of Optimization. Replace them with shrines to a wiser, more caring deity: Resilience.

To see why, we need to draw on lessons from unexpected places, such as the shells of molluscs and the carefully engineered robustness of ant colonies, and to debunk the mistaken interpretations of evolutionary biology that have infected the dominant, but incorrect, view of how our world works.

The popular reduction of evolutionary principles to “survival of the fittest”—with overtones of relentless, flawless optimization—is a tragic mistake. (Many incorrectly attribute the phrase to Charles Darwin, but it was coined by Herbert Spencer.) While it is true that evolution does often fine-tune species to greater fitness over time through natural selection, the ultimate engine of evolution is survival and reproduction—which often requires robustness and the ability to adapt to uncertainty.

A hyper-optimized species that can only survive in one environment will get wiped out if that environment changes. That’s one reason why evolution routinely works in unexpected ways, through what the brilliant evolutionary biologist Zachary Blount calls “the genomic junk drawer.” The specific evolutionary path that a species took—along with plenty of accidental, contingent events along the way—leaves extra stuff in the genome that might at first appear to be junk.

The awe-inspiring genius of our natural world is that evolution provides a mechanism to repurpose that genomic “slack” into something more useful when the environment changes. It’s the evolutionary wizardry of resilient adaptation. That’s why, as Daniel Milo argues, a huge range of lasting species are defined not by optimal solutions, but by “good enough” ones. It’s not survival of the perfectly optimized, but survival of the resilient, as only the most robust inherit the Earth.

For example, nacre, or “mother of pearl,” is one of the oldest and most unchanged biomechanical structures on Earth. With a stunningly beautiful lustre, it gives pearls their sheen and adorns the inner shell of some molluscs. It is largely the same structure from when it first emerged roughly 530 million years ago. (Modern humans have been around for only about 250,000 years, so we might have something to learn from this longstanding byproduct of evolutionary pressure).

Nacre persists because nature is an engineering marvel, producing an ingenious structure that offers a parable for us. The short version is this: at the nano-level, the nacre on mollusc shells has a series of flawed, interconnecting parts that are decidedly un-optimized. The flaws lock together in an irregular brick and mortar pattern, where the “mortar” is organic material that, if needed, can be squeezed out when the material is put under strain.

Moreover, the unique structure creates discontinuities, so if one part cracks, the damage is contained, isolated, decoupled from the rest of the material. To an untrained eye, the structure looks woefully inefficient, wasteful, badly designed. Instead, it’s one of the strongest substances in the world.

This structure provides two initial key lessons for humans—both in our social systems and in our lives. Resilience can often be produced by systems that feature:

Diversity (lots of different kinds of components that work together are more robust than a uniform single structure, just as the Estonian power supply was augmented by a wide array of other electricity sources when one cable was severed);

Redundancy (systems that are designed to work even after an unexpected failure or setback are more robust, a lesson illustrated in the negative by the Suez Canal, which had no backup option when the route became blocked).

The third lesson comes not from molluscs but from ants. It’s resilience from what I call decoupled connectivity, the idea that robustness comes from interconnected support networks—but also that one needs to be able to sever a destructive node when it becomes toxic. Connectivity allows a system to repair itself when under strain, while decoupling allows isolation to contain a devastating cascade.

When ant colonies face a disease outbreak, for example, they exhibit ingenious behaviors. If the outbreak is merely a mild fungal infection, then connectivity saves the colony, as “nurse” ants are swiftly deployed to administer “a formic acid antimicrobial poison to their patients whilst grooming them.” Without the connected network, a mild outbreak could become a devastating epidemic.

However, if the outbreak does become more severe, maintaining connectivity could prove fatal. Then, the colony will pursue more extreme strategies, either of isolation—keeping infected ants away from the healthy ones—or of killing diseased individuals, pruning that node off from the colony altogether. Since outbreaks don’t happen all the time, keeping these mechanisms in place could be thought of as a form of inefficient slack. But when disease strikes, it’s the slack that saves the colony from existential risk.
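To make "decoupled connectivity" concrete, here is a small toy simulation in Python (my own sketch, not from the essay): an infection spreads along a connected ring network, and severing the links of infected nodes, the ants' pruning strategy, contains a cascade that could otherwise sweep the whole colony.

```python
import random

random.seed(42)

def simulate(n=100, p_spread=0.5, steps=30, decouple=False):
    """Toy outbreak on a ring network of n nodes.

    decouple=True mimics the ant colony's strategy: after an infected
    node has had one chance to spread, all of its links are severed.
    """
    # Each node is linked to its two nearest neighbours on either side.
    edges = {i: {(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n}
             for i in range(n)}
    infected = {0}  # patient zero

    for _ in range(steps):
        newly_infected = set()
        for i in list(infected):
            for j in edges[i]:
                if j not in infected and random.random() < p_spread:
                    newly_infected.add(j)
            if decouple:
                for j in edges[i]:
                    edges[j].discard(i)  # neighbours drop the toxic node
                edges[i] = set()         # ...and it drops them
        infected |= newly_infected
    return len(infected)

print("fully connected:", simulate(decouple=False), "of 100 infected")
print("with decoupling:", simulate(decouple=True), "of 100 infected")
```

With connectivity alone, the infection sweeps essentially the whole ring; with pruning, it is typically contained near patient zero (exact counts vary with the random seed). The pruning machinery sits idle most of the time, which is precisely the kind of slack the essay describes.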

One of nature’s overarching lessons is this: what may look to a naive human eye as waste, or inefficiency, or under-optimized slack is often evolution’s secret weapon, providing the adaptive resilience to survive in an ever-changing world.

- More Here


Friday, February 14, 2025

The Languages Lost To Climate Change

Scientists and linguists have discovered a striking connection between the world’s biodiversity and its languages. Areas rich in biological diversity also tend to be rich in linguistic diversity (a high concentration of languages). While this co-occurrence is not yet fully understood, a strong geographic correlation suggests multiple factors (ecological, social, cultural) influence both forms of diversity, which are also declining at alarming rates. These high-diversity areas are also often at the front lines of the climate crisis. Where plant and animal species are disappearing, languages, dialects and unique expressions often follow a similar pattern of decline.

The Arctic may not be an obvious biodiversity hotspot, like the Brazilian Amazon or Tanzania’s coastal forests, but it plays a critical role in regulating and stabilizing the Earth’s climate and supporting life on our planet. Scientists often say that “what happens in the Arctic does not stay in the Arctic,” and any disruption to its habitat has far-reaching consequences for humanity.

Indigenous communities have deep relationships with the land they have occupied for generations, and this close relationship is reflected in the languages they speak — how they talk about the landscape, and how they express the beliefs and customs in which those languages developed. When their relationships with the land suffer, so can their languages. 

For example, Vanuatu, a South Pacific island nation with the highest density of languages on the planet (110 languages across 4,707 square miles), is home to 138 threatened plant and animal species. It is also one of the countries that is particularly vulnerable to sea level rise and climate-related natural disasters. Scientists warn that the climate crisis has become the “final nail in the coffin” for many Indigenous languages, as coastal communities are forced to relocate.

When they can no longer depend on the land, communities may be forced to emigrate to other areas where their languages aren’t spoken, leaving behind not just their mother tongue, but all the wisdom contained in it. There is also evidence to suggest that in cases where a language begins to decline — due to economic or social factors, for example — people may gradually stop caring for the land. When languages are abandoned, the traditional ecological knowledge they carry is also left behind.

“Our language and traditional practices are closely tied to the land,” a community leader from Dishchii’bikoh, a tribally controlled school in Cibecue, Arizona, told researchers in a 2016 study. “In many ways, it is used in describing objects, teaching moral lessons, and expressing our purpose on this land. Since the loss of our traditional language … our traditional ecological knowledge has become more and more threatened.”

Increasingly, Indigenous communities are pointing to the inextricable link between language and biodiversity as evidence that humans are not separate from nature, but very much a part of it.

[---]

Linguistic diversity can be seen as an indicator of cultural diversity more broadly, Gorenflo says, which has traditionally been more difficult to define. “For a long time, anthropology was considered to be the social science that studied culture. But nobody could come to an agreement about what culture was,” he says. “Linguistic diversity is really what we’re using as a proxy for cultural diversity.”

The exact reasons behind the connections between languages and nature are not entirely clear, Gorenflo told me. Previous studies have suggested that areas with a high number of resources create linguistic diversity because people must adapt to more complex environments. But others have argued that it’s because more plentiful resources reduce the likelihood of having to share them and communicate with neighboring groups in times of need. Meanwhile, some research has suggested that the reasons behind this co-occurrence are far more complex and differ from one area to another. Gorenflo emphasized the need for more research. “Understanding this connection is important because it would change how we manage the relationship between Indigenous people and biological diversity — and nature.”

[---]

For Gorenflo, the factors driving the co-occurrence of linguistic and biological diversity, which were initially puzzling, are now becoming even more evident. “I see languages as an extension of the cultural system, which itself is part of the broader ecology of the world,” he told me. “So, it’s less and less of a mystery to me, and more about exploring what this ecology looks like.”

The preservation of endangered languages is about more than saving words — it could be vital to safeguarding centuries of human knowledge and understanding the systems that sustain us.

- More Here


Wednesday, February 12, 2025

How We Treat Non-Human Animals Is Darwin's Greatest Contribution

For over 150 years, Charles Darwin and his work have influenced the fields of science, religion, politics, gender, literature, philosophy, and medicine. Looking back in 2013 at the innumerable changes he has sparked across a number of disciplines, what should be considered Darwin’s most important contribution?

Darwin showed us that we’re animals. He showed us that there’s no fundamental distinction between us and any other critter on the planet. The most important implication of this Gestalt shift may be ethical. As soon as we accept that the human-animal distinction is not fundamental in nature, it becomes difficult to accept a moral code that privileges the wellbeing of human beings but is indifferent to the wellbeing of any other animal. It becomes hard to resist extending our moral concern to any creature capable of suffering, human or not. If present trends continue, the main beneficiaries of Darwin’s great idea may not be human beings. Ultimately, the main beneficiaries may be the other animals we share the planet with.

- More Here from Steve Stewart-Williams


Monday, February 10, 2025

Where Did Trees Come From?

Trees are considered to be evolutionary descendants of ferns, one of the oldest types of plants still around today. These early trees were much shorter than the average tree today and also reproduced with spores rather than seeds. The first known tree fossil dates back to about 385 million years ago, during the Middle Devonian. During this time period, no plant grew higher than roughly waist height. However, to grow higher, plants would need to develop a stronger form of tissue.

The development of wood was a big evolutionary leap and took millions of years to accomplish. Wood is useful for several reasons. The most obvious is structural support, but wood also allows more efficient transport of water. With this new development, early trees could out-compete their neighbors for precious sunlight and store more water to survive droughts.

These early trees formed the backbone of Devonian and Carboniferous forests. This includes varieties like Wattieza and their relatives the Lepidodendrales. The forests that grew and died during this period are the primary source material for all modern coal deposits. Without trees evolving at this time, the Industrial Revolution may never have happened! Even hundreds of millions of years ago, trees were laying the groundwork for modern human advancement.

These trees also grew fairly differently from modern trees. Instead of growing continuously throughout their lives, these plants stayed at a low height for a while. Once a tree had built up sufficient resources, it would “rapidly” shoot up in height to rise above its neighbors and expose itself to lots of sunlight. Rapidly here means faster than a modern tree, but still slow to our eyes. Another difference is the quality of wood; ancient trees used a variety of wood that was easier to create but much less structurally sound. As a result, these trees could not grow very tall and did not have branches.

Trees were due for another evolutionary shift during the Triassic Period. Their method of reproduction shifted from spores to seeds. This is where we see the first example of a gymnosperm. Gymnosperm is Greek for ‘revealed seed’ or ‘naked seed.’ This class consists of many trees that we would recognize today, including any tree that has cones. Gymnosperms include cedars, Douglas-firs, cypresses, firs, junipers, kauri, larches, pines, hemlocks, redwoods, spruces, and yews. While gymnosperms became dominant in the Triassic Period, they first appeared sometime during the Carboniferous Period.

- More Here


Thursday, February 6, 2025

Psychodynamic Nonsense

The art of ‘being for another’ – following, listening to and making sense of another person’s world – has been practised for millennia. Humans have always discussed their lives, their values and their problems, trying to find meaning, solace and joy. Experts at this sort of discussion have been called wise women, shamans, priests – and now therapists. Then, starting with Sigmund Freud, came a series of attempts to create a science of psychotherapy out of it.

But there is very little science to it.

[---]

I became a psychotherapist and psychologist to maximise the good I could do in the world. It seemed obvious that helping people by engaging with the root of their suffering would be the most helpful thing to do. I also became a child psychotherapist to address the roots of suffering in childhood, where they seemed to originate. I experienced how deepening into a feeling could transform it, and learned about pre-natal trauma; I even wrote a doctorate on trauma. Now, two decades into my career, I practise, lecture, supervise and write about all of these things, but increasingly I reject everything that I learned. Instead, I practise the art of ‘being for another’, an idea that arose in conversation with my colleague Sophie de Vieuxpont. I’m a mentor, a friend in an asymmetrical friendship, and a sounding board and critical ally assisting people as they go through the complexities, absurdities, devastations and joys of life.

Along the way, over years of practice, in a complex feedback loop of theory and practice, I lost faith that awareness was always curative, that resolving childhood trauma would liberate us all, and that truly feeling the feelings would allow them to dissipate.

The effect of your family environment matters very little when it comes to your personality

It started with a return to an old interest in evolutionary biology, prompted by the release of Robert Plomin’s book Blueprint (2018). An account of twin studies, the book draws upon decades of twin statistics from several countries, and the numbers were clear: childhood events and parenting rarely matter that much in terms of how we turn out.

That caused me to re-read Judith Rich Harris’s book No Two Alike (2006), which also examined twin studies along with wide-ranging studies of other species. Harris proposed that the brain was a toolbox honed by evolution to deliver sets of skills, leaving each of us utterly unique.

These books are perhaps best summed up in the second law of behavioural genetics: the influence of genes on human behaviour is greater than that of the family environment. I noticed my defences popping up, desperately trying to find holes in the science. But at the end of the day, without cherry-picking data to conform to what I learned in my training, the simple fact was this: twin sisters with identical genes raised in totally different families developed very similar personalities, while adopted sisters with no genetic links raised in the same family had very different personalities.

That finding, from the journal Developmental Psychology, undermined years of learning in psychodynamic theory. It means that the effect of your family environment – whether you are raised by caring or distant parents, whether in a low-income or high-income family – matters very little when it comes to your personality. If you’ve ever had any training in therapy, this goes against everything you have been taught.
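For readers who want the arithmetic behind claims like this, the classic twin-study decomposition (Falconer's formula) is easy to sketch in Python. The correlations below are hypothetical placeholders in the range typically reported for personality traits, not figures from Plomin, Harris, or the study cited above:

```python
# Falconer's classic twin-study decomposition (standard behavioural-
# genetics arithmetic, not a calculation from the works cited above).
# Identical (MZ) twins share ~100% of their genes, fraternal (DZ)
# twins ~50%; both kinds share their family environment.

r_mz = 0.50   # hypothetical personality correlation, identical twins
r_dz = 0.25   # hypothetical personality correlation, fraternal twins

h2 = 2 * (r_mz - r_dz)     # heritability: variance attributed to genes
c2 = (2 * r_dz) - r_mz     # shared (family) environment
e2 = 1 - r_mz              # non-shared environment + measurement error

print(f"genes (h2):             {h2:.2f}")   # 0.50
print(f"shared family env (c2): {c2:.2f}")   # 0.00
print(f"non-shared env (e2):    {e2:.2f}")   # 0.50
```

With correlations in this range, the shared family environment term comes out near zero, which is exactly the pattern the "second law" describes.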

Yet the tenets of psychotherapy did not reflect my clients’ lived experience, or even my own. Instead, we see what we expect to see, and we make sense of our past based on how we feel now. If I am sad, I will recall deprivation and strife in my childhood, while my happier brother remembers a more positive situation; consider the memoirs Running with Scissors (2002), Be Different (2011) and The Long Journey Home (2011), each a radically different depiction of the same family.

In the few longitudinal studies that have been conducted, where we track children and their adverse childhood experiences (ACEs) from the early years to adulthood, there is no link between ACEs and subsequent adult mental ill health. There is only a link between adult mental ill health and the ‘recollection’ of ACEs. This may seem wildly counterintuitive to a profession steeped in trauma theory. ACEs have not been shown to cause mental ill health; it is rather that, when we suffer as adults, we interpret our childhoods as having been bad. I’m convinced that there are rare exceptions to this: truly horrendous childhood experiences that do leave a mark. But even that certainty falters when I consider the fact that events that supposedly traumatise one person in a group fail to traumatise the others.

If you are denying what I’ve just written out of hand, you may be doing what religious fundamentalists have been doing for millennia. What I say may feel heartless, cold or politically toxic, but feelings aren’t epistemically valid grounds for rejecting information.

Our treatments could be largely pointless and potentially harmful

Instead, consider this: it is possible to care about suffering while reassessing your analysis of how it is caused and how it can be addressed. Perhaps a vast majority of therapy trainings are wrong about why people suffer. People in other cultures with radically different worldviews about how suffering develops and how best to deal with it also care deeply about helping people – they simply have a different way of doing it.

We need to reconsider why people suffer to help them in a better way. Freud and more recent trauma proponents like Gabor Maté tell us that our personalities and sufferings stem from how we were treated as children. This may resonate with us, but it could actually be wrong. If it is wrong, our treatments could be largely pointless and potentially harmful, and we need to critically examine these theories more carefully before we, as a profession, do more harm.

Historically, in many cultures around the world, from Nigeria to Malaysia, or in the West more than 50 years ago, childhood has been seen as just one of the stages we move through, with no sacred status. We learn all the time, but suffering stems from how we now, at this time, relate to the world and what our current circumstances are.

Isn’t it a bit arrogant that so many in the West assume that this new, unevidenced theory – that suffering stems from childhood – should be universally true, or even true for us? How does the psychodynamic therapist, faced with their suffering client, feel resolute that they should dredge up the past, when philosophical traditions from across the world say the answer lies in the here and now? The Buddha, Lao Tzu, Aristotle and Jesus didn’t mention a word about childhood’s irreversible stain on the human condition – they saw us as individuals living through choices in the now.

- After decades of practising psychotherapy, I believe it has little foundation in science and often causes harm


Tuesday, February 4, 2025

There Are No Pure Cultures

One of the most important pieces I have read in years - There are no pure cultures: All of our religions, stories, languages and norms were muddled and mixed through mobility and exchange throughout history.

We should see the nationalism of cultural pride for the nonsense it is. It's all freaking intermingled.

Our present appears that way only because we have forgotten our common past. Globalisation didn’t begin in the 1990s, or even in the past millennium. Remembering this older shared history is a path to a different tale, which begins much, much earlier – long before the arrival of international supply chains, ocean-going sailing ships, and continent-spanning silk roads. The tale of globalisation is written across human history. So why do we keep getting the story so wrong?

[---]

You are strolling around a street market, the Grote Markt, in the Dutch city of Groningen, sometime in the 2020s. A lady operating a stall asks a customer if he wants his hummus ‘naturel’, by which she means ‘plain’. He looks baffled as she gestures to the orange, green and purple varieties of hummus on offer. It had taken him some time to try the original stuff – that pale paste that had him eating more chickpeas, sesame seeds and olive oil than all his ancestors combined – so purple hummus will have to wait for another day. He mutters: ‘The authentic one, please,’ and hurries to the opposite stall for the last item on his shopping list: potatoes, the most elementary ingredient of Dutch cuisine. Elsewhere in the market, other customers are searching for their favourite ingredients. Some are seeking whole wheat for a French-style sourdough loaf or Basmati rice for an Iraqi recipe; others are shopping for maize (corn) flour for a Nigerian pudding, tomatoes for a fresh Italian pasta sauce, or olives for a Greek salad.

Marketplaces like this one are perfect sites to observe the flux and mixing of peoples, goods, ideas and mores that we now call globalisation. They are also places where we can begin imagining the longer history of this process.

Many historical markets were established well before our global age. When the Grote Markt started operating in the late medieval era, little of the produce now available to Groningen’s current international community would have been on display. Back then, the people visiting the market would also have hailed from fewer and closer territories, most of them still speaking their regional dialects. In 1493, however, the imaginative horizons of everyday life at this and other European marketplaces suddenly expanded as news of an extraordinary discovery began to circulate: a previously unknown human world existed beyond Europe’s shores. It was a world so unexpected and seemingly so different that it shook Europeans’ consciousness to the core.

Our philosophical notions of ‘the self’ were born from the shock of Europeans discovering ‘otherness’

[---]

For many historians, this ‘early modern era’, spanning from around 1500 to 1800, marks the first stage of globalisation. According to them, this period birthed the first global capitalist economy and integrated world market, began an unprecedented mixing of local cultures and ethnicities, and crystallised the first global consciousness of a shared world. It was so powerful that its effects still endure to this day in diets, languages, economies, social and legal regimes, international balances of political and military power, and scientific frameworks and institutions. The early modern era even shaped our philosophical notions of ‘the self’, born from the shock of Europeans discovering ‘otherness’.

But even this era was not the first global age in human history. It, too, was the product of earlier global movements, encounters and exchanges. In fact, early modern globalisation was merely one accelerated episode of a general process that has been ongoing for tens of thousands of years.

Collective human memory is a partial and imperfect repository of our encounters with one another through time. We are not good at remembering, let alone acknowledging, the ways that these encounters have shaped our present societies, cultures and economies. So, how did we forget?

Globalisation theorists following the sociologist Roland Robertson use the term ‘glocalisation’ to describe how local cultures digest the products of the global market and turn them into something seemingly new. Through this process, incoming goods – technologies, ideas, symbols, artistic styles, social practices or institutions – are assimilated, becoming hybrid recreations that take on new meanings. These recreations are then redeployed as new markers of cultural or class distinction, sedimenting borrowed cultural products in the collective consciousness to the point of misrecognition. And so the global becomes local, the foreign becomes familiar, and the other becomes us. Glocalisation is how and why we collectively forget. Such is the silent trick of every single globalisation in our history: our forgetfulness of it is the method and mark of its success.

Excavating the sources of our identities is made more difficult by our tendency to focus on the uniqueness of the present. By limiting ourselves to the minutiae of the current global moment, we overlook the most obvious manifestations of globalisation’s deeper past. Consider these broad, defining characteristics of human civilisation: our few world religions, our dominant paradigm of written communication, and our widely shared ethical norms of societal conduct. Consider our (quasi-)universal agrarian mode of subsistence, and our single nutritional and psychotropic order, which is based on an incredibly small number of starchy crops (including wheat, maize, rice), domesticated animals (cows, chickens) and stimulants (coffee, sugar) uniformly consumed across the planet. These characteristics predate our current ‘global age’ by millennia. And they are arguably more fundamental features of human culture, and more representative illustrations of globalisation, than either K-pop or the Birkenstock sandal – itself a recent reappropriation of identical or similar products that have been circulating for at least 10,000 years.

Such global phenomena follow a repeated pattern we can easily recognise throughout our history, in which cultural products travelled around the planet through increasingly elaborate connective technologies. Before the internet came aeroplanes and containerships. Before those came the electric telegraph, railways, steamships, the printing press, newspapers, caravels, writing systems, chariots, and horses and camels. Before all of that came the earliest ideographic signs and the first sea-faring ships of the Palaeolithic Age.

Each new connective technology has opened or expanded pathways of mobility and exchange, creating eras of globalisation that have left lasting imprints in human consciousness. Along these pathways, social intercourse turned local languages into global languages and lingua francas – French, Arabic, classical Chinese, Nahuatl, Maya, Greek or Akkadian – which facilitated and intensified cross-cultural relations. As a result, material culture, ideas and innovations were able to circulate more easily during each historical period of exchange. This is how both ‘prehistoric’ jewellery and T-shirts spread across the globe. It is why monotheism and the story of the flood have appeared in so many different places. And it explains why certain ideas, like the theory of humours or quantum mechanics, have become shared ways of understanding the world.

No cultural system of any significance to our existence escapes this pattern of global becoming. Consider the food systems that sustain our existence and culinary practices. When we associate the potato with ‘traditional’ European cuisines or the Irish famine, we forget its Andean origin and the global journeys that eventually made it ubiquitous in family kitchens and fast-food restaurants all around the world. Similar forgotten stories can be told of other globalised staple foods, including the tomatoes and maize that originated in the Americas, rice from East Asia and Africa, and the wheat, barley and olives of Southwest Asia. This forgetting is why many local culinary emblems, such as French wine or American hamburgers, are easily turned into totems and mythologies of national identity. The ‘local’ wine grapes and cattle that flood the world market today are the end-products of global migrations that began as early as the Neolithic Age.

The cultural markers of identity we cherish most jealously – our cuisines, religions, languages and social mores – are products of past globalisations. When we celebrate such cultural markers as ‘authentic’ elements of our identities, we are effectively celebrating our shared human culture, born of a long chain of encounters and exchanges.

Every generation appropriates the inheritances of global exchanges and refashions them as its own. Excavating the sediments our predecessors left in our collective consciousness is not a task that we are naturally disposed to perform. It is an act of remembrance and self-understanding that can destabilise our identities because it counters the processes that endow them with authenticity.

Cultural products travelled around the planet through increasingly elaborate connective technologies

Culture is how we have adapted to our changing environment to sustain ourselves and flourish. Cultures, plural, are the specific manifestations of human culture in different times and places. These two categories – human culture and cultures – are roughly equivalent to the biological idea of the ‘genotype’ (our core code) and the ‘phenotype’ (its variable expressions). The history of our globalisations is the history of how phenotypical variations in human culture have circulated and cumulatively transformed our cultural genotype.

Exclusionist and anti-globalist sentiments come from a confusion of these categories. National or regional cuisines, for example, which anchor feelings of pride in one’s identity and mediate feelings of disgust or contempt for the cuisines of others, are merely variations on a universal human behavioural trait, cooking, that distinguishes us from all other species. Cooking is an extraordinary trait of true significance for our ‘identity’ as a species. Less significant is how different cultures use this or that ingredient.

The invention matters, but equally important is the circulation of those discoveries

The distinctiveness of local cultures is an illusion of scale. When viewed in the long term, their boundaries blur and melt into each other. But the consciousness of an individual or a generation is not capacious enough to span the deep temporality that human culture inhabits. And so, we forget.

The national histories we are taught also erase this long story of cultural movement. They tend to focus on tales of innovation that emphasise moments of creation. In reality, there are few stories of origin and genuine invention.

[---]

Our culture is cosmopolitan because we are a cosmopolitan species. We are citizens of the world, not nations, to paraphrase both Socrates and Thomas Paine. What has allowed us to thrive, physically and culturally, is not our rootedness but our mobility. Without it, we would already be extinct.

Mobility requires freedom of movement. This is a fundamental right we often overlook as we focus our attention on the valuable freedoms that we gained more recently – freedom of thought, belief and expression. Free movement secured our survival and allowed us to flourish on a planet we were not originally adapted to inhabit so widely. Forgetting this precious right makes it easier to succumb to the dominant ideology of rooted difference.

 

Sunday, February 2, 2025

Curbing Animal Testing

I hope this happens soon, as in a few months:

Animal testing is a relic from a bygone era but still promoted fervently by interest groups and government agencies as the “gold standard” in experimental sciences for predicting response in people. That is even though — in drug development, for instance — animals are notoriously poor predictors of drug safety and efficacy in humans. To this end, exclusive reliance on animal testing translates into irrecoverable delays in the development of medicines, missed opportunities due to misguided regulatory principles, and exorbitant costs ultimately passed on to consumers.

A jarring 90-95% of experimental drugs fail in clinical trials after acceptable outcomes data in animals are used to justify their advancement for testing on humans. Moreover, scores of potentially life-saving drugs are prematurely abandoned once they show no benefit in animals, exacerbating an already inefficient animal-centric drug discovery paradigm. Failed oncology trials alone are estimated to cost $50-$60 billion annually. Most new-generation therapies (e.g., cell therapy, immunotherapy) are by design human-specific, and testing on animals is a fool’s errand.

Through decisive actions, DOGE could in principle curb unreliable testing on animals in favor of prioritizing technology-driven, human-relevant alternatives. By doing so, it would — in a single swoop — reduce waste across federal contracts and grants, promote modern drug development, lower healthcare and prescription drug costs, bolster national competitiveness, improve environmental health and safety testing, and modernize practices within all health and regulatory agencies.

Francis S. Collins, the longest-serving former director of the National Institutes of Health, wrote in the journal Nature a decade ago that “preclinical research, especially work that uses animal models, seems to be the area that is currently most susceptible to reproducibility issues.” Consistently, 89% of preclinical studies, most of which involve animals, cannot be reproduced!

Reducing the dependency on the key variable (i.e., animal models) most associated with irreproducibility (e.g., the failure to translate results from the laboratory to the clinic) is one sensible approach to limiting fiscal waste in medical research and, more broadly, healthcare.

The cost of developing a single new drug is a stupefying $2 billion, with an average development time of 10-15 years from target identification in the laboratory to market release, not factoring in withdrawals or recalls. Poor reliability of animal models in the drug discovery workflow compounds sky-high research and development costs and disincentivizes investment in many disease domains. Case in point: 95% of the 7,000-plus rare diseases that affect 25-30 million Americans have not a single FDA-approved drug to treat them.

[---]

Yet to this day, inexplicable delays in implementing the FDA Modernization Act 2.0 (FDAMA 2.0) are causing significant regulatory confusion among drug sponsors. The failure to act on the part of the FDA, the regulatory agency chiefly responsible for implementing this policy mandate, is in turn a good example of government discordancy, if not malfeasance.

In 2023, a bipartisan group of Senators, led by Rand Paul, R-Ky., and Cory Booker, D-N.J., sent a letter to the FDA demanding an explanation for the stultification and an implementation timeline of the enacted law. When no progress materialized, alarmed lawmakers introduced in February of 2024 the FDA Modernization Act 3.0 (FDAMA 3.0) in the U.S. House of Representatives, H.R. 7248  (and later in the U.S. Senate, S. 5046) to assure FDAMA 2.0 implementation and accomplish further improvements.


Saturday, February 1, 2025

Thank You Inspector Hathiram Chaudhary

There are hardly any good Indian movies, but there are some hidden gems in the form of Hindi series.

A colleague told me about the Paatal Lok series a couple of years ago and I was hooked. 

Jaideep Ahlawat as Inspector Hathiram Chaudhary is just brilliant. In the middle of a crowd of crappy actors, Jaideep is an actor showered with talent, probably from up above.

Jaideep Ahlawat is to Hindi cinema what Vijay Sethupathi is to Tamil cinema.

I haven't been to India for almost 2 decades now, but through Hathiram's eyes I am discovering that not much has changed - poverty, power, and pusillanimity seem persistent.

Thank you, sir, for making me lose myself in your art and making me think.

Hathiram Chaudhary: A Hero For Our Times

He is an Indian, Rohtak-born. His precinct is Outer Jamuna Paar in Delhi. His currency of operation is that tough, drain-pipe humanity, which he has to preserve in an increasingly murky world.

High-profile police cases that turn out to be zero-sum games are his to negotiate. Slouch-shouldered and pot-bellied, he goes through a series of spirals only to come upon dead ends.

To do this night after night is to earn those bleary, exhausted eyes that are his signature.

Those eyes have wonderful bags under them that touch us deeply.

[---]

We keep persisting with Paatal Lok's hardbound cynicism because we know that even if wiped out and shattered, we can still come home to Hathiram Chaudhary. We are sure he would let us in with a shrug.

He has a political stance; he most certainly does. But he never uses it as a tool to patronize, instruct, or elevate himself to a higher moral plane.

Does this explain his broad appeal, why he's equally beloved by right-wingers and lefties?

Here's Hathiram's version of liberalism, as unrehearsed as they come.

In the first season, while standing up for a Muslim colleague, he doesn't position himself as the progressive one battling a bunch of bigots.

On the contrary, his actions suggest that steering clear of bigotry is something we all can aspire to.

In Season 2, there's a wonderful scene involving the revelation of a close friend's sexuality, where he rebukes his personal brand of Haryanvi machismo as he lends his support to the slightly embarrassed friend.

"I'm a country bumpkin with no knowledge of gay parades. But if it feels right to you, then that's all that matters," so says the bumpkin, not emphatically but searchingly, and with a faint note of some swear-word bubbling up in his throat.

His inclusive attitude is unique: It may not possess the jingle of a placard slogan, but it surely has the warmth of a hardboiled embrace.

 

Monday, January 27, 2025

Much of the Cuisine We Now Know, and Think of as Ours, Came to Us by War

“Sicily became quite famous for its fruits and vegetables, and that can be traced back to the Muslim era, when the gardens probably began as  pleasure gardens,” says Wright. Pleasure gardens were designed as places of repose, and for Muslims, a reminder of the paradise awaiting the virtuous. “They were eventually turned into ‘kitchen gardens,’” Wright continues, describing them as “experimental horticultural stations” to develop better propagation methods. But at the same time, they were places of beauty. “The gardens were lush with vegetable crops, flowering bushes, and fruit trees, and graced with water fountains and pavilions,” Wright explains in A Mediterranean Feast. During the 300 years that the Arabs ruled Sicily, its agriculture and economy grew, and institutions evolved. In fact, when the Normans seized power, they kept many practices of their predecessors, including the organization of the government and, in the upper classes, the wearing of flowing robes.

Humans are bound to food by necessity first, and then by choice. The types of food you eat distinguish your country from another country, your group from another group. When new influences come—whether from conquest or colonial exploration or the popularity of a TV cooking show—there is a period of adaptation, and then often the full incorporation of a new technique or ingredient into the country’s culinary lexicon. The potatoes and tomatoes that went from the New World to Europe in the Columbian Exchange of the 15th century were first scorned by Old World diners who feared they were poisonous, then in time became emblematic of their cuisines. In its original form, Sicilian caponata would never have been made with tomatoes, but today there are versions that include them and they are considered perfectly Sicilian.

Food constantly evolves, as do taste buds. To the Western palate, Japanese food seems so distinctly Japanese, yet it went through many modifications once the country opened to the West in the 19th century, explains Katarzyna Cwiertka, the chair of modern Japanese Studies at Leiden University and a scholar of East Asian food. “New ingredients, new cooking techniques, and new flavorings were adapted to Japanese customs,” she says. “The changes were really tremendous.”

Military canteens played the role of first adopters. Once Japanese soldiers became accustomed to a food, they would eventually introduce it to the wider public when they returned to civilian life. Such was the case with curry, which started appearing in Japan in the late 19th century. It was a borrowing not directly from India, but from the British Empire. “The Japanese start to serve it as a  Western food,” says Cwiertka. “It enters military menus and canteens and continues after [World War II] into school canteens. By the 1950s and 1960s it is a national dish. When you ask Japanese students abroad what they crave most, they would say ramen or curry. And ramen [of Chinese origin] is also not a Japanese food.”

What the Japanese have done—over and over again, Cwiertka points out—is move foreign foods into the category of washoku, the genuinely Japanese. They adapt and absorb foreign culinary influences this way. “It’s more like the invention of a tradition than a tradition,” she says.

- More Here



Saturday, January 25, 2025

What Are The Odds?

This week, on Wednesday, within a span of a few hours, I saw an owl, and then even Neo got startled when two bald eagles flew over Max's Walden.

Friday, January 24, 2025

How Some Trees Evolved to Birth Live Young

Typically, a seed’s number one job is to have patience. Before it grows into a new clover or pumpkin vine or oak tree or hydrangea, it has to wait. Only when conditions are just right will the seed sprout, which gives it the best chance of survival.

Yet for a few tree species, the seed’s job is different. It doesn’t wait. It starts growing right away, while still attached to its parent plant, and only separates later. Trees that do this are called viviparous, or live-bearing. It’s the same name scientists give to animals, such as humans, that birth live babies instead of laying eggs.

Despite the unexpectedness of this trait, researchers studying the genetics of viviparous trees recently showed that the pathway to their evolution might have been surprisingly simple.

While viviparity is rare among trees in general, it’s common among the mangroves, roughly 80 species that live on warm coastlines around the world. These trees are already unusual, as they absorb water that’s up to 100 times saltier than what most plants can tolerate.

A live-birthed baby mangrove doesn’t look like a chubby infant, or like a miniature adult mangrove. Instead, it’s like a string bean with a bulbous cap, topped by a little crown of roots. The babies hang from their parent tree in clusters, and when they reach a certain stage of development they drop straight down into the mud or sand below, says Yingjia Shen, a researcher at China’s Xiamen University. 

If the tide is out when the baby mangroves fall, their roots grow rapidly, Shen says, with the plants starting to take hold within a few hours of hitting the ground. In other cases, though, the young plants may take a journey. Baby mangroves are buoyant, and “those that fail to root in the mud can drift in the ocean currents for several months,” Shen says, “potentially reaching coastlines thousands of kilometers away and taking root there.”

- More Here


Two Types of Uncertainty

Uncertainty is thus not an intrinsic property of events, Spiegelhalter writes, but rather a reflection of the knowledge, perspective and assumptions of the person trying to understand or predict those events. It varies from person to person and situation to situation, even when the circumstances are identical. It is subjective and shaped by what we know or don’t know at a given time.

Spiegelhalter distinguishes two main types of uncertainty: aleatory uncertainty, that which we cannot know, and epistemic uncertainty, that which we do not know. Understanding this distinction is crucial for making informed decisions. Whereas aleatory uncertainty is often irreducible, epistemic uncertainty can be minimized through better data collection, refined models or deeper investigation.
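The distinction is easy to see in a small simulation (my own sketch, not from the book): no amount of data shrinks the spread of a fair die's outcomes, but more data steadily shrinks our uncertainty about a coin's unknown bias.

```python
import random
import statistics

random.seed(1)

# Aleatory uncertainty: the spread of a fair die's outcomes is a fixed
# property of the process. More data estimates that spread more
# precisely, but the spread itself never shrinks.
for n in (100, 10_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(f"{n:>6} rolls: stdev of outcomes = {statistics.pstdev(rolls):.2f}")

# Epistemic uncertainty: the coin's bias is a fixed but unknown number,
# so more data genuinely reduces what we don't know. With a uniform
# prior, the posterior for the bias is Beta(heads + 1, tails + 1),
# whose standard deviation shrinks roughly like 1 / sqrt(n).
true_bias = 0.7  # hypothetical; unknown to the observer
for n in (10, 1_000):
    heads = sum(random.random() < true_bias for _ in range(n))
    a, b = heads + 1, (n - heads) + 1
    post_sd = (a * b / ((a + b) ** 2 * (a + b + 1))) ** 0.5
    print(f"{n:>6} flips: posterior sd of bias = {post_sd:.3f}")
```

The die's standard deviation hovers near 1.71 however many rolls we collect, while the posterior standard deviation of the coin's bias drops by roughly a factor of ten as the flips go from 10 to 1,000.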

- Review of the book The Art of Uncertainty: How to Navigate Chance, Ignorance, Risk and Luck by David Spiegelhalter


Sunday, January 12, 2025

The Insanity Of War Has Returned To Our World

Noah Smith penned a heartfelt piece for the new year. Thank you!

Just because someone is in their 20s, they don't magically eat food from their ass and poop via their mouth. Agreed, that's biology. Warnings from history are different from innate biological warnings. As I get older, I see most of the old and new sapiens trying to force their dinner via their ass.

Alas, I wish the warnings of history were carried in some sort of epigenetic memory for sapiens. Or maybe the lack of such memories is this planet's way of bidding adios to a species.

If you think about this idea from first principles, its fundamental insanity becomes apparent. Spend a few days in Taiwan, and tell me honestly if there is anything wrong with it — some terrible injustice that needs to be corrected with saturation missile strikes and invasion fleets. You cannot. The people here are happy and wealthy and productive and free. The cities are safe and clean. There is no festering racial or religious or cultural conflict, no seething political anger among the citizenry. Everyone here simply wants things to remain the same.

And yet there is a good chance they may not be granted that wish. High explosives may soon rain down on their homes and their families, and an army of stormtroopers may march in and take all of their freedoms from them. And if this happens, it will be because of the will of men far away — an emperor on a throne, generals hungry for glory, bored malcontents behind a computer screen. If the peaceful, unthreatening people of Taiwan suffer and die, it will be because those distant men decreed that they should.

Why would you do this?? Why would anyone want to launch wars of conquest? The world has progressed beyond the economic need for warfare — China will not become richer by seizing the fabs of TSMC or the tea plantations of Sun Moon Lake. The mostly stable world created in the aftermath of the Cold War was good not just for Taiwan, but for China as well. Why topple it all chasing a dream of empire?

The only possible answer here is that the world is created anew each generation. We still call China by the same name, we still draw it the same way on a map, but essentially all of the people who remember the Long March, or the Rape of Nanking, or the Battle of Shanghai are dead and gone. The hard-won wisdom that they received as inadequate compensation for suffering through those terrible events has vanished into the entropy of history, and their descendants have only war movies and books and half-remembered tales to give them thin, shadowed glimpses.

And so the new people who are now “China” are able to believe that war is a glorious thing instead of a tragic one. They are able to imagine that by coloring Taiwan a different color on a map, their army will redress the wrongs of history, bring dignity to their race, spread the bounties of communist rule, fulfill a nation’s manifest destiny, or whatever other nonsense they tell themselves. They imagine themselves either insulated from the consequences of that violence, or purified and ennobled by their efforts to support it.

They do not understand, in the words of William T. Sherman, that “war is destruction and nothing else.” Nor do they think very hard about the future of the world their short, glorious conquest of Taiwan would inaugurate — the nuclear proliferation, the arms races, the follow-on wars. The German and Russian citizens who cheered their armies and threw flowers as they marched to the front in 1914 could not imagine Stalingrad and Dresden thirty years later. We have seen this movie before.

[---]

And whether the U.S. is even committed to global freedom in the abstract is now an open question. The fabulously wealthy businessmen who have the greatest influence in the new administration openly mock the courageous Ukrainians who stayed and risked death to defend their homes and families from the rape of Russia’s invasion — even though if war ever came to their own doorstep, they would be the first to flee, clutching their Bitcoin to their chests like sacks of gold. An aging Donald Trump indulges in idle fantasies of staging his own territorial conquests in the Western Hemisphere, LARPing the new fad for imperialism even as his peers practice the real thing overseas.

America, like every other nation, has been created anew as the generations turned. This is not the America of Franklin D. Roosevelt, or even the America of Ronald Reagan. My grandparents are dead. Their hard-earned warnings are abstract words fading into memory, and I wonder if the world they won will outlast them by much.

And so across the sea, the old stormclouds gather again. In the seas around Taiwan, an armada assembles. Across the strait, the emperor orders a million kamikaze drones, hundreds of nuclear weapons, a forest of ballistic missiles, and a vast new navy. In Taipei, the sun is out, and people sip their tea, and eat their beef noodle soup, and try not to think too hard about whether this will be the year the old world finally gives way to the new.

Tuesday, January 7, 2025

Meta Values - 37

I tried meditation for 8 years without significant "insights." Then I realized that being with Max, we were not meditating for 20 minutes or a few hours; instead, we were living a meditative life.

My life and every moment in it is meditation in itself, with my awareness fueling it as much as I humanly can.


Sunday, January 5, 2025

Your Life Is Not A Story

So, why is this a problem? One issue is complexity. Seeing yourself as the main character in a story can overly simplify the fullness of life. Think of the way in which people talk about their ‘journey’ through life. Through this narrative, certain events become more significant while others are overlooked, and random events can be reframed as being part of some grand plan. Yet viewing our lives in such a narrow way hinders our ability to understand the complex behaviour of others and ourselves. For example, a child that accepts the narrative of being ‘naughty’ may incorrectly frame their behaviour as bad, rather than as an expression of their unmet needs. Stories can change us by locking us into ways of acting, thinking, and feeling.

[---]

So, what does it mean to reject a narrative? Living in a non-narrative way means rejecting a particular identity, and instead seeing life and meaning as a set of open choices. For the waiter, rejecting his narrative identity would mean acting in a way that reflects his choices and sense of self, not just the story he tells about himself.

To understand what is involved in rejecting a narrative, it is important to remember that narratives do not exist outside of people’s minds. The stories we tell ourselves are not out there in the world. They are tools that mediate our relationships with the world. Though they relate to facts, and real events, they are not factual. In fact, they are neither true nor false. Instead, stories help us make sense of things. So, if we rejected the power of narratives to sequence events in our lives, how else would we organise our thoughts about the world?

Think of the ways that perspectives organise experiences differently. By ‘perspective’ I mean something more complex than ‘point of view’. I’m referring to the way we engage with the world from a particular position or orientation that draws our attention to aspects of experience, like how our visual ‘perspective’ allows bright colours to show up more easily than dull ones. Perspectives are shaped by our place in the world, our beliefs, values and what we think matters. As the philosopher Elisabeth Camp explains, a perspective ‘helps us to do things with the thoughts we have: to make quick judgments based on what’s most important, to grasp intuitive connections, and to respond emotionally, among other things.’ Through perspective some features of our experiences ‘stick out in our minds while others fade into the background.’

Poetry captures a way of seeing and feeling, not just a sequence of events

Perspectives, then, determine the narratives we adopt. In other words, our core beliefs and values shape the way we see things and what we take to be important in our experiences. It is our perspectives that generate our narratives. Perspective also explains why our narratives can differ so radically from those of other people, even when we experience the same events. But once we understand these perspectives, we can see how flexible our narratives can truly become. Perspectives, it turns out, don’t have a linear, ordered structure. We can’t think of them in terms of sequences of events, like stories. In some ways, perspectives are better represented by the non-linearity of poetry.

[---]

And so, instead of just changing our narratives, we should learn to understand the perspectives that shape them. When we focus on our own stories, we live life as we already know it, but by loosening the grip that stories hold over our lives – by focusing on the perspectives of ourselves and others – we can begin opening ourselves up to other possibilities. We can adopt new orientations, find significance in new places, and even move toward the exciting unpredictability of shared perspectives.

As Sartre warned, everything changes when you tell a story. Narratives limit our potential. Though we are complex beings, living in a chaotic universe, our stories create the illusion that our lives are ordered, logical and complete.

We might never fully escape the narratives that surround us, but we can learn to change the perspectives behind them. And so, we are never bound by stories, only by our ability to understand how our beliefs and values shape the way we perceive and engage with the world. We don’t need better narratives; we need to expand and refine our perspectives.

- More Here


Thursday, January 2, 2025

The Virtues of Virtue Signalling

This brings us back to the original question motivating this essay: Should you be public about your charitable donations? I have been worried that doing so would be perceived as self-aggrandizing or obnoxious virtue signaling – that such immodest boastfulness would reduce the moral value of these actions. However, by concealing my donations, I have missed the chance to also spread the norm of effective giving. This wasted opportunity strikes me as a moral mistake in itself. Clearly signaling our virtuous behavior can have an important moral role, not in highlighting our character, but rather in spreading pro-social norms.

Norms are sticky, and changing them is costly. However, they are not completely fixed. We should celebrate the early movers willing to incur these costs for changing norms. While virtue signaling is often perceived as objectionable, engaging in costly virtue signaling to shift social norms toward a more pro-social and moral equilibrium seems like an admirable thing to do. For this reason, I have decided to add my name to the list of almost 10,000 people who have signed the 10% Pledge, and to explain my reasons for doing so in this essay. Thereby, I hope to also inspire others to do the same.

- More Here