Sunday, August 31, 2025

Buddha, Socrates, and Us: Ethical Living in Uncertain Times

Review of the book Buddha, Socrates, and Us: Ethical Living in Uncertain Times by Stephen Batchelor:

“Raised in Britain as a post-Christian secular humanist and trained in Asia as a Tibetan and Zen Buddhist monk,” Stephen Batchelor writes at the end of his book, Buddha, Socrates, and Us, “I find that I can no longer identify exclusively with either a Western or an Eastern tradition.”

Decades of dwelling in these traditions—each with its own intellectual, spiritual, and philosophical riches—have left him strangely homeless. Far from making him unhappy, though, this state of existential homelessness has given Batchelor access to what he sees as a higher life. For, while “unsettling and disorienting” at times, such “spaces of uncertainty seem far richer in creative possibilities, more open to leading a life of wonder, imagination, and action.”

At its core, Batchelor’s Buddha, Socrates, and Us may be read as a response to a simple, yet important observation: everything in life tends to fall into patterns, to settle into habits and routines. Not even matters of the spirit—religion and philosophy, beliefs and ideas, thinking and writing—seem to escape this fate. Such mindless repetition makes our lives easier and more comfortable, at least on the outside, but to do things mechanically and unthinkingly is to invite emptiness and meaninglessness into our existence. The older we get, the more spiritually ossified we become. Eventually, if nothing challenges us, our slumbering existence will be indistinguishable from spiritual death.

That’s what awakenings are for.

Batchelor focuses primarily on two masters of awakening: Gotama and Socrates. As chance would have it, they were contemporaries, even if they lived worlds apart. For all the cultural differences between fifth-century BCE India and Greece, however, Batchelor identifies a series of compelling parallels, from the merely anecdotal to the more substantive, which make his book a delight for any comparatist of cultures. His narrative shuttles nimbly between the two figures, between East and West, the Indian world and the Greek one, in a compulsively readable way. Batchelor is not only a seasoned practitioner and scholar of Buddhism but a gifted storyteller to boot.

[---]

“I want to make Buddhist and Greek thought more than merely compatible,” Batchelor writes at the beginning of this book. “I am seeking a new language, a synthesis that would transcend the binary of East/West, Greek/Buddhist altogether.” But to find such a voice, he adds, “I first have to come to terms with the Greek and the Buddhist inside me.” The project has taken him a lifetime, and he is still on the road. The delay may be by design, though. For the point of a project like his may be never to settle into a destination, but to keep your head on fire for as long as you can.


Friday, August 29, 2025

Boredom!

Broadly speaking, however, boredom is usually thought of in one of two ways. The first is as a deficit of meaning— a sense of purposelessness and existential disinterest. The second is as a deficit of attention— a state where the mind is unoccupied and without focus. The first, what may be called ‘cultural boredom’, is abstract, less urgent, and entirely man-made— an unintended byproduct of natural selection’s misalignment with modern life. The second, what may be called ‘biological boredom’, is much older and more primal— an evolutionary function woven into the logic of selection and shared across species. In short, there is boredom as defined by nature and boredom as defined by humans.

But perhaps the more useful distinction is between feeling bored and being bored. Like the difference between feeling alone and being alone: one is a psychological state and the other is a material circumstance. One a measure of subjective experience, the other a measure of objective conditions. What this means is that feeling bored is self-reported— you can only know if someone feels bored by asking them. But to know if they are bored, you don’t need words— you need observation: brain scans, environmental situations, introspective awareness, etc.

Of all the available measures, arguably the most comprehensive marker of being bored is the activation of what neuroscientists call the brain’s ‘default mode network’. The term refers to a network of brain regions that are more active when an individual is not focused on the outside world and is instead engaged in internal thoughts, daydreaming, self-reflection, or other forms of “mind-wandering”. Put simply, it is the brain alone with nothing but its own thoughts.

In the most absolute terms, being bored is the condition of nothingness. It is not something felt but something endured; not a disturbance but an absence. It is the blank space behind every human experience; the raw foundation of existence before feeling or meaning is imposed. Being bored is not just a pause in the dopamine drip, but the void into which thought, desire, and distraction rush to take shape. It is not merely the opposite of stimulation— it is the state from which all stimulation is an escape. There is boredom, and from it, everything else grows.

[---]

Americans today live a more cushioned existence than any population—human or otherwise—in history. Parents no longer watch half their children die before the age of five. Smoke-suffocating factories no longer employ eight-year-olds until they run out of fingers to lose. Our laws no longer punish people for baking bad bread by boiling them alive. The flu no longer sweeps through towns like a biblical reckoning. Cognitively impaired 13-year-olds no longer inherit absolute power; and we no longer expect to drop dead before the age of 30.

The average American lives in a home with climate control, clean water, and a refrigerator stocked with more food, flavor and convenience than kings of old ever commanded. Thomas Jefferson, one of the richest Americans of his time, never got through a winter without being cold in his own home, complaining that his pens would freeze and spending every morning chiseling the ice off his writing desk. Even the poorest among us have access to medical care, public education, and safety nets that would have been unthinkable in previous centuries. The daily struggles that once consumed our ancestors—finding food, surviving disease, avoiding violence—are no longer the defining features of life.

[---]

In a society where we have no thing—no predators, no famine, no war—to torment us, nothing itself becomes the torment. In other words, leisure curdles into boredom— a hell so unbearable that international law recognizes it as a form of torture (solitary confinement) and NASA assigns its astronauts busywork to safeguard against it. Studies have found we’d rather blast our ears with the sound of a screaming pig than listen to nothing, and that we’ll engage in dangerous behaviors such as drug use, erratic driving, or high-risk financial speculation just to avoid boredom. One study of particular interest, published in Science, found that most of us (two-thirds) would rather self-administer electric shocks than endure just 15 minutes devoid of external distraction, with a similar majority saying they’d pay money to never experience such boredom again.

It is important to note that all the aforementioned examples of boredom involve individuals unable to sit alone with their thoughts, trapped in the claustrophobia of sensory-deprived consciousness. This is, of course, not the kind of boredom most Americans today are experiencing— opening apps, closing apps, reopening the same apps. The kind where, instead of being unable to sit with it for 15 minutes, we are unable to sit without it for more than 15 minutes— jonesing for another hit of flickering junk; compulsively returning to it over and over and over. The kind that feels like boredom, looks like boredom, but, in being, is not boredom. It is— cultural boredom.

If biological boredom is a negative feeling triggered by the absence of stimulation, cultural boredom is the feeling of boredom untethered from that absence— an emotional layer we’ve learned to apply even when nothing is missing. In other words, biological boredom is a response to being bored; cultural boredom is the invention of boredom as a feeling in itself— freestanding, self-replicating, and often entirely disconnected from a stimulus gap.

[---]

In order to endure boredom, one must have a purpose— but in order to discover that purpose, one must first endure boredom. Put another way, the precondition for meaning is the willingness to be with its absence— because it arrives not when we want it, but when there’s nothing left to distract us. In order to find a new direction, we must stop moving forward. Just as the chalkboard must be erased before anything new can be written, the mind, like a cup, can only hold what it makes room for. What is not empty cannot be filled. As the philosopher Martin Heidegger put it:

“The nothing is what makes possible the openness of beings as such for [emergent awareness].”

That is to say, the openness we rush to fill—with stimulation, with pleasure, with clicks—is the only space where purpose, drive, creativity, even just basic decision-making can take root. It’s not what boredom gets you— it’s what it frees you from. You can’t force insight. You can’t download creativity. You must sit, watch and then catch it—as it drifts through the open space of awareness—by paying attention. It is what the poet John Keats called “negative capability”— the ability to remain present with uncertainty and unknowing, without lunging for premature relief. 

In other words, we must give up control and make room for disorder.

[---]

As the composer John Cage put it: 

“If something is boring after two minutes, try it for four. If still boring, then eight. Then sixteen. Then thirty-two. Eventually one discovers that it is not boring at all.”

The shift is not so much in what you see, but in how you see it. It’s a recalibration of what we find worthwhile— one that exchanges the frenzy of getting for the depth of being. Boredom turns us inward, towards nothingness, not to escape the world, but to meet it more fully— without decoration, without desire or demand. Here, the mind stops chasing and starts noticing— what once felt empty reveals itself as open. The path to peace isn’t paved with novelty; it’s cleared of clutter. 

- More Here


Thursday, August 28, 2025

How to Be a Good Intelligence Analyst

Such a wonderful piece! Highly recommended.

Please read the whole thing here

Because learning institutionally is hard?

Learning institutionally is hard. Not only is it hard to do, but it's also hard to measure and to affect. But, if nothing else, practitioners became more thoughtful about the profession of intelligence. To me, that was really important. The CIA is well represented by lots of fiction, from Archer to Jason Bourne. It's always good for the brand. Even if we look nefarious, it scares our adversaries. But it's super far removed from reality. Reality in intelligence looks about as dull as reality in general. Being a really good financial or business analyst, any of those kinds of tasks, they're all working a certain part of your brain that you can either train and improve, or ignore and just hope for the best.

[---]

What do American intelligence analysts do if not the fun stuff from the Bourne movies?

They read, they think, they write. They write some more, they edit, they get told their writing sucks. They go back, they start over again. Some manager looks at it and says, "Is this the best you can write?" And they say, “No.” And they hand it back to them, and off they go to write it again. It’s as much of a grind as any other analytic gig. You're reading, thinking, following trends, looking for key variables.

Analysts who are good on their account generally have picked up very specific tips and tricks that they may not even be able to articulate. The best performers in the agency had a very difficult time explaining how it was they went about their analysis, and articulating their expertise. That's not unusual. Experts really aren't very good at articulating why or how they're experts, but we do find that after 10,000-ish cases, they get better, because they're learning what to look for and what not to.

That comes with some penalties. The more hyper-focused you are on topic X, the less likely you are to think that topic Y is going to affect it. And often it's topic Y that comes in orthogonally and makes chaos. “How do you create expert-novice teams?” was a question that we struggled with: finding the right balance between old and new hands, because you wanted the depth of expertise along with the breadth of being a novice. Novices would try anything because nobody told them they couldn't. That's a very valuable thing to learn from. If you're an analyst or an analytic manager, the challenge is how to balance that structure.

[---]

That old model seems more James Bond-y. The character goes more places for the movie at the cost of effectiveness.

A consistent problem is that the effectiveness measures are poorly articulated and poorly understood by both the consumers and the customers. The best consumer of intelligence that I have ever interacted with was Colin Powell. He had a very simple truism: “Tell me what you know, tell me what you don't know, then tell me what you think, so that I can parse out what you're saying and make sense of it.” He was a remarkably savvy consumer of intelligence.

Not all consumers are that savvy. Many of them would benefit from spending a little time learning more about the community, understanding the relationship with their briefers and analysts. The more engaged the policymakers are in learning about intelligence, the more savvy they'll get as consumers. Until then, you're throwing something over the transom and hoping for the best. It's not a great way to operate if you have consumers who want your product.

Who were some relatively poor consumers of intelligence information?

There are so many. Dick Cheney was not a poor consumer of intelligence. He just had an agenda, and he understood the discipline well enough to exercise that agenda. [Donald] Rumsfeld was not good. And [Paul] Wolfowitz was much worse at it than he thought. There were some others in that administration, and I don't mean to pick on them. There were plenty of lousy consumers under Obama and under Clinton. Not a lot of them take enough time to really think about what they're getting.

The biggest problem that I have found with ambassadors, generals, or other consumers is they'll go out into the world, shake hands with their counterpart, and decide based on that interaction that they understand their counterpart better than anybody else does. "I went to lunch with so-and-so, I should know." The problem is that so-and-so is not going to tell you the truth. If so-and-so is going to do something, going to lunch with him probably isn't going to be very revealing. He's probably going to tell you what you want to hear. You'd be surprised how many consumers don't even think about that possibility. It boggles my mind.

It is funny you mention Donald Rumsfeld as a poor consumer of information, because one of his famous truisms was that he wanted you to explain your “known knowns” and your “unknown unknowns.” My first impression would be that he’d be a good consumer.

The problem with the Rumsfelds and the Kissingers is that maybe they are the smartest person in the room, but maybe they should stop believing that for a while. That gets in their way. They just assume from the jump that they're smarter than everybody. Not just everybody individually, but everybody collectively. There's a certain amount of ego that goes along with all of this. When the ego gets sufficiently inflated, you reject information that is contrary to your own values, mental model, and thought processes. You assign outlier status to anything that doesn't conform with the way you think about a problem. That's expertise run amok.

That's where people like Rumsfeld or Kissinger come off the rails. They just assume, "Well, I'm smarter than everybody, so I'll figure it out. You just give me raw data." I have not seen a terribly successful model of that. It's better to walk into a room and assume that you're not remotely the smartest person there. You're doing yourself a cognitive disservice if you think you're cleverer than everybody else. It's a rookie mistake, but you see it over and over, and if it works for you and you keep getting promoted, eventually you start to believe it.

It doesn't seem like a rookie mistake to me. It seems like the mistake of a seasoned professional.

You're right. It is a longevity error.


Sunday, August 24, 2025

Happy Birthday Fluffy!

Max's gal turned 9 years old today! 

Stuart Kauffman coined the term "Adjacent Possible".

In biological evolution, this means that new structures and functions arise incrementally, with each step opening up new possibilities for further evolution.

For example, the development of lungs in fish enabled the possibility of terrestrial life, creating entirely new evolutionary pathways.

Things arise incrementally to eventually form a complex system.

Fluffy is my "Adjacent Possible" for my time with Max. She inherited so much of Max, having grown up with him since she was a kitten, that she now unveils an adjacent possibility of how Max would have been if he were alive today.

It's so beautiful. Of the five non-human animals in the house, she is the only one who, like Max, wants to sleep with me.

Happy Birthday my gal!


Friday, August 22, 2025

The Future of Climate Change Is on Mauritius - People's Home vs Traveling Morons Paradise

I am going to put it in simple terms:

This "travel" disease almost all humans have is the new imperialism.  

This destroys ecology, animals, economies, health, and god knows what else. It's sheer stupidity.

The simple explanation (or causal reason) behind this disease was described aptly by Pascal centuries ago. 

All of humanity's problems stem from man's inability to sit quietly in a room alone.

Thanks to the book Eat, Pray, Love and its movie adaptation, add women, in upper case, to Pascal's quote.

This beautiful paradise of an island named Mauritius has already been decimated by nothing but travelers' diet and shit.

Read this piece, weep, reflect, and eradicate this disease from your system. 


The United Nations Development Program said our beaches have shrunk by as much as 20 meters in the last few decades, and that the loss of tourism could cost us over $100 million per year by 2060 if nothing is done to save our coastline.

December 2022. Our November rains are expected in mid-January. Our reservoirs are 3 percent full. It’s the worst drought since the early 2000s.

There’s nothing to do but swim. We listen to the radio for jellyfish warnings: “Explosion” is the word of choice experts use to describe the creatures who’ve smothered every coastline. Manifestations of a sick ocean, they spawned due to warmer temperatures, overfishing, changing weather patterns.

Today the sky is postcard-perfect, the sea devoid of jellyfish, the beach packed with tourists. 

I think of the carbon emissions of each plane that lands here. The emissions of each of our 106 hotels. Air conditioning units struggling to cool rooms in peak season. Tourists pouring themselves a bath, cleansing themselves of their 12-hour flight. Ignorant that the rest of us have to live on only four to eight hours of water flowing through our taps most days in high summer. Tourists, their sunscreen-coated bodies plunging into the lagoon, leaving a film on the water, poisoning corals. Tourists, delighting in our bathwater lagoon, look it’s so crystal-clear you can see the bottom, a dead zone framed in buoys, cleansed of most of its creatures.

[---]

There were only four Mauritius kestrels in 1974; they are endemic to Mauritius, and were, at the time, the most endangered bird of prey in the world. Colonialism had quite a lot to do with their decline, practically from the moment the Dutch set foot here in the 17th century: the colonists shrivelled our forests, brought rats on their boats. Three hundred or so years later — after the French and English colonial administrations had their go, pillaging the environment; after they’d driven species to extinction; after Mauritius claimed its independence and multiple economic booms and further, consequential ecological devastation — the kestrels were left with almost no homes. By 2009, however, they’d flourished to around 600 individuals, thanks to the work of the Mauritian Wildlife Foundation and other organizations. They are beautiful animals, though I’ve never seen one in the wild before. Their fluffy white breasts are spotted with brown, as if they’d been dotted over with a thick brush. We have 1 percent of our natural forests left and they live there, up in the Bambous mountains and in the Black River Gorges National Park.

[---]

I read books written mostly by white men in supremely rich countries on how to think about climate disaster. Some concepts I understand in my body: global warming as a hyperobject, heat like honey glistening all over my skin, so viscous that showering won’t remove the stickiness.

I read books that trace the contours of my lifeline. The statistics that predict our future, that suggest the manner of our deaths, the stages and degrees at which our bodies will gradually shut down.

“Recently, researchers estimated that by 2050 as many as 150 million people in the developing world will be at risk of protein deficiency as the result of nutrient collapse,” writes David Wallace-Wells in The Uninhabitable Earth. “138 million could suffer from a deficiency of zinc, essential to healthy pregnancies; and 1.4 billion could face a dramatic decline in dietary iron — pointing to a possible epidemic of anaemia.” I’m already borderline anaemic, like many women in my country and their mothers. In the ministry of health’s Health Statistics Report 2021, 38 percent of all Mauritian women who received antenatal care in public hospitals were reported as anaemic.

“Sudden rainfall shocks — both deluges and their opposite, droughts — can devastate agricultural communities economically, but also produce what scientists call, with understatement, ‘nutritional deficiencies’ in foetuses and infants,” writes Wallace-Wells.


Tuesday, August 19, 2025

Is God a Mushroom?

After all, our species began as forest-floor foragers, in regions where psychedelic mushrooms grew plentifully in the dung of the very cattle they later domesticated. Like many other animals, we also seem to possess what the psychopharmacologist Ronald K. Siegel calls an “intoxication drive” — an impulse to seek inebriation in order to alter or expand our consciousness, equal to “the basic drives of hunger, thirst, or sex.” 

“Drug-induced alteration of consciousness preceded the origin of humans,” psychedelics researcher Giorgio Samorini writes. “It is an impulse that manifests itself in human society without distinction of race or culture; it is completely cross-cultural.”

[---]

In the cognitive science of religion, the dominant explanation for the origin of the belief in gods has long been to blame a “hyperactive agency detection device”: in other words, an inclination, coded into our brain, to imagine threats where there are none — to imagine an active threat behind a rustling bush or bubbling water. But recent studies have challenged the viability of this explanation. 

Hyperactive agency detection is not correlated with religious belief, they say. Besides, our cognitive models are, ultimately, based on our embodied experiences. Why would we presume agency behind every undetermined stimulus, they ask, without past experience to inform our caution? And just how could our god-belief be so universally, cross-culturally encoded, if it is based on something we have never, in any capacity, experienced?

But what if God had already shown his face to us, had been here from the very beginning? What if God wasn't a man, or a power, or a hidden threat — what if God was, this whole time, a mushroom?

[---]

One of the weird things about the mystical experiences occasioned by psychedelics is how universal and non-sectarian they tend to be. “It’s not uncommon for subjects to report encounters with symbols or deities that have not been part of their process of enculturation,” the pioneering psychiatrist William Richards writes in Sacred Knowledge, his book on psychedelic research. Midwestern atheists report seeing visions of Islamic architecture; Baptist priests hear Sanskrit liturgies in their ears. “When you get into the symbolic, archetypal realm… good agnostics are seeing images of the Christ,” he told me.

This goes some way to justifying the theory that our religious impulses may be born of these fungi, rather than simply activated by them. But such a conclusion, for the theologically inclined, would be revolutionary. What if our revelations — our relics, temples, and testaments — came not from God, but from an evolutionary dance with fungi? Can God still be said to exist if we accept that as true?

- More Here

Monday, August 18, 2025

Origin Of The Word 'Dog' Remains A Mystery

 Centuries ago, dogs were more commonly called "hounds" — a term derived from the Old English word "hund." Today, "hound" typically refers to a specific breed of dog, but back then, it referred to all domestic canines, according to Gorrie.

Early forms of the word "dog" did appear in land charters and place names over a millennium ago. But most notably, during the Middle English period from roughly 1100 to 1450, "dog" was often used as an insult directed at people.

" The use of terms for dog to insult people are pretty common historically and across cultures and we see it all over the place," Gorrie said. "So not just in the history of English but in related languages of Europe and Asia."

Over time, the positive emotions people felt toward the four-legged creature eclipsed some of the word's negative, derogatory charge, he said. Around the 1500s, "dog" replaced "hound" as the standard term we use for the pet today.

"It's very possible that the same word that you use as an insult, you can repurpose as a term of affection," Gorrie said. " Almost as if they're reclaiming that word or using it ironically to show just how strong the affection is."

Since "dog" became ubiquitous, it has continued to broaden in meaning. According to Gorrie, the term was used to describe an ugly woman in the 1930s, while in the 1950s, it came to mean a sexually aggressive man. Today, it is used widely as slang for a close friend.

Theories behind the origin of "dog"

While the evolution of "dog" is fairly clear, the mystery lies in its origins.

One theory among linguists is that "dog" comes from the Old English word "dox," which was a term used to describe color, according to Gorrie. "It's not entirely clear what it meant, but it probably meant something like dark or golden or yellow," he said.

Another possibility is that it's related to the Old English word "dugan," which meant to be good, of use or strong, Gorrie added.

Part of the difficulty in tracing the origin of the word "dog," he said, is that dogs have been part of human life for a very long time. That's also true for common words such as "boy" and "she," as well as animal-related ones like "pig" and "hog."

"  There are theories about some of them," Gorrie said. "But dog is the one that's the real mystery."

- More Here


Sunday, August 17, 2025

Wisdom Of John Adams

I must study politics and war, that our sons may have liberty to study mathematics and philosophy. Our sons ought to study mathematics and philosophy, geography, natural history and naval architecture, navigation, commerce and agriculture in order to give their children a right to study painting, poetry, music, architecture, statuary, tapestry and porcelain.

- John Adams via Noah

And what is happening now?

Sapiens on one end "prepare" perpetually for an imaginary war that might happen anytime, and on the other end constantly "amuse themselves to death".

Meanwhile, Sapiens have survived to date thanks to the dedicated quest for knowledge and truth pursued by a few noble beings.


Saturday, August 16, 2025

The “We Evolved to Eat Meat” Argument Doesn’t Hold Up

Moreover, while eating meat in general requires smaller digestive tracts, the validity of this association is limited. At roughly 400 to 600 calories and 10 to 20 grams of protein per 100 grams, nuts and seeds are low-volume, high-nutritional-density foods for which small stomachs suffice. Top them off with peanuts and some honey, and you can do well as a plump, small-stomach, obligatory plant-eater.

Want more numerical specificity? Suppose we explore replacing modern beef with various plant-based alternatives, and calculate how much of each plant food we’d need to fully match the calories and protein in the beef. For each plant, we get two answers — one for replacing all the energy, and another for all the protein. From these two answers, we use the more exacting one, the one that requires more mass.

To test whether plants can match the nutritional value of meat, I compared 59 plant foods to beef. Of those, seven — almonds, kidney beans, peanuts, pistachios, chickpeas, lentils, and soy — require replacement masses that are in fact smaller than that of the beef they replace, on average requiring 800 grams to replace a kilogram of beef. Six more — barley, hazelnuts, oats, walnuts, buckwheat, and spelt — require only slightly higher masses (20 percent higher on average) than the beef mass they replace for the exact full energy–protein replacement.

The notion that our diet requires meat thus confronts roaring headwinds. While Pleistocene forebears of these nuts and legumes differed markedly from their modern counterparts, the message is clear: If more than 2 in 10 plant items are just as energy- and protein-dense as game meat, early plant-eating hominins could have invested relatively modest efforts in gathering plant-based diets with no less protein and energy, and no more bulk than large game eating would have provided, with none of the serious risks inherent in big-game hunting. This is why I find this argument unpersuasive.

- More Here
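For anyone who wants to play with the replacement-mass rule described above, here is a minimal sketch in Python. The nutrition figures are rough, assumed values (something like raw 80/20 ground beef and a few common plant foods), not the author's 59-item dataset; the only point is the "take the more exacting of the two answers" logic.

```python
# Minimal sketch of the replacement-mass rule: how many grams of a plant food
# are needed to match BOTH the calories and the protein in 1 kg of beef.
# All nutrition figures below are rough, assumed values, not the author's data.

BEEF = {"kcal_per_100g": 254.0, "protein_g_per_100g": 17.2}  # assumed: raw 80/20 ground beef

PLANTS = {
    "peanuts":     {"kcal_per_100g": 567.0, "protein_g_per_100g": 25.8},
    "dry lentils": {"kcal_per_100g": 352.0, "protein_g_per_100g": 24.6},
    "oats":        {"kcal_per_100g": 389.0, "protein_g_per_100g": 16.9},
}

def replacement_mass_g(plant, beef_mass_g=1000.0):
    """Grams of the plant needed to fully replace both the energy and the
    protein in `beef_mass_g` of beef; the larger (more exacting) answer wins."""
    beef_kcal = BEEF["kcal_per_100g"] * beef_mass_g / 100.0
    beef_protein = BEEF["protein_g_per_100g"] * beef_mass_g / 100.0
    mass_for_energy = 100.0 * beef_kcal / plant["kcal_per_100g"]
    mass_for_protein = 100.0 * beef_protein / plant["protein_g_per_100g"]
    return max(mass_for_energy, mass_for_protein)

for name, plant in PLANTS.items():
    print(f"{name}: ~{replacement_mass_g(plant):.0f} g to replace 1 kg of beef")
```

With these assumed figures, peanuts and lentils come in under a kilogram while oats land just above it, roughly in line with the groupings the author describes.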


Monday, August 11, 2025

Happy Birthday Garph!

I was telling Garph today how he used to sit next to Max for hours. Those moments were immensely soothing to Max and me.

A few years ago, my brothers' and sisters' kids were home and they were arguing for hours. Topic - Why is Garph the best cat in the world?

Let me be clear - the topic wasn't Is Garph the best cat in the world? They all agreed he is the best cat in the world; they were arguing about the causal reason behind it :-)

Freaking, I am so so blessed my dear to have you in my life. 

Happy 8th Birthday Sweet Heart! I love you

Garph next to Neo!


Friday, August 8, 2025

This Is Me!

And just last month I started reading saved articles within a day, or else they go into the abyss.

There is so much to know and learn; after Max, I understood that I cannot do it all in my lifetime.

So if an article looks like it will teach me something new, I read it immediately or at least the same day; if not, I trust my instinct (since I didn't read it immediately) and delete it.

Probably, Oliver had a similar insight:

If you’re stuck in a rut, and you feel like you’ve stopped making progress on things that matter, it could be that you need more immediacy in your life.

To explain what I mean, I suppose I’ll have to tell you about the other day when I deleted, or threw in the recycling, about 300 articles I’d saved to read later; roughly 70 web pages I’d bookmarked; a three-inch-high stack of supposedly vitally important printouts; plus more task lists and old project plans than I care to think about.

All gone – and at time of writing, I haven’t regretted it for a moment, because it worked.

[--]

The most obvious problem here, of course, is that you far less frequently get around to actually reading or watching – and thus letting yourself be changed by – the ideas you encounter. But the other problem is that it generates a huge backlog to slog through – so that even if you do get around to reading or watching, you’re no longer responding from the place of aliveness and excitement that first drew you in, but from a duller sense of obligation to clear the backlog, extract the important bits, and move on to something else.


Wednesday, August 6, 2025

Trade, Trees, and Lives

This paper shows a cascading mechanism through which international trade-induced deforestation results in a decline of health outcomes in cities distant from where trade activities occur. We examine Brazil, which has ramped up agricultural export over the last two decades to meet rising global demand. Using a shift-share research design, we first show that export shocks cause substantial local agricultural expansion and a virtual one-for-one decline in forest cover. 

We then construct a dynamic area-of-effect model that predicts where atmospheric changes should be felt – due to loss of forests that would otherwise serve to filter out and absorb air pollutants as they travel – downwind of the deforestation areas. Leveraging quasi-random variation in these atmospheric connections, we establish a causal link between deforestation upstream and subsequent rises in air pollution and premature deaths downstream, with the mortality effects predominantly driven by cardiovascular and respiratory causes. 

Our estimates reveal a large telecoupled health externality of trade deforestation: over 700,000 premature deaths in Brazil over the past two decades. This equates to $0.18 loss in statistical life value per $1 agricultural exports over the study period.

- Full Paper Here
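As a back-of-the-envelope illustration of how a headline ratio like "$0.18 of statistical life value lost per $1 of exports" is assembled, here is a tiny sketch. The value-of-statistical-life and cumulative-export figures are placeholder assumptions chosen only so the arithmetic lands near the abstract's number; they are not the paper's actual inputs.

```python
# Illustrative accounting identity only; the VSL and export totals below are
# placeholder assumptions, not values taken from the paper.

premature_deaths = 700_000          # from the abstract: deaths over roughly two decades
value_of_statistical_life = 5.0e5   # assumed VSL in USD (placeholder)
cumulative_ag_exports = 2.0e12      # assumed cumulative agricultural exports in USD (placeholder)

loss_per_export_dollar = premature_deaths * value_of_statistical_life / cumulative_ag_exports
print(f"Statistical life value lost per $1 of exports: ${loss_per_export_dollar:.2f}")
# With these placeholders: 700,000 * 5e5 / 2e12 = 0.175, i.e. about $0.18 per $1.
```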


Climate Change - Entire Country Of Tuvalu To Migrate To Australia!

Tuvalu is preparing to carry out the first planned migration of an entire country in response to the effects of climate change. Recent studies project that much of its territory could be submerged in the next 25 years due to rising sea levels, forcing its inhabitants to consider migration as an urgent survival measure.

This island nation in Oceania is made up of nine coral islands and atolls inhabited by just over 11,000 people. The country’s average altitude is just 2 meters above sea level, making it extremely vulnerable to rising oceans, flooding, and storm surges, all exacerbated by the climate crisis.

The visas will be allocated through a ballot system and will grant beneficiaries the same health, education, housing, and employment rights enjoyed by Australian citizens. In addition, Tuvaluans will retain the ability to return to their home country if conditions permit.

- More Here


Sunday, August 3, 2025

Very Good Sentences

This is not to say that human design has no place in nature. But it does mean that our models — rooted in symmetry, hierarchy, and predictability — are often a poor fit for systems that thrive on variation and response. The more we learn from ecology, the more we see that strength often lies not in perfect order, but in the capacity to bend, absorb, and shift. Nature’s designs are not clean, but they work — and they last.

[--]

To see beauty in nature is easy when it fits our expectations — when a flower is symmetrical, a bird unblemished, a landscape orderly and undisturbed. But much of the natural world does not conform to these standards.

This kind of beauty isn’t immediate. It asks for more attention, and a willingness to look past surface regularity. In biology, what may seem imperfect often reveals a hidden logic — structures shaped by use, behavior, or necessity rather than by visual appeal. A limpet’s uneven shell tells of wave exposure; the patchiness of a savanna shows where animals have grazed or fire has passed through. The landscape holds memory, but it does not preserve it cleanly.

Learning to recognize this kind of beauty means shifting our sense of value. It means seeing that irregular forms often tell us more about how life works than polished ones do. The complexity, resilience, and history embedded in these structures is not ornamental — it is essential. And when we begin to see that, the natural world becomes less like a picture and more like a living record.

- More Here


Saturday, August 2, 2025

This Cannot Be True... Well, I Thought I Was A Pessimist...

A frozen food company surveyed Americans to find out their favourite and least favourite vegetables but discovered something shocking — a staggering one quarter of respondents had never even eaten one.

The survey of 2000 adults, conducted by research firm OnePoll in May on behalf of Dr. Praeger’s, found that even among those who do eat them, only one third (36 per cent) of their meals actually included a vegetable.

Seventy-two per cent of respondents said they wanted to eat more vegetables and 67 per cent said they felt guilty when they failed to eat vegies with their meal. So what’s holding them back?

One in four Americans said both that vegetables were simply “too expensive” and that “they always rot before I get a chance to eat them”, 24 per cent said they didn’t have convenient access to buying them, 22 per cent said they took too much time to prepare and 19 per cent said they didn’t know how to cook them properly.

“Most of us already know they should be eating more vegetables,” Dr. Praeger’s chief executive Larry Praeger said in a statement. “While more and more people are adopting plant-based diets, there’s still a long way to go toward reaching recommended consumption levels.”

The survey was conducted to promote the company’s Veggie Tracker website.

- More Here