Thursday, October 31, 2024

Animals’ Understanding of Death Can Teach Us About Our Own

But this is precisely what understanding death essentially means: grasping that a dead individual can no longer do what they could when they were alive.

Some scientists who study animals’ relation to death might disagree with this conclusion. Understanding death, they might argue, implies comprehending the absolute finality of it, its inevitability, its unpredictability, and the fact that it will affect everyone, including oneself. These scientists would be in the grip of what I have termed intellectual anthropocentrism: the assumption that the only way of understanding death is the human way, that animals either have a concept of death equivalent to the average adult human’s—or none at all.

But that couldn’t be farther from the truth. Intellectual anthropocentrism is a bias that affects comparative thanatology, the study of how animals deal with and understand death. The way to extirpate this bias is by realizing that the concept of death is not an all-or-nothing matter, but rather a spectrum—something that comes in degrees. So when we study whether animals can understand death, we should not start from the hypercomplex human concept, but rather from what I call the minimal concept of death. Understanding death in minimal terms means grasping that dead individuals don’t do the sorts of things that living beings of their kind typically do, and that this is an irreversible state. And this is precisely what the chimpanzees’ behavior suggests they had understood.

There is another bias that also affects comparative thanatology: what I have termed emotional anthropocentrism. This is the idea that animals’ reactions to death are only worthy of our attention when they appear human-like. Afflicted by this bias, comparative thanatologists have been looking for manifestations of grief in animals, exemplified by the story of Tahlequah, the orca who carried her dead baby for 17 days and over 1000 miles, or Segasira, the gorilla who attempted to suckle from his dead mother’s breast despite already having been weaned. Don’t get me wrong: animal grief is a real and an important phenomenon that we should absolutely be paying attention to. However, if we’re only looking for mourning behavior in animals, we may be missing most of the picture.

Think back to the chimps. They clearly weren’t mourning the albino baby’s death. Instead, their behavior seemed dominated by an attitude of curiosity. But this did not detract from their understanding of what had happened. Grief does not signal a special or deep understanding of death. What it signals instead is the existence of a strong social bond between the mourner and the deceased.

But there are many ways of emotionally reacting to the realization that someone died that don’t involve grieving. You might react with joy, if, for instance, it means you’re inheriting a large sum of money. You might instead react with anger, if the deceased owed you money that you’re now never going to get back. You might react with excitement or hunger, if, say, your flight crashed in the Andes and there was no more food around. Or you might be totally indifferent, if you didn’t know the person or they meant nothing to you. Of course, all of these reactions are taboo in our societies, and we wouldn’t publicly admit to having them. But this doesn’t mean that they’re not possible. And crucially: they wouldn’t mean that you haven’t properly understood what happened. The polar bear who finally manages to catch a seal might understand death just as well as the heartbroken monkey mother who hangs on to her baby’s corpse, even though the former thinks of it as a gain rather than a loss.

The biases of emotional anthropocentrism and intellectual anthropocentrism have prevented us from seeing that there are many more ways of reacting to death than what is considered politically correct in our societies. In fact, the concept of death, instead of being a complex intellectual achievement within the sole reach of the most cognitively sophisticated species, is actually quite easy to acquire and linked to abilities that are crucial for survival. If we manage to extirpate these two biases, we will see that the concept of death, far from being a uniquely human trait, is widespread in the animal kingdom and more diverse than we will ever know.

- More Here


Wednesday, October 30, 2024

My Beloved Monster by Caleb Carr

What a beautiful life this is, the one we have. Now imagine that beauty amplified many times over, and in many more dimensions, when we share this bond with a non-human animal.

Bloody, I am so damn lucky to have had this in this lifetime with Max. I am so damn lucky!

Now Max is no longer present and my time is ticking; he left, as did Neo, Fluffy, and Garph before him, and this year Saroo and Blue as well.

Well, I am not the only one who is lucky; Caleb Carr's final book before he passed away is about his relationship with his cat Masha, My Beloved Monster: Masha, the Half-wild Rescue Cat Who Rescued Me (review here).

The beauty of Carr's account is that, for the first time, someone writes about this bond in an inverse way: he is not anthropomorphizing Masha; rather, he thinks Masha was anthropomorphizing in reverse.

In this exquisite book novelist Caleb Carr tells the story of the “shared existence” he enjoyed for 17 years with his beloved cat, Masha. At the time of writing she is gone, he is going, and all that remains is to explain how they made each other’s difficult lives bearable. The result is not just a lyrical double biography of man and cat but a wider philosophical inquiry into our moral failures towards a species which, cute internet memes notwithstanding, continues to get a raw deal.

Carr explains how Masha picked him as her person when he first visited the animal rescue centre nearly 20 years ago. She was a Siberian forest cat – huge, nearer to her wild self than most domestic moggies, and utterly delightful, a long-bodied streak of red-gold whose forward-facing eyes gave her the look of a delighted baby. The rescue centre staff are desperate that Carr take her, and equally anxious that he should understand what he is getting into. This cat, apparently, fights, bites and is unbothered about seeming grateful. But then, why should she be? Abandoned by her previous owners, she was locked in an apartment and left to die. It is an obscenity, says Carr, that goes on more often than we can bear to imagine.

Once Carr gets Masha – a name he hopes sounds vaguely Siberian – home to his farmhouse on Misery Mountain in upstate New York, she starts to show her true “wilding” nature. Mice and voles are taken down with industrial efficiency. She even sees off a bear, dispatching it with a bloody nose. The only creature that gets the better of her is a wicked kind of weasel native to the area called a “fisher” which bites off her luscious tail and leaves her less nimble for the closing part of her life.

With Carr, though, Masha shows a different side. She is not a lap cat in any sense, but something better, an actively attentive partner. When Carr is racked with pain from his chronic neuropathy, Masha bores her broad Siberian forehead into his clenched body to release the agony. Or she sits by his head for hours at a time, looking anxiously for signs that the discomfort might be easing. “What will cynics call this,” Carr asks rhetorically, “if they will not call it love?” In return he makes her mixtapes of her favourite music, mostly Wagner. And, to help her through the August moon, a time when all cats in the American north-east long to stay outdoors all night, he sets up a halfway house for them on the porch with blankets, camping lights and a television, so that they can get through the high summer madness safely together.

There had been plenty of previous cats in Carr’s life, a succession of spirit animals who accompanied him as he grew up in a household that sounds frankly feral. His father was best friends with Kerouac, Ginsberg and Burroughs. Lucien Carr, who was prodigiously clever, madly violent and free with his fists, battered his middle son into profound anti-sociability. Caleb explains how he has spent most of his adult life dealing with these accumulated wounds – a fractured body with a cats’ cradle of internal adhesions, and an inability to hang on to a romantic relationship for more than a few months. Masha is the salve for this lifetime of self-loathing: “how I lived, what I chose to do, my very nature – all were good enough for her.”

The question of anthropomorphism inevitably raises its head. Carr tetchily denies it, maintaining that everything wondrous about Masha – her emotional receptivity, careful social etiquette, even her tactical stealing of visitors’ socks – can be explained as intentional either by the growing academic literature on animal consciousness or the close observations of her clever vets. Altogether more plausible is his suggestion that it is Masha who is doing a kind of anthropomorphism in reverse, ascribing traits of her own species to Carr in order to make his behaviour comprehensible.

By the end, though, it barely matters. Carr has become so enmeshed with Masha that it is getting hard to tell them apart. When she is diagnosed with terminal lymphoma you know that it will not be long before he follows. And, indeed, in May this year Caleb Carr died of cancer at the age of 68. He has left behind a beautiful book, one of the finest meditations on animal companionship that I have ever read.


Friday, October 25, 2024

Practicing Deeper Gratitude via Geology

Yet most of us Anthropocene Earthlings are barely aware of the rich legacy of natural history that envelops us. We are like squatters living amid the remains of earlier empires, worlds defined by different geographies, governed by alternate rules, inhabited by other residents. Consider the variety of ancient realms represented by a few North American cities: Milwaukee lies on a teeming coral reef; Minneapolis is perched on the edge of a vast volcanic rift; Montreal and New York City rise from the roots of great mountain belts; San Francisco sits, unsteadily, on rocks churned in an ancient subduction zone; Mexico City, also precarious, on the bed of a vanished lake. All vividly remember other versions of Earth.

We self-absorbed humans, meanwhile, mostly ignore the stories that lie just beneath our feet, believing them to be irrelevant, subordinate to our own reality, reducible to the convenient cubbyhole of “prehistory.” If we bothered to notice it, Earth’s crinkled crust would reveal how the past not only persists but in fact shapes the present. The rock record would show us that earlier iterations of the world are no less real for having occurred before we happened onto the scene. Rocks would remind us that we too live in geologic time, that our own moment will one day be long ago.

When we do pay attention to rocks, it is usually because of their utility, not their long memory. Although we humans like to think that we are in charge of our own destiny, the technological ages of humankind—Stone, Bronze, Iron, Fossil Fuel, Nuclear, and yes, even Digital—have always been dictated by rocks, and our own short-lived empires have risen and fallen in the pursuit of their riches. The science of geology has of course been entangled with all of this looting, but along the way, as geologists hammered at rocks, they began to understand that each gold vein and coal seam was part of a grand illuminated manuscript recording the history of the world. In an ironic twist, geology’s richest discovery is arguably an intellectual and philosophical one—an understanding of Deep Time.

Like all creatures on Earth, we need to use the planetary materials at hand to make a living, and our species has been exceptionally clever at appropriating those materials for our own purposes. As our technological prowess has grown, however, our respect for these works of time has declined. We rarely pause to consider that rocks and minerals have their own life stories—that they are emissaries bearing messages from across eons.

[---]

It occurred to me later that that house on a Croatian hillside embodies the way we in the modern world thoughtlessly scavenge the monuments of the geologic past. Our cities are just scaled-up versions. Every asphalt roadway and concrete structure contains fragments, often still readable, from the chronicles of prior geologic regimes, irreverently blended and reconstituted. The metals in our cars, phones, and computers, having been separated from their source rocks, are more akin to individual letters in a shredded manuscript, but still whisper of their deep geologic origins. All the coal, oil, and natural gas we’ve burned—the photosynthetic memories of earlier ecosystems—hovers now in the air, the ghosts of combustion that haunt us in the Anthropocene.

Does it matter whether or not we acknowledge the histories of nonliving components of nature? Not every rock in Earth’s time-wrinkled crust can be treated as a precious artefact, and we Earthlings have no option other than to use what the planet provides. But simply developing an awareness of the deep history that enfolds us—and the immense amount of time embodied in the planet’s generous gifts—can foster a perceptual shift with radical psychological and practical implications.

[---]

Becoming familiar with Earth’s monumental autobiography, and getting to know the characters and plots that fill the vast expanses of geologic time, forces one to abandon the notion that the only stories that matter are those with human protagonists. Once free from that deep-seated prejudice, one begins to see even the nonliving components of the Earth, including rocks and rivers, atmosphere and ocean, not as dumb matter but rather as part of a dynamic, animate, evolving—and self-documenting—collective.

Although there is a tendency to think of biological evolution as a steady march of progress, with primitive organisms being systematically replaced by more sophisticated ones, the fact is that when new lineages of organisms have emerged, they have simply joined the other branches on the Tree of Life. Bacteria and archaea, the progenitors of all subsequent life-forms, are still very much with us, as are myriad other microorganisms and invertebrates, as well as fish, amphibians, and reptiles, all living together in the wide crown of the tree with Johnny-come-lately species like us.

Similarly, old rocks are not merely relics of the distant past but active participants in current events and ecosystems, thanks to the way that Earth has crumpled rocks of all ages into the crust. Rocks young and old take part with equal vigor in present-day earthquakes. Strata that formed as desert dunes a hundred million years ago lap up today’s rain, having found new careers in middle age as aquifers. Schists that remember the dawn of Life carry on intimate discourse with modern microbes and root systems, on their way to becoming soils of the future. Basalts that were erupted as lavas eons before Homo sapiens appeared now attempt valiantly to absorb our carbon emissions. In other words, even the oldest rocks are responsive to new conditions, taking note of changes in the air, interacting in real time with the present.

This view of Earth’s rocky crust as dynamic and reactive—an ancient archive with comments to make about the current state of the world—suggests that we need a radical reappraisal of the way we live, farm, and build infrastructure on it. Even over human timescales, rocks and landscapes are not static but instead inherently mutable, and their capacity for shape-shifting will only increase in the face of anthropogenic changes in the surface environment. Yet most training for designers and engineers is still predicated on the view of rocky matter as timeless and inert. This reflects certain aesthetic preferences that were adopted early on in the history of Western science.

[---]

There is a new and powerful generation of engineers and entrepreneurs who believe that humans can simply opt out of time—not appreciating the irony that their worldview is in fact an antiquated misconception from a bygone era. These are the moguls who think that “colonizing” Mars is not only possible but in fact right and inevitable, who advocate for stratospheric sulfate injection as an instant solution to the long-brewing climate crisis, and who actually seem convinced that they personally will be exempted from dying. At the same time, in another irony, obsolescence is a constant threat in their world, where someone who has been in the business for two decades is regarded as an ancient sage. All of these delusions are symptoms of temporal dysmorphia, or more bluntly, time illiteracy.

This condition is harmful to any human but particularly pathological when it afflicts the rich and influential. Spewing sulfate into the upper atmosphere to correct for a century’s worth of fossil fuel burning is considered by geoscientists to be madness, sure to have a torrent of unintended consequences not only for global weather patterns but also geopolitical stability. Time illiteracy seduces otherwise intelligent people into believing a planet with no soil, oceans, or active tectonics could become an Earthlike Eden in a matter of decades—somehow overlooking the fact that even if we could homestead on a new planet, we would still be us: the same flawed creatures expelled from the first Eden.

[---]

The spiritual solace we crave may lie in the records of deep time that are our common heritage as Earthlings. The rocky archives have been patiently awaiting our notice. In them, we may find reassurance in the persistence of earlier worlds all around us; a sense of wonder at how extraordinary their preservation is; gratitude for the way they permeate the present with mystery, gravitas, and the promise of continuity. A spirit of evolutionary camaraderie may come from the knowledge that we have shared the arduous journey to the present with so many other long-lived lineages and have kin everywhere in nature. Accepting that we too live in geologic time can free us from narcissism. Letting go of the illusion that only the present is real, allowing the undulations of time to wash over us, may carry us with less fear into the future.

- More Here


Monday, October 21, 2024

How Learning Can Guide Evolution

Well, this paper gave me quite a boost! An old paper from 1987, but a gem!

Abstract. 

The assumption that acquired characteristics are not inherited is often taken to imply that the adaptations that an organism learns during its lifetime cannot guide the course of evolution. This inference is incorrect (Baldwin, 1896). Learning alters the shape of the search space in which evolution operates and thereby provides good evolutionary paths towards sets of co-adapted alleles. We demonstrate that this effect allows learning organisms to evolve much faster than their nonlearning equivalents, even though the characteristics acquired by the phenotype are not communicated to the genotype.

Discussion

The most common argument in favor of learning is that some aspects of the environment are unpredictable, so it is positively advantageous to leave some decisions to learning rather than specifying them genetically (e.g. Harley, 1981). This argument is clearly correct and is one good reason for having a learning mechanism, but it is different from the Baldwin effect which applies to complex co-adaptations to predictable aspects of the environment.

To keep the argument simple, we started by assuming that learning was simply a random search through possible switch settings. When there is a single good combination and all other combinations are equally bad a random search is a reasonable strategy, but for most learning tasks there is more structure than this and the learning process should make use of the structure to home in on good switch configurations. More sophisticated learning procedures could be used in these cases (e.g. Rumelhart, Hinton, and Williams, 1986). Indeed, using a hillclimbing procedure as an inner loop to guide a genetic search can be very effective (Brady, 1985). As Holland (1975) has shown, genetic search is particularly good at obtaining evidence about what confers fitness from widely separated points in the search space. Hillclimbing, on the other hand, is good at local, myopic optimization. When the two techniques are combined, they often perform much better than either technique alone (Ackley, 1987). Thus, using a more sophisticated learning procedure only strengthens the argument for the importance of the Baldwin effect.

For simplicity, we assumed that the learning operates on exactly the same variables as the genetic search. This is not necessary for the argument. Each gene could influence the probabilities of large numbers of potential connections and the learning would still improve the evolutionary path for the genetic search. In this more general case, any Lamarckian attempt to inherit acquired characteristics would run into a severe computational difficulty: To know how to change the genotype in order to generate the acquired characteristics of the phenotype it is necessary to invert the forward function that maps from genotypes, via the processes of development and learning, to adapted phenotypes. This is generally a very complicated, non-linear, stochastic function and so it is very hard to compute how to change the genes to achieve desired changes in the phenotypes even when these desired changes are known.

We have focused on the interaction between evolution and learning, but the same combinatorial argument can be applied to the interaction between evolution and development. Instead of directly specifying the phenotype, the genes could specify the ingredients of an adaptive process and leave it to this process to achieve the required end result. An interesting model of this kind of adaptive process is described by Von der Malsburg and Willshaw (1977). Waddington (1942) suggested this type of mechanism to account for the inheritance of acquired characteristics within a Darwinian framework. There is selective pressure for genes which facilitate the development of certain useful characteristics in response to the environment. In the limit, the developmental process becomes canalized: The same characteristic will tend to develop regardless of the environmental factors that originally controlled it. Environmental control of the process is supplanted by internal genetic control. Thus, we have a mechanism which as evolution progresses allows some aspects of the phenotype that were initially specified indirectly via an adaptive process to become more directly specified.

Our simulation supports the arguments of Baldwin and Waddington, and demonstrates that adaptive processes within the organism can be very effective in guiding evolution. The main limitation of the Baldwin effect is that it is only effective in spaces that would be hard to search without an adaptive process to restructure the space. The example we used in which there is a single spike of added fitness is clearly an extreme case, and it is difficult to assess the shape that real evolutionary search spaces would have if there were no adaptive processes to restructure them. It may be possible to throw some light on this issue by using computer simulations to explore the shape of the evolutionary search space for simple neural networks that do not learn, but such simulations always contain so many simplifying assumptions that it is hard to assess their biological relevance. We therefore conclude with a disjunction: For biologists who believe that evolutionary search spaces contain nice hills (even without the restructuring caused by adaptive processes) the Baldwin effect is of little interest, but for biologists who are suspicious of the assertion that the natural search spaces are so nicely structured, the Baldwin effect is an important mechanism that allows adaptive processes within the organism to greatly improve the space in which it evolves.
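The toy model the authors describe—learning as a random search through possible "switch settings," with a single spike of fitness for the one good configuration—is simple enough to sketch in a few lines of Python. This is only a rough illustration of the idea, not the paper's actual code; the population size, trial count, and allele proportions below are my own illustrative assumptions:

```python
import random

L = 20        # number of "switches" (genes); the one good setting is all 1s
POP = 200     # population size (illustrative, not the paper's value)
TRIALS = 100  # learning trials per lifetime (illustrative)

def lifetime_fitness(genome, rng):
    """Learning = random search over the plastic (None) alleles.

    The earlier the good configuration is found, the more of the lifetime
    remains to exploit it, so fitness decreases with trials used.
    """
    if 0 in genome:
        return 1.0                        # a fixed wrong allele: learning can never succeed
    p_hit = 0.5 ** genome.count(None)     # chance one random guess sets every plastic allele right
    for t in range(TRIALS):
        if rng.random() < p_hit:
            return 1.0 + (L - 1) * (TRIALS - t) / TRIALS
    return 1.0                            # never found: same fitness as a hopeless genome

def evolve(generations, seed=0):
    """Fitness-proportional selection with single-point crossover (no mutation)."""
    rng = random.Random(seed)
    # Initial alleles: 1 (correct), 0 (wrong), None (plastic), in roughly 1:1:2 proportion.
    pop = [[rng.choices([1, 0, None], weights=[1, 1, 2])[0] for _ in range(L)]
           for _ in range(POP)]
    for _ in range(generations):
        fits = [lifetime_fitness(g, rng) for g in pop]
        parents = rng.choices(pop, weights=fits, k=2 * POP)
        nxt = []
        for a, b in zip(parents[::2], parents[1::2]):
            cut = rng.randrange(1, L)     # same cut point for both halves
            nxt.append(a[:cut] + b[cut:])
        pop = nxt
    return pop
```

The key point the sketch makes concrete: a genome with many plastic alleles and no wrong ones sits on a smooth slope (partial credit via faster learning), whereas without learning the landscape is a needle in a haystack—any genome short of all 1s scores exactly 1.0.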


 

Sunday, October 20, 2024

Ed Yong talks about his seven rules for being a public figure in a time of crisis

I have been reading Ed Yong since he started writing a blog on Discover, a long, long time ago, when I guess he was just out of college.

I love his humility, a rare trait in this world.



Wednesday, October 16, 2024

The Hidden World of Electrostatic Ecology

The magic of animal electrostatics is all about size. Large animals don’t meaningfully experience nature’s static — we’re too big to feel it. “As humans, we are living mostly in a gravitational or fluid-dynamics world,” Ortega-Jiménez said. But for tiny beings, gravity is an afterthought. Insects can feel air’s viscosity. While the same laws of physics reign over Earth’s smallest and largest species, the balance of forces shifts with size. Intermolecular forces flex beneath the feet of water striders on a pond, capillary forces shoot water impossibly upward through a plant’s thin roots, and electrostatic forces can ensnare any oppositely charged flecks that lie in their path.

[---]

If static charges aid pollination, they could shift plant evolution, too. “Maybe some fundamental features of flowers are actually just in service of generating the correct electrostatic field,” Dornhaus said, “and because we can’t see them, we’ve ignored that whole dimension of a flower’s life.” The idea isn’t so far-fetched: In 2021, Robert’s team observed petunias releasing more compounds that attract bugs around beelike electric fields. This suggests that flowers wait until a pollinator is nearby to actively lure them closer, Robert said.

“Humans are very visually oriented, so we tend to emphasize flowers that are showy and large,” Dornhaus said. But we already know that flowers transmit strong invisible signals, like scents or ultraviolet patterns. “It may well be that for some flowers, the electric field is actually a more prominent signal to bees than color is.”

However, evolutionary details surrounding electrostatic ecology remain murky at best. “It’s amazing, really, how little we know,” said Wainwright, the insect evolutionary ecologist. Even within better-understood visual and acoustic systems, ecologists are only beginning to connect evolutionary dots.

Because electrostatics has flown under the radar, England worries that humans unknowingly hinder the ability of animals to use these forces. “We’re spitting electrostatic stuff into the environment all the time,” he said. Electronic devices, appliances, power lines, fertilizers and even clothing bear static charges. “If [insects are] sensitive to the wingbeat of a wasp, they’re probably sensitive to a power line, and it might be messing up that entire system.”

- More Here


Tuesday, October 15, 2024

How To Compose A Successful Critical Commentary

  • You should attempt to re-express your target’s position so clearly, vividly, and fairly that your target says, “Thanks, I wish I’d thought of putting it that way.”
  • You should list any points of agreement (especially if they are not matters of general or widespread agreement).
  • You should mention anything you have learned from your target.
  • Only then are you permitted to say so much as a word of rebuttal or criticism.

- Daniel Dennett, attributing the rules to Anatol Rapoport (via here)


Sunday, October 13, 2024

Teenage Boys & Ayn Rand

The Russian terrorist Vera Figner, a leader of the group that assassinated Tsar Alexander II in 1881, recalled how she lost respect for her father when he replied to a serious question: “I do not know.” This answer filled the child with “burning shame.” All important questions, Figner knew, have clear answers, and all reasonable people accept them.

Figner didn’t weigh pros and cons. No sooner did she hear some indubitably correct answer than she adopted it. Regardless of counterevidence, she never questioned a belief, just as one never doubts a mathematical proof. Figner was by no means unusual. This way of thinking—this certainty about being absolutely certain—characterized both the prerevolutionary Russian radical intelligentsia and, after the Bolshevik coup, official Soviet thought.

Born and raised in Petersburg, Alisa Rosenbaum—better known as Ayn Rand—shared this mentality. Though Jewish, her thought was Russian to the core. Rand’s fiction closely resembles Soviet socialist realism except for preaching the opposite politics. Call it capitalist realism. In the most perceptive article on Rand I have encountered, Anthony Daniels claimed, without much exaggeration, that “her work properly belongs to the history of Russian, not American, literature.”

[---]

When I became a scholar of Russian literature, I immediately recognized Rand’s debt to the Russian radical intelligentsia. One can divide prerevolutionary Russian thought into two strongly opposed traditions: that of the radicals and that of Tolstoy, Dostoevsky, Chekhov, and other great writers. The radical tradition featured ideologues and revolutionaries, including devoted terrorists like Figner and Sergey Nechaev; anarchists Mikhail Bakunin and Peter Kropotkin; populists Pyotr Lavrov, Nikolai Mikhailovsky, and the leaders of the Socialist Revolutionary Party; Marxists like Stalin, Trotsky, and Lenin; and a host of tendentious literary critics.

[---]

Rand’s reasoning was, in fact, remarkably sloppy. She regarded Aristotle as the greatest philosopher because he formulated the law of identity “A is A,” from which she claimed to derive a proof of radical individualism and the morality of selfishness. If A is A, then Man is Man and I am I, which (for Rand) means that I owe nothing to anyone else. If Man is Man, then man must choose to survive, which means that it is irrational to let oneself be “looted” by unproductive people. If one is to survive, she reasoned, one’s ultimate value must be one’s own life. “The fact that a living entity is determines what it ought to do. So much for Hume’s question of how to derive ‘ought’ from ‘is.’” It is hard to say which is worse, Rand’s failure to understand the positions she dismissed or the shoddy logic she deployed in the name of infallible “reason.”

[---]

Rand utterly rejected the idea that some issues are ambiguous or call for compromise. “One of the most eloquent symptoms of the moral bankruptcy of today’s culture,” she declared, “is a certain fashionable attitude toward moral issues, best summarized as: ‘There are no blacks and whites, there are only grays.’ . . . Just as, in epistemology, the cult of uncertainty is a revolt against reason—so, in ethics, the cult of moral grayness is a revolt against moral values. Both are a revolt against the absolutism of reality.”

Middle-of-the-road thinking is for Rand “the typical product of philosophical default—of the intellectual bankruptcy that has produced irrationalism in epistemology, a moral vacuum in ethics, and a mixed economy in politics. . . . Extremism has become a synonym of ‘evil.’”

Is it any surprise that Rand strongly appealed to bright teenage boys? As comic book writer John Rogers remarked, “There are two novels that can change a bookish fourteen-year old’s life: The Lord of the Rings and Atlas Shrugged. One is a childish fantasy that often engenders a lifelong obsession with its unbelievable heroes, leading to an emotionally stunted, socially crippled adulthood, unable to deal with the real world. The other, of course, involves orcs.”

[---]

I could not help recalling this famous phrase, which Rand must have known, when encountering her defense of essentially the same position. “Why must fiction represent things ‘as they might be and ought to be’?” she asked. Quoting from Atlas Shrugged—Rand had an annoying habit of quoting herself or her fictional heroes as authorities—she answered: “As man is a being of self-made wealth, so he is a being of self-made soul.” In other words, she concluded, “Art is the technology of the soul.”

Both Rand and the Soviets believed that, without the aid of supernatural power, humanity will accomplish what had always been regarded as miraculous. There are no fortresses Bolsheviks cannot storm, declared Stalin, while Rand attributed the same power to unfettered capitalism. Enlightened by the right philosophy, human will can accomplish anything.

[---]

Children require sacrifice: this obvious fact indicates that people are not, and can never be, fully independent. Rand’s heroes and heroines apparently arrive at adulthood without having gone through childhood, let alone infancy. It is as if she believed that, like Athena springing fully grown from Zeus’s head, people are created by sudden flashes of insight. My point is not just that infants are utterly dependent on another person and that children only gradually learn to take care of themselves. It is also that no one chooses when, where, and to whom to be born. Unlike Howard Roark, people always inherit something they did not choose.

Rand wrote as if poverty always resulted from failure of willpower, as if no one is born into it. She was right to reject the deterministic view that people are wholly the product of heredity and environment; choices that cannot be wholly explained by such factors help make us who we are. But it is no less mistaken to treat people as entirely self-made and utterly responsible for their condition.

- More Here

I was one of those "teenage boys" — although I was around 22 when I read Rand for the first time and got hooked.

Then, as time passed, I grew up and understood how deluded I was. There are some very good self-reliance messages in her books — a kind of "pep" talk. I kept those few hundred words and flushed everything else out of my life.

Ayn Rand offers a "certain" view of the world that runs completely against reality. Reality is uncertain, and to top it off, our biology doesn't even allow us to perceive much of it.

The answer is to be humble, keep learning, and change your mind as our understanding of reality changes.


Wednesday, October 9, 2024

The Tibet Myth

There is a famous Zen saying, attributed to the master Linji: "If you meet the Buddha, kill him."

It is a wise warning about organized religion and idol worship.

We all know that organized religions, western and eastern alike, are riddled with "God Men" who promote self-interest. Buddhism is no different.

In my life, I have learnt a ton from Buddha, and that is reflected in this blog.

Organized religion is a bane. From Friendly Feudalism: The Tibet Myth:

A reading of Tibet’s history suggests a somewhat different picture. “Religious conflict was commonplace in old Tibet,” writes one western Buddhist practitioner. “History belies the Shangri-La image of Tibetan lamas and their followers living together in mutual tolerance and nonviolent goodwill. Indeed, the situation was quite different. Old Tibet was much more like Europe during the religious wars of the Counterreformation.” [6] In the thirteenth century, Emperor Kublai Khan created the first Grand Lama, who was to preside over all the other lamas as might a pope over his bishops. Several centuries later, the Emperor of China sent an army into Tibet to support the Grand Lama, an ambitious 25-year-old man, who then gave himself the title of Dalai (Ocean) Lama, ruler of all Tibet.

His two previous lama “incarnations” were then retroactively recognized as his predecessors, thereby transforming the 1st Dalai Lama into the 3rd Dalai Lama. This 1st (or 3rd) Dalai Lama seized monasteries that did not belong to his sect, and is believed to have destroyed Buddhist writings that conflicted with his claim to divinity. The Dalai Lama who succeeded him pursued a sybaritic life, enjoying many mistresses, partying with friends, and acting in other ways deemed unfitting for an incarnate deity. For these transgressions he was murdered by his priests. Within 170 years, despite their recognized divine status, five Dalai Lamas were killed by their high priests or other courtiers. 

For hundreds of years competing Tibetan Buddhist sects engaged in bitterly violent clashes and summary executions. In 1660, the 5th Dalai Lama was faced with a rebellion in Tsang province, the stronghold of the rival Kagyu sect with its high lama known as the Karmapa. The 5th Dalai Lama called for harsh retribution against the rebels, directing the Mongol army to obliterate the male and female lines, and the offspring too “like eggs smashed against rocks… In short, annihilate any traces of them, even their names.”

[---]

Young Tibetan boys were regularly taken from their peasant families and brought into the monasteries to be trained as monks. Once there, they were bonded for life. Tashì-Tsering, a monk, reports that it was common for peasant children to be sexually mistreated in the monasteries. He himself was a victim of repeated rape, beginning at age nine. The monastic estates also conscripted children for lifelong servitude as domestics, dance performers, and soldiers.

In old Tibet there were small numbers of farmers who subsisted as a kind of free peasantry, and perhaps an additional 10,000 people who composed the “middle-class” families of merchants, shopkeepers, and small traders. Thousands of others were beggars. There also were slaves, usually domestic servants, who owned nothing. Their offspring were born into slavery. The majority of the rural population were serfs. Treated little better than slaves, the serfs went without schooling or medical care. They were under a lifetime bond to work the lord’s land — or the monastery’s land — without pay, to repair the lord’s houses, transport his crops, and collect his firewood. They were also expected to provide carrying animals and transportation on demand. Their masters told them what crops to grow and what animals to raise. They could not get married without the consent of their lord or lama. And they might easily be separated from their families should their owners lease them out to work in a distant location.

As in a free labor system and unlike slavery, the overlords had no responsibility for the serf’s maintenance and no direct interest in his or her survival as an expensive piece of property. The serfs had to support themselves. Yet as in a slave system, they were bound to their masters, guaranteeing a fixed and permanent workforce that could neither organize nor strike nor freely depart as might laborers in a market context. The overlords had the best of both worlds.

One 22-year old woman, herself a runaway serf, reports: “Pretty serf girls were usually taken by the owner as house servants and used as he wished”; they “were just slaves without rights.” Serfs needed permission to go anywhere. Landowners had legal authority to capture those who tried to flee. One 24-year old runaway welcomed the Chinese intervention as a “liberation.” He testified that under serfdom he was subjected to incessant toil, hunger, and cold. After his third failed escape, he was mercilessly beaten by the landlord’s men until blood poured from his nose and mouth. They then poured alcohol and caustic soda on his wounds to increase the pain, he claimed.

The serfs were taxed upon getting married, taxed for the birth of each child and for every death in the family. They were taxed for planting a tree in their yard and for keeping animals. They were taxed for religious festivals and for public dancing and drumming, for being sent to prison and upon being released. Those who could not find work were taxed for being unemployed, and if they traveled to another village in search of work, they paid a passage tax. When people could not pay, the monasteries lent them money at 20 to 50 percent interest. Some debts were handed down from father to son to grandson. Debtors who could not meet their obligations risked being cast into slavery. 

[---]

The Tibetan serfs were something more than superstitious victims, blind to their own oppression. As we have seen, some ran away; others openly resisted, sometimes suffering dire consequences. In feudal Tibet, torture and mutilation — including eye gouging, the pulling out of tongues, hamstringing, and amputation — were favored punishments inflicted upon thieves, and runaway or resistant serfs. 

Journeying through Tibet in the 1960s, Stuart and Roma Gelder interviewed a former serf, Tsereh Wang Tuei, who had stolen two sheep belonging to a monastery. For this he had both his eyes gouged out and his hand mutilated beyond use. He explains that he no longer is a Buddhist: “When a holy lama told them to blind me I thought there was no good in religion.” Since it was against Buddhist teachings to take human life, some offenders were severely lashed and then “left to God” in the freezing night to die. “The parallels between Tibet and medieval Europe are striking,” concludes Tom Grunfeld in his book on Tibet. 

Let's ignore "isms" and organized religions. I cannot promise you that the world will be a better place, but I can promise that we will have gotten rid of some of its major badness.

Sunday, October 6, 2024

What Has Travel Ever Done for Me?

Why should I feel this way about travel? What has it ever done to me? Travel is one of those things one generally doesn’t attack in polite company, the world of letters excepted. Its wholesomeness is assumed. It broadens the mind. It makes us empathetic and, by rewarding our curiosity, encourages it to develop further. It teaches people the just-right amount of relativism—the amount that makes them easygoing in company, perhaps usefully pliable in exigencies, but not nihilistic. Only a fool or a misanthrope would criticize travel.

[---]

Given travel’s salutary reputation, it is no wonder that I am biased against the whole topic. A writer is someone who resents being told that something is good for him, and that this is therefore why he must do it. It’s no wonder, either, if such people repeatedly fling themselves against this broad, smiling enemy, hoping to smite it.

[---]

Similarly, the well-worn complaint that travel banalizes places—that, if too many people start to go somewhere, the place reconfigures itself in order to please the almighty tourist’s gaze—doesn’t take the absolute otherness of human beings seriously enough. For example, in “Consider the Lobster,” David Foster Wallace writes that tourism is good for the soul, not because it broadens tourists, but precisely because it constricts them, in a painful yet educational way:

To be a mass tourist, for me, is to become a pure late-date American: alien, ignorant, greedy for something you cannot ever have, disappointed in a way you can never admit. It is to spoil, by way of sheer ontology, the very unspoiledness you are there to experience. It is to impose yourself on places that in all noneconomic ways would be better, realer, without you. It is, in lines and gridlock and transaction after transaction, to confront a dimension of yourself that is as inescapable as it is painful: As a tourist, you become economically significant but existentially loathsome, an insect on a dead thing.

[---]

In fact, you can treat this performance as information in its own right. Before every place was spoiled, assuming that there was such a time, we could go observe what we took to be the unselfconscious manners and ancient customs of the people there. Today, we can observe self-conscious manners and generic customs—each with its own little flutters of imperfection and telling gaps in performance. And these, again, are information. You can learn as much about people from thinking about the way they act themselves out for you as you can from analyzing their less artful, less premeditated moments. So I agree with Wallace that there is no “unspoiledness” to experience, but the curious and attentive mind can do just fine with spoiledness. Wallace certainly did, in several of his classic essays.

[---]

We love, as well, to mock the privileged Westerners who go somewhere far away and realize one or two momentous, banal things about themselves, especially if these same people then have the temerity to make art about their epiphanies. Consider, to name two much-discussed examples, Elizabeth Gilbert in Eat, Pray, Love, and Alanis Morissette in “Thank U,” that song in which she thanks India. It happens that I, too, dislike that book, and that song. But to have an epiphany in Italy or India is no sillier than to have one in the woods or at work or on a walk around one’s neighborhood. Abroad, one is surrounded by billions of strangers who presumably have better things to do than serve as one’s backdrop, but that is also true at home, or even in the woods. (Look at all those trees! Do you, solipsistic walker, even know their species names?) Yet we dare to have interior lives anyway.

[---]

Agnes Callard criticizes tourism as pointless “locomotion.” (She does so, tellingly, only after distinguishing tourism from several more benign forms of faraway-place-going.) “The single most important fact about tourism is this: We already know what we will be like when we return,” she writes. This is a hell of an assumption. I don’t really know what I will be like next week, at least not in every important detail. To judge by her other writing, Callard is also, and not infrequently, a surprise to herself; her ability to describe these moments in fine, perhaps unintentionally comic detail provides her work with much of the insight and entertainment value it possesses.

In disconnecting us from the ongoing and sometimes nightmarish dailiness of our lives, travel allows us to “do nothing and be nobody.” For Callard, this makes it a preview of death, the nothingness that will put an end to our quotidian boredom forever. “Socrates said that philosophy is a preparation for death,” she concludes. “For everyone else, there’s travel.” This is funny because, like many of Nietzsche’s witticisms, it is a melodramatic overstatement of something that is, perhaps, five percent true. When we disrupt our routines, we do not do nothing, or become no one; we do different things, we try on other selves. This is why we frequently come back from even rather silly jaunts, pace Callard, a bit different.

[---]

So the antitravel position, broadly conceived, doesn’t seem to work. Yet I feel a sour satisfaction, as I have said, whenever someone decides to take travel down a peg. Partly, this is because the cases for travel are often sillier than the cases against, and I think it’s important to question them. If, for example, travel broadens the mind, why are at least some of the best-traveled people the worst blockheads one has ever met? If travel increases tolerance, why did it not have exactly that effect on so many of history’s conquerors—monomaniacs who could not let stand any place that failed to give back their own image?

- More Here


Friday, October 4, 2024

VICT3R Project: What Are the Goals of Virtual Rabbits?

Animals used in laboratories are often treated as mere objects, enduring painful procedures like burns, poisoning, food deprivation, and skin, eye, and ear lacerations—all in the name of human safety. While many argue that these tests are necessary for ensuring product safety, ethical alternatives exist, and they should be explored. That’s where the Spanish university’s virtual rabbit initiative comes in.

The primary goal of the VICT3R project is to significantly reduce the number of animals used in safety testing for drugs and other chemicals by replacing them with computer-generated virtual models. This represents a crucial milestone in the quest for ethical and sustainable scientific research. If successful, the project could prove that virtual models can yield reliable scientific results without harming living creatures.

Scientific advancements have provided more humane (and scientifically rigorous) alternatives to animal testing, such as computer simulations and human tissue models. These methods can offer effective results without harming living creatures like rabbits. The VICT3R project introduces additional key objectives:

  • Reducing Animal Use: The European VICT3R project aims to reduce the total number of animals used in experiments by up to 25%. This could lead to fewer animals being subjected to tests for medicine and chemical safety.
  • Data Reuse and Sharing: The project promotes reusing and sharing data and applying new data science techniques to further implement the 3Rs—reduce, refine, and replace—in preclinical animal experimentation.
  • Generative AI for Synthetic Animals: In cases where historical data on certain species or conditions is unavailable, generative AI could create fully synthetic virtual animals to fill the gaps.
  • Expansion to Other Studies: The aspiration of the VICT3R project is to extend this concept of virtual control groups to other toxicological and pharmacological studies, both in academic and industrial settings, further reducing reliance on animal testing.

- More Here


Wednesday, October 2, 2024

Imagination vs. Creativity

I like to make a distinction between imagination and creativity that you may or may not agree with. Imagination is the ability to see known possibilities as being reachable from a situation. Creativity is the ability to manufacture new possibilities out of a situation. The two form a continuous spectrum of regimes in simple cases, but are disconnected in complex cases.

[---]

Imagination is an aptitude based on analysis, and is a variety of reasoning forwards from a current state marked by freedom from habituated patterns of seeing. Creativity is an aptitude based on synthesis, and is a variety of reasoning backwards from desired outcomes marked by closing of realizability gaps. To some extent, the two behaviors exist on the same continuous spectrum, and in most situations we alternate between forwards and backwards reasoning modes. But in complex situations, there is also a discontinuity between the two modes, which is the same as the general discontinuity and qualitative difference that separates analysis from synthesis.

Forward and backward are not symmetric. Synthesis, since it works backwards from a desired state, is strictly more expressive, since it can start from desired states that are not realizable or reachable from the current state using known techniques and patterns of behavior. It can also fail in more ways, since it might attempt impossibilities.

A leap — a creative leap — may be required to connect the forward and backward regimes. Sometimes this might just manifest as a textbook technical problem that is easy to solve once you pose it correctly. You could even outsource that to an appropriate sort of technician to actually execute. Craftsmanship and skill are useful for creativity up to the point where you can see the leap that is needed, but once seen, others can often do it. The most creative people in a medium are rarely the master technicians.

I like the definition of genius as “talent hits the target others can’t hit, genius hits the target others can’t see.” Creative genius lies in seeing what others don’t see. But once you’ve actually seen it, you might be able to simply point it out to others to hit. They might even be better at hitting it than you, once you point it out.

At other times creativity might manifest as an “invention gap,” as I’ve taken to calling it, or even a “discovery” gap — uncovering a new principle or phenomenon to harness in nature. A problem that nobody knows how to solve, or a behavior of nature that nobody has noticed, modeled, or figured out how to harness.

[---]

Imagination to some extent is relative to training data. What for you is a leap of imagination may be a straightforward inference for someone who has seen or experienced more cases. A sufficiently trained AI model may produce behaviors indistinguishable from highly imaginative human behaviors.

Creative behaviors require imagination, but also require something more. Imagination is necessary but not sufficient for creativity.

Creative behaviors, I think, call for the equivalent of mutation or noise-injection into an evolutionary process. There is a non sequitur quality to creative leaps that strikes me as fundamentally non-analytical and serendipitous.

- More Here