Tuesday, December 17, 2024

What Is Entropy? A Measure of Just How Little We Really Know

But despite its fundamental importance, entropy is perhaps the most divisive concept in physics. “Entropy has always been a problem,” Lloyd told me. The confusion stems in part from the way the term gets tossed and twisted between disciplines — it has similar but distinct meanings in everything from physics to information theory to ecology. But it’s also because truly wrapping one’s head around entropy requires taking some deeply uncomfortable philosophical leaps.

As physicists have worked to unite seemingly disparate fields over the past century, they have cast entropy in a new light — turning the microscope back on the seer and shifting the notion of disorder to one of ignorance. Entropy is seen not as a property intrinsic to a system but as one that’s relative to an observer who interacts with that system. This modern view illuminates the deep link between information and energy, which is now helping to usher in a mini-industrial revolution on the smallest of scales.

Two hundred years after the seeds of entropy were first sown, what’s emerging is a conception of this quantity that’s more opportunistic than nihilistic. The conceptual evolution is upending the old way of thinking, not just about entropy, but about the purpose of science and our role in the universe.

[---]

Notions of entropy developed in disparate contexts thus fit together neatly. A rise in entropy corresponds to a loss of information about microscopic details. In statistical mechanics, for instance, as particles in a box get mixed up and we lose track of their positions and momenta, the “Gibbs entropy” increases. In quantum mechanics, as particles become entangled with their environment, thus scrambling their quantum state, the “von Neumann entropy” rises. And as matter falls into a black hole and information about it gets lost to the outside world, the “Bekenstein-Hawking entropy” goes up.
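These three entropies share one mathematical template. In standard textbook notation (the symbols below are conventional, not drawn from the article):

```latex
% Gibbs entropy over microstates with probabilities p_i:
S_{\text{Gibbs}} = -k_B \sum_i p_i \ln p_i

% von Neumann entropy of a quantum state with density matrix \rho:
S_{\text{vN}} = -\mathrm{Tr}(\rho \ln \rho)

% Bekenstein-Hawking entropy, proportional to the horizon area A:
S_{\text{BH}} = \frac{k_B c^3 A}{4 G \hbar}
```

The first two are formally identical once the probabilities are read as the eigenvalues of the density matrix; the third ties the same quantity to the geometry of a black hole's event horizon.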

What entropy consistently measures is ignorance: a lack of knowledge about the motion of particles, the next digit in a string of code, or the exact state of a quantum system. “Despite the fact that entropies were introduced with different motivations, today we can link all of them to the notion of uncertainty,” said Renato Renner, a physicist at the Swiss Federal Institute of Technology Zurich.
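The link between entropy and uncertainty is easiest to see in Shannon's information-theoretic version, which shares its form with the Gibbs entropy. A minimal sketch (the function name and example distributions are illustrative, not from the article):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution:
    H = -sum(p * log2(p)). Zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: maximal uncertainty over two outcomes.
print(shannon_entropy([0.5, 0.5]))    # 1.0 bit

# A heavily biased coin: we are nearly sure of the result, so low entropy.
print(shannon_entropy([0.99, 0.01]))

# A certain outcome: no ignorance at all.
print(shannon_entropy([1.0]))         # 0.0 bits
```

Maximal entropy means maximal ignorance about the outcome; zero entropy means the observer already knows it, which is exactly the observer-relative reading the article describes.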

However, this unified understanding of entropy raises a troubling concern: Whose ignorance are we talking about?

[---]

In September 2024, a few hundred researchers gathered in Palaiseau, France, to pay homage to Carnot on the 200th anniversary of his book. Participants from across the sciences discussed how entropy features in each of their research areas, from solar cells to black holes. At the welcome address, a director of the French National Center for Scientific Research apologized to Carnot on behalf of her country for overlooking the impact of his work. Later that night, the researchers gathered in a decadent golden dining room to listen to a symphony composed by Carnot’s father and performed by a quartet that included one of the composer’s distant descendants.

Carnot’s reverberating insight emerged from an attempt to exert ultimate control over the clockwork world, the holy grail of the Age of Reason. But as the concept of entropy diffused throughout the natural sciences, its purpose shifted. The refined view of entropy is one that sheds the false dreams of total efficiency and perfect prediction and instead concedes the irreducible uncertainty in the world. “To some extent, we’re moving away from enlightenment in a number of directions,” Rovelli said — away from determinism and absolutism and toward uncertainty and subjectivity.

Like it or not, we are slaves of the second law; we can’t help but compel the universe toward its fate of supreme disorder. But our refined view on entropy allows for a more positive outlook. The trend toward messiness is what powers all our machines. While the decay of useful energy does limit our abilities, sometimes a new perspective can reveal a reservoir of order hidden in the chaos. Furthermore, a disordered cosmos is one that’s increasingly filled with possibility. We cannot circumvent uncertainty, but we can learn to manage it — and maybe even embrace it. After all, ignorance is what motivates us to seek knowledge and construct stories about our experience. Entropy, in other words, is what makes us human.

You can bemoan the inescapable collapse of order, or you can embrace uncertainty as an opportunity to learn, to sense and deduce, to make better choices, and to capitalize on the motive power of you. 

- More Here


