Started this post in 2020 (IIRC) but never got further than semi-developed notes. Still, if you're intrigued by the second law of thermodynamics, you may enjoy reading my [partial] thoughts on its implications.
Change requires loss, and choice always comes with a price. At the very least, you must pay the opportunity cost — once a decision is made, the other options are closed off; cauterized. It's not a real decision until the flesh sizzles and your blood has only one path.
Imagine, if you will, a body with fingers plunging into every possibility, a body that must be pruned back — snip-snap! — to grow in a certain direction. Otherwise it will spread and spread until the skin splits, until the end and beginning are strangers. The body would become a cacophony of beginnings and ends, none still in congress, all surface area and no core.
Wield the hot sword — Gabriel holds it out to you; the pommel burns in your grasp — so that pieces of yourself fall away. You weep to see them transformed into common meat, no longer electrified by your desires. But now those desires are potently concentrated, coursing through a smaller vessel. Now you have a discernible shape.
This possibility body of yours, it can be something, or nothing, or everything. Nothing and everything are the same option, lavishly costumed as each other, neither living up to its name.
I have a mantra: "There are always tradeoffs." Picking a tradeoff, or a set of them, and committing, is what it means to choose. You could say that no, Pareto improvements are the free lunch we always wanted — but don't forget opportunity cost! You must give up alternate possibilities.
In a previous email I brought up the second law of thermodynamics, claiming that "by far the least confusing way to understand entropy is via statistical mechanics (AKA probability)." I feel bad for linking to that Wikipedia page; it is less comprehensible than I implied. So I will explain how entropy works, and what it is. James Clear's puzzle analogy is a useful starting point:
Imagine that you take a box of puzzle pieces and dump them out on a table. In theory, it is possible for the pieces to fall perfectly into place and create a completed puzzle when you dump them out of the box. But in reality, that never happens.
Why?
Quite simply, because the odds are overwhelmingly against it. Every piece would have to fall in just the right spot to create a completed puzzle. There is only one possible state where every piece is in order, but there are a nearly infinite number of states where the pieces are in disorder. Mathematically speaking, an orderly outcome is incredibly unlikely to happen at random.
Similarly, if you build a sand castle on the beach and return a few days later, it will no longer be there. There is only one combination of sand particles that looks like your sand castle. Meanwhile, there are a nearly infinite number of combinations that don't look like it.
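To put a number on "the odds are overwhelmingly against it," here is a quick back-of-the-envelope sketch (mine, not Clear's). The 100-piece puzzle and the simplification of ignoring piece orientation are my own assumptions for illustration; only one of the possible orderings is the finished picture.

```python
from math import factorial

# Toy model: n distinguishable puzzle pieces laid out in a row, ignoring
# orientation and exact position on the table (my simplifications).
# Exactly one of the n! orderings corresponds to the completed puzzle.
n_pieces = 100
total_arrangements = factorial(n_pieces)   # 100! is roughly 9.3 x 10^157
chance_of_solved = 1 / total_arrangements

print(f"Arrangements of {n_pieces} pieces: {total_arrangements:.3e}")
print(f"Chance of dumping them out already solved: {chance_of_solved:.3e}")
```

Even at a modest 100 pieces, the denominator dwarfs the estimated number of atoms in the observable universe, which is the statistical sense in which "that never happens."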
Clear's description is close, and I appreciate that he makes clear (heh) the statistical aspect of entropy. But disorder versus order is a red herring. Entropy is not about specificity per se — after all, each of the "infinite number of combinations" that doesn't look like a sandcastle is itself highly specific, and the universe is indifferent to which of these is meaningful to humans.
Rather, entropy tracks the inexorable reduction of the number of valid possibilities (in a closed system, absent outside injections of energy). The fewer possibilities, the more entropy. Intuitively, this makes sense: Things keep happening until there are no longer more things that can happen, given the current configuration of said things — because the once-concentrated energy that was "stored" in chemical bonds has all been unlocked and dispersed.
In other words, eventually a system will realize a possibility that reduces the subsequent number of possibilities, and so on, until no available possibility could have this effect. That's the heat death of the universe — maximal energy dispersal, minimal possible futures. Those are the same thing, because a possibility just is locked-up energy present to realize it; if that energy weren't there, the possibility wouldn't be possible.
Thus entropy is an inverse measure of the couldness of a system — the amount and variety of different configurations into which it might permute; the trajectories of change able to occur through space-time. Straightforwardly, this couldness cannot increase without more energy being applied — and there you have the second law. As the possibility space shrinks, entropy increases.
This observation doesn't merely apply to the future; it also describes the number of chemical or atomic microstates that could add up to the present macrostate. "A solid has lower entropy than a gas because [...] the constraints on the positions of the atoms in the solid and limitations on their velocities drastically reduce the number of possible configurations," Harvard physics professor Matthew Schwartz pointed out [PDF link]. Likewise, "A solid has lower entropy than a gas because we have more information about the location of the atoms." In other words, a solid entails a lot of inherent couldness — ways that its constituent parts, down to the atoms, could yet be validly rearranged into something other than what they currently are — whereas a gas entails less.
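As an aside, the configuration counting in Schwartz's quote is what Boltzmann's formula S = k_B ln Ω captures, where Ω is the number of microstates compatible with the macrostate. Here is a toy sketch with invented numbers (the particle count and the accessible-position counts are my assumptions, chosen only to show the direction of the effect):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(ln_omega):
    """S = k_B * ln(Omega); we pass ln(Omega) directly to avoid astronomically large integers."""
    return K_B * ln_omega

# Toy lattice picture, purely illustrative:
# "solid" -> each of N particles is confined to ~2 accessible positions near its own site
# "gas"   -> each of N particles can occupy any of 1,000,000 positions in the box
N = 1_000
ln_omega_solid = N * math.log(2)
ln_omega_gas = N * math.log(1_000_000)

print(f"S_solid ~ {boltzmann_entropy(ln_omega_solid):.2e} J/K")
print(f"S_gas   ~ {boltzmann_entropy(ln_omega_gas):.2e} J/K")
# Fewer accessible configurations -> lower entropy, matching Schwartz's point about solids.
```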
Here specificity comes back. The more specific a form, the greater the number of possibilities other than what it is currently — and on the flip side, the less mysterious its current microstate. By contrast, amorphousness means that fewer possibilities are available to be unlocked, and the precise microstate is harder to pick out from all the other precise microstates that are valid for that substance.
Professor Schwartz again:
One way to connect the information picture to thermodynamics is to say that entropy measures uncertainty. For example, suppose you have gas in a box. In reality, all of the molecules have some velocities and positions (classically). If you knew all of these, there would be only one microstate compatible with it and the entropy would be zero. But the entropy of the gas is not zero. It is nonzero because we don't know the positions and velocities of the molecules, even though we could in principle. So entropy is not a property of the gas itself but of our knowledge of the gas.
In information theory, the Shannon entropy is 0 if the coins are always heads. That is because we know exactly what will come next — a head — so our uncertainty is zero. If the coin is fair and half the time gives heads and half the time tails, then H=1: we are maximally ignorant. We know nothing about what happens next. If the coin is unfair, 90% chance of heads, then we have a pretty good sense of what will happen next, but are still a little uncertain. If we know the data ahead of time (or the sequence of flips), we can write a simple code to compress it: 1 = the data. So there is no ignorance in that case, as with knowing the position of the gas molecules.
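Those coin numbers are easy to reproduce. Below is a minimal sketch of the binary Shannon entropy, H(p) = −p log2(p) − (1 − p) log2(1 − p), evaluated for the three coins in the passage; the function name is mine.

```python
import math

def binary_shannon_entropy(p_heads):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits; 0*log2(0) is taken as 0."""
    h = 0.0
    for p in (p_heads, 1 - p_heads):
        if p > 0:
            h -= p * math.log2(p)
    return h

for p in (1.0, 0.5, 0.9):
    print(f"P(heads) = {p:.1f}  ->  H = {binary_shannon_entropy(p):.3f} bits")

# P(heads) = 1.0  ->  H = 0.000 bits  (no uncertainty: always heads)
# P(heads) = 0.5  ->  H = 1.000 bits  (maximally ignorant: fair coin)
# P(heads) = 0.9  ->  H = 0.469 bits  (a pretty good sense, still a little uncertain)
```

The unfair coin lands at about 0.47 bits: better than total ignorance, short of certainty.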
Schwartz's point here beautifully demonstrates Eliezer Yudkowsky's observation that we are all inside of physics.
Header illustration: Keeping Up With Science by Shari Weisberg, circa 1930s.