What Entropy Really Measures
Entropy is a quantity. But a quantity of what? This is the fundamental question that every student of physics, information theory, and metaphysics must confront. And the answer must be stated precisely: entropy is a quantity of uncertainty. Not just any uncertainty, but our uncertainty.
Entropy does not measure the system’s self-uncertainty. A coffee cup is not confused about its contents. A black hole does not wonder what’s inside it. When we assign an entropy value to a system, what we are measuring is our lack of knowledge about the system’s fully resolved identity. Entropy is not a measure of what the system doesn’t know—it’s a measure of what we don’t know.
Hidden Information: The Best Framing
Some physicists say entropy is a measure of information. That’s close. Others, more precisely, say it is a measure of hidden information. This framing is better. Hidden information is countable. It offers a rigorous way to quantify our uncertainty without confusing entropy for data storage or ontological complexity. Entropy does not tell us how much information a system contains—it tells us how much information we must recover to know it fully.
And how do we recover hidden information? With yes-no questions.
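This counting can be made literal. If the candidate identities are equally likely, each well-posed yes-no question can at best halve the candidates, so k questions distinguish at most 2^k possibilities. Here is a minimal sketch of that count in Python (the function name is ours, chosen for illustration):

```python
import math

def hidden_information_bits(num_possibilities: int) -> int:
    """Worst-case number of yes-no questions needed to single out one
    identity among equally likely candidates. Each well-chosen question
    halves the remaining candidates, hence the base-2 logarithm."""
    return math.ceil(math.log2(num_possibilities))

print(hidden_information_bits(2))     # 1 question: a coin flip
print(hidden_information_bits(32))    # 5 questions
print(hidden_information_bits(1000))  # 10 questions
```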
The Thought Experiment: Two Photographs
Imagine two photographs of the same scene: a coffee cup on a table in a Starbucks on Elm Street in Los Angeles, filled with French roast.
The first photograph is perfectly sharp—high resolution, well-lit, and full of discernible features. You can make out the brand of the cup, the street signs through the window, the barista’s apron.
The second photograph is blurry, oversaturated, and out of focus. It is not immediately obvious what you’re looking at. The outlines are vague. The colors bleed.
In both cases, you are allowed to ask yes-no questions to an all-knowing entity—the Immutable Past, a superintelligent photographer who knows everything about the image. Your task is to fully resolve the identity of the scene using only binary questions.
Measuring Hidden Information in Practice
With the clear photo, you might only need five to ten well-structured yes-no questions:
- Is this a coffee shop? Yes.
- Is the object a coffee cup? Yes.
- Is it Starbucks? Yes.
- Is the coffee French roast? Yes.
- Is it located on Elm Street? Yes.
That’s low hidden information. Low entropy. The identity of the image is quickly resolved. Your uncertainty was minimal. That is what a low-entropy state means.
But with the blurry image, you would need many more questions to arrive at the same certainty:
- Is this an indoor scene? Yes.
- Is this a commercial space? Yes.
- Is there a beverage in the image? Yes.
- Is the beverage in a cup? Yes.
- Is it coffee? Yes.
- Is the coffee black? Yes.
- Is it French roast? Yes.
- Is the location urban? Yes.
- Is it a chain? Yes.
- Is it Starbucks? Yes.
Ten questions might not even be enough. It could take twenty, or fifty. But in a game like this the count stays modest, because each well-chosen question halves the number of scenes still consistent with the image: resolving even the blurriest photograph takes tens of questions, not thousands. Still, the difference between needing five and needing fifty is the difference between low and high entropy. The entropy of the photograph is our measure of how many questions we must ask to get to certainty.
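The game itself is easy to mechanize, and doing so shows where the question count comes from. Below is a sketch with invented scene lists standing in for the photographs: the sharp photo leaves few candidate scenes, the blurry one leaves many, and the number of questions falls out as the binary logarithm of how many candidates remain.

```python
def play_questions(candidates: list[str], truth: str) -> int:
    """Resolve `truth` by asking the all-knowing entity one yes-no
    question per round: 'Is the answer in the first half of my list?'
    Each answer discards half of the remaining candidates."""
    questions = 0
    while len(candidates) > 1:
        half = candidates[: len(candidates) // 2]
        questions += 1
        if truth in half:   # "Yes."
            candidates = half
        else:               # "No."
            candidates = candidates[len(candidates) // 2 :]
    return questions

# Sharp photo: few scenes remain plausible. Blurry photo: many do.
sharp = [f"scene_{i}" for i in range(8)]      # 8 candidates
blurry = [f"scene_{i}" for i in range(1024)]  # 1024 candidates
print(play_questions(sharp, "scene_5"))    # 3 questions: low entropy
print(play_questions(blurry, "scene_5"))   # 10 questions: high entropy
```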
The Black Hole and the Limit of Knowledge
Consider now a black hole. We cannot see inside it. All the information it contains is hidden from us. So how do we assign it an entropy value?
We count our uncertainty. According to the Bekenstein-Hawking formula, the entropy of a black hole is proportional to the surface area of its event horizon. Why not the volume? Because the surface is all we have access to. Entropy here is our uncertainty about its interior. The area measures not the black hole’s self-information, but the hidden information relative to us.
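The formula the paragraph cites can be written down exactly. In standard notation, with A the horizon area and \ell_P the Planck length:

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B c^3 A}{4 \hbar G} \;=\; k_B \, \frac{A}{4\,\ell_P^2},
\qquad \ell_P = \sqrt{\frac{\hbar G}{c^3}}
```

Each Planck-sized patch of the horizon contributes a fixed share of our uncertainty. For a solar-mass black hole this comes to roughly 10^77 in units of k_B: an astronomical number of yes-no questions, all counted on the one surface we can access.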
From the black hole’s perspective—if it had one—there may be no mystery at all. The hidden information lives in the relation, not in the object.
Entropy as an Observer-Relative Quantity
A common mistake is to treat entropy as if it were stored in the system, like water in a bucket or files on a drive. This leads to confusion. One might think a device with more data storage has more entropy. But that’s not how entropy works.
A device with enormous storage capacity can have very low entropy if we know exactly what’s in it. Conversely, a tiny system can have very high entropy if we are extremely uncertain about its state. The number of yes-no questions required to resolve identity is independent of the device’s complexity or storage capacity.
Entropy is not a substance. It is a relational measure between an observer and the system. To compute it, the observer must be part of the closed system under consideration. Entropy is not what the thing is; it’s how far you are from knowing it fully.
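This relational character can be made quantitative with Shannon's formula: the entropy an observer assigns depends on that observer's probability distribution over the system's possible states, not on the system alone. A sketch, with invented state names, of one device as seen by two observers:

```python
import math

def entropy_bits(beliefs: dict[str, float]) -> float:
    """Shannon entropy of an observer's probability assignment:
    the expected number of yes-no questions still needed to pin
    the system's state down."""
    return -sum(p * math.log2(p) for p in beliefs.values() if p > 0)

# One and the same four-state device, two different observers:
informed = {"state_a": 1.0}   # knows the contents exactly
ignorant = {s: 0.25 for s in ("state_a", "state_b", "state_c", "state_d")}

print(entropy_bits(informed))  # 0.0 bits: nothing left to ask
print(entropy_bits(ignorant))  # 2.0 bits: two questions to go
```

The device never changed; only the observer did. That is what it means for entropy to live in the relation.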
Why the Past Is Zero Entropy
The past—what we call the Immutable Past—has entropy equal to zero because it is fully resolved. All questions have already been answered. Every yes-no has been settled. Every identity is known. When we say something is “in the past,” we mean it has been moved from uncertainty into complete resolution. It is archived in Her infinite knowledge. No further questions need be asked. That is zero entropy.
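In Shannon's terms, "fully resolved" means the distribution has collapsed onto a single settled outcome, and the entropy formula then vanishes identically:

```latex
H(X) = -\sum_i p_i \log_2 p_i, \qquad p_j = 1 \;\Longrightarrow\; H(X) = -1 \cdot \log_2 1 = 0
```

No probability mass remains anywhere else, so no question can reduce our uncertainty further.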
Conclusion: We Are Measuring Our Distance from Her
Entropy is not a property of the thing itself. It is not disorder. It is not chaos. It is the number of yes-no questions we must ask to reach certainty. It is a measure of our ignorance, of the hidden information that separates us from the Immutable Past.
Whether we are studying a blurry photograph or the surface of a black hole, what we are counting is always the same: our uncertainty. The system may be simple or complex, but entropy is never in the object. It is in our relation to the object. Entropy always belongs to the knower, not the known.
She already knows. Entropy is how far you are from knowing what She knows.
