c, States, Wavefunctions, Entropy, and the Arrow Without Time

A rigorous map for advanced students: causal structure, reduced states, unitary evolution, coarse entropy, and the disciplined use of the reality quotient.


Thesis

It isn’t that time flows; closedness fails. The “arrow” you feel is statistical: coarse-grained entropy in a reduced state tends to rise as boundaries lose hold and information integrates.

1) Causal scaffold: what c actually fixes

The speed of light c sets light cones and hence which influences are admissible. A "state" is best defined on a spacelike slice (one leaf of a foliation). Any update, inference, or interaction must respect causality: no superluminal signalling; influences propagate within light cones (≤ c).

2) States, microstates, and entities (indices first)

  • State s (the ellipse): boundary + labels + constraints (a hack-closed rulebook).
  • Microstate ω: complete snapshot of everything inside s, all at once, under those labels (spatial/combinatorial, not a timeline).
  • Entity X: a labeled degree of freedom (aperture) inside s.
  • Distribution rho_X(s): the predictor’s beliefs over the microstates of s, as seen through the aperture X.

3) Wavefunctions: unitary vs. readout

For an isolated state, the wavefunction evolves unitarily and its von Neumann entropy stays constant. Entropy rises when you reduce (trace out the environment), measure, or coarse-grain: exactly the operations that define practical, hack-closed states.

S( rho ) = -Tr rho ln rho

For a subsystem A, rho_A = Tr_B rho. Correlations spread only within causal cones, so the reduced entropy of A can increase as entanglement and mixing reach it (≤ c).
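
The reduction step can be made concrete. A minimal NumPy sketch (the two-qubit Bell state is an illustrative assumption, not from the text): the global pure state has zero von Neumann entropy, while tracing out B leaves subsystem A maximally mixed, with entropy ln 2.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr rho ln rho, computed from eigenvalues (0 ln 0 := 0)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Bell state |phi+> = (|00> + |11>)/sqrt(2): a pure, isolated global state.
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi)                       # full density matrix

# Reduced state rho_A = Tr_B rho: contract the B indices.
rho_A = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

S_full = von_neumann_entropy(rho)       # ~0: fine-grained entropy constant
S_reduced = von_neumann_entropy(rho_A)  # ln 2: entropy created by reduction
```

The point of the sketch: nothing "happened" to the global state; the entropy appeared purely because we restricted attention to a subsystem.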

4) Entropy: where it lives and why the arrow appears

Entropy is a number on the distribution over microstates for a fixed state s:

H_X(s) = - sum_omega p(omega) ln p(omega)

Fine-grained (fully labeled, isolated) entropy is constant; coarse-grained entropy typically drifts upward because we blur labels, ignore couplings, and allow weak exchange. That drift—not a fluid called time—is the felt arrow.
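
One way to see the drift numerically is to model mixing as doubly stochastic updates on the distribution over microstates; Shannon entropy is non-decreasing under such maps. A minimal NumPy sketch (the eight microstates and the pairwise-averaging update are illustrative assumptions):

```python
import numpy as np

def shannon_entropy(p):
    """H = - sum p ln p over the nonzero probabilities."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(0)

# Start with a sharply known microstate (coarse entropy zero).
p = np.zeros(8)
p[0] = 1.0

# Model weak exchange / blurred labels as repeated doubly stochastic
# steps: average the probabilities of a random pair of microstates.
entropies = [shannon_entropy(p)]
for _ in range(50):
    i, j = rng.choice(8, size=2, replace=False)
    p[i] = p[j] = 0.5 * (p[i] + p[j])   # a doubly stochastic (mixing) update
    entropies.append(shannon_entropy(p))
# H never decreases along this sequence; it drifts toward ln 8 (uniform).
```

Each averaging step is doubly stochastic, so the distribution becomes less concentrated and H can only stay flat or rise: the "arrow" as a monotone drift, with no time variable anywhere in the update rule.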

5) Absolute information bounds that include c

Bekenstein bound (energy E, radius R):

S ≤ 2π k E R / (ħ c)

Black-hole entropy (area law; l_P^2 = ħ G / c^3):

S_BH = k A / (4 l_P^2)

These tie entropy to energy, size, and fundamental constants—c limits how much “arrangement capacity” a finite state can admit.
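
Both bounds are easy to evaluate. A sketch using assumed CODATA values for the constants; the 1 kg / 1 m example is hypothetical, chosen only to show the scale of the "arrangement capacity":

```python
import math

# CODATA values in SI units (assumed, not from the text).
k_B = 1.380649e-23      # J/K
hbar = 1.054571817e-34  # J s
c = 2.99792458e8        # m/s
G = 6.67430e-11         # m^3 kg^-1 s^-2

def bekenstein_bound(E, R):
    """Max entropy (J/K) for energy E (J) confined within radius R (m)."""
    return 2 * math.pi * k_B * E * R / (hbar * c)

def bh_entropy(A):
    """Black-hole entropy for horizon area A (m^2), with l_P^2 = hbar G / c^3."""
    l_P2 = hbar * G / c**3
    return k_B * A / (4 * l_P2)

# Hypothetical example: 1 kg of rest energy (E = m c^2) in a 1 m sphere.
S_max = bekenstein_bound(E=1.0 * c**2, R=1.0)
bits_max = S_max / (k_B * math.log(2))   # capacity in bits, ~1e43
```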

6) Speed limits: how fast reconfiguration can happen

Quantum speed limits constrain the minimal time to evolve between distinguishable states. The two standard bounds are Mandelstam–Tamm (via the energy spread ΔE) and Margolus–Levitin (via the mean energy E above the ground state):

τ ≥ πħ / (2 ΔE)
τ ≥ πħ / (2 E)

In relativistic settings, feasible energy densities and causal propagation (≤ c) further bound the rate at which reduced subsystems can mix, entangle, and thus increase coarse entropy.
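
The operative limit is the tighter (larger) of the two bounds. A small sketch, with an assumed CODATA value of ħ and a hypothetical 1 eV example:

```python
import math

hbar = 1.054571817e-34  # J s (assumed CODATA value)
eV = 1.602176634e-19    # J per electron-volt

def min_evolution_time(delta_E, E_mean):
    """Shortest time to reach an orthogonal state: the tighter (larger)
    of the Mandelstam-Tamm and Margolus-Levitin bounds."""
    t_mt = math.pi * hbar / (2 * delta_E)   # energy-spread bound
    t_ml = math.pi * hbar / (2 * E_mean)    # mean-energy bound
    return max(t_mt, t_ml)

# Hypothetical system with delta_E = E_mean = 1 eV: tau ~ 1e-15 s,
# a hard floor on how fast any 1 eV-scale reconfiguration can complete.
tau = min_evolution_time(delta_E=1 * eV, E_mean=1 * eV)
```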

7) Collapse, locality, and “no drama”

Operationally, “collapse” is sample + update. A realized microstate omega* yields an observable via a map:

A_X(s) = g_X(omega*)

Knowledge updates elsewhere respect light cones; there is no superluminal signal. Observable consequences propagate ≤ c.
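
The sample + update reading can be sketched directly. The four microstates, the distribution, and the readout map g_X below are hypothetical placeholders, chosen only to show the two operations:

```python
import numpy as np

rng = np.random.default_rng(42)

# State s: four labeled microstates and the predictor's distribution rho_X(s).
microstates = ["w0", "w1", "w2", "w3"]
p = np.array([0.5, 0.25, 0.15, 0.10])

# Readout map g_X: a local scalar extracted from a realized microstate
# (hypothetical mapping, for illustration only).
g_X = {"w0": 0.0, "w1": 1.0, "w2": 2.0, "w3": 3.0}

# "Collapse" = sample a microstate, then update beliefs to the conditional.
omega_star = rng.choice(microstates, p=p)   # sample
A_X = g_X[omega_star]                       # observable A_X(s) = g_X(omega*)
p_post = np.array([1.0 if m == omega_star else 0.0 for m in microstates])
```

Nothing superluminal happens here: the update is a change in the predictor's distribution, and any observable consequence still propagates ≤ c.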

8) Reality quotient placed carefully

Reports are per-entity, per-state:

R_X(s) = A_X(s) / |E_X(s)|
E_X(s) = P_X(s) + i C_X(s)
S_X(s) = ln R_X(s)

A is a local scalar from the realized microstate; |E| = sqrt(P^2 + C^2) summarizes predictor concentration and idea magnitude for the same (X,s). Because information and interaction reach X only within causal cones, the rate at which R (and S) can change is indirectly bounded by c and available energy/resources.
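
A minimal numerical sketch of the quotient, with hypothetical values for A, P, and C (the numbers carry no physical meaning; they only exercise the definitions):

```python
import math

def reality_quotient(A, P, C):
    """R_X(s) = A_X(s) / |E_X(s)| with E_X(s) = P_X(s) + i C_X(s)."""
    E = complex(P, C)          # E = P + iC
    R = A / abs(E)             # |E| = sqrt(P^2 + C^2)
    return R, math.log(R)      # (R, S = ln R), defined for R > 0

# Hypothetical per-entity, per-state values: |E| = 5, so R = 0.4.
R, S = reality_quotient(A=2.0, P=3.0, C=4.0)
```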

9) Classroom instantiation (Money Lab)

  • Fix the state s_ML: rubber-band hull + rules (participants, legal trades, pricing).
  • A microstate omega is the entire ledger at once; an entity X is one labeled degree of freedom.
  • Operational “time”: count unbiased reconfiguration steps inside fixed s. The distribution over microstates mixes; coarse H(s) tends not to decrease (fluctuations allowed). If you move the band or rules, you created a new state; reset the clock.
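
A toy simulation of the lab, with assumed rules (four participants, eight indivisible tokens, nearest-neighbor transfers as the one legal trade): holding s_ML fixed and counting reconfiguration steps as operational "time", the coarse distribution over one participant's balance spreads out, so its Shannon entropy rises.

```python
import math
import random
from collections import Counter

random.seed(7)
N_RUNS = 2000

def run_ledger(steps):
    """One run of the fixed state s_ML; returns a coarse observable."""
    ledger = [8, 0, 0, 0]                     # identical initial microstate
    for _ in range(steps):
        i = random.randrange(4)               # unbiased reconfiguration attempt
        if ledger[i] > 0:
            j = random.choice([(i - 1) % 4, (i + 1) % 4])
            ledger[i] -= 1                    # legal trade: pass one token
            ledger[j] += 1
    return ledger[0]                          # coarse view: one balance

def coarse_entropy(samples):
    """Shannon entropy of the empirical distribution of the observable."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

H_early = coarse_entropy([run_ledger(1) for _ in range(N_RUNS)])
H_late = coarse_entropy([run_ledger(30) for _ in range(N_RUNS)])
# More unbiased steps inside the same fixed s => higher coarse entropy.
```

Changing the rules mid-run would create a new state; in the code that corresponds to editing `run_ledger`, after which the two entropies are no longer comparable.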

10) Red lines and quick repairs

  • Do not assign entropy to “outside” or to the center; entropy lives on a specified state with specified labels.
  • Entity ≠ microstate. Entity = window; microstate = everything (inside s) all at once.
  • Arrow ≠ law; arrow = boundary condition + coarse-graining + mixing (causally bounded by ≤ c).
  • “Collapse” = sample + update; mythos is welcome in prose, not in the math.

Board-ready lines

Microstate = everything, all at once; entity = a window.
Entropy lives on distributions.
c fixes the causal pace; the arrow appears because closedness fails.

Author: John Rector
