# Why complexity rises then falls while entropy only increases
The First Law of Complexodynamics is Scott Aaronson’s exploration of why physical systems exhibit a characteristic pattern: complexity rises, peaks, then falls—even as entropy monotonically increases.
## The Puzzle
Entropy always increases in an isolated system (the Second Law of Thermodynamics):

$$\frac{dS}{dt} \geq 0$$
But complexity behaves differently. A freshly shuffled deck isn’t complex—it’s random. An ordered deck isn’t complex—it’s simple. Complexity peaks somewhere in between.
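A crude way to see why plain description length can't be the right measure: using `zlib` compressed size as a stand-in for Kolmogorov complexity (a common sketch, not anything from the post), the ordered deck scores lowest and the shuffled deck scores *highest*, even though neither is complex. The deck is repeated 200 times here so the compressor's fixed overhead doesn't swamp the signal.

```python
import random
import zlib

def description_length(seq):
    """Crude proxy for Kolmogorov complexity: zlib-compressed size in bytes."""
    return len(zlib.compress(bytes(seq)))

random.seed(0)
# Scale the 52-card deck up so compression statistics are meaningful.
ordered = list(range(52)) * 200
shuffled = ordered[:]
random.shuffle(shuffled)

print(description_length(ordered))   # small: "repeat 0..51, 200 times"
print(description_length(shuffled))  # large: nearly incompressible
```

Compressed size tracks entropy, so it rises monotonically like the Second Law says; a good complexity measure has to do something different.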
## The Coffee Example
Consider cream being poured into coffee:
| Time | State | Entropy | Complexity |
|---|---|---|---|
| t=0 | Separated layers | Low | Low |
| t=mid | Swirling patterns | Medium | High |
| t=∞ | Uniform mixture | High | Low |
The intricate swirls are more “complex” than either extreme.
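This rise-and-fall can be reproduced in a toy simulation, in the spirit of the "coffee automaton" experiments Aaronson describes. The sketch below is an illustrative choice of model, not the post's actual code: cream and coffee cells on a grid mix by random adjacent swaps, and the complexity proxy is the compressed size of a coarse-grained (block-averaged, quantized) snapshot.

```python
import random
import zlib

def apparent_complexity(grid, block=4, levels=2):
    """Complexity proxy: compressed size of a block-averaged, quantized view."""
    n = len(grid)
    coarse = bytearray()
    for bi in range(0, n, block):
        for bj in range(0, n, block):
            avg = sum(grid[i][j]
                      for i in range(bi, bi + block)
                      for j in range(bj, bj + block)) / block**2
            coarse.append(round(avg * levels))  # quantize to levels + 1 values
    return len(zlib.compress(bytes(coarse)))

def mix(grid, sweeps):
    """Randomly swap adjacent cells: a crude stand-in for diffusion."""
    n = len(grid)
    for _ in range(sweeps * n * n):
        i = random.randrange(n)
        j = random.randrange(n - 1)
        if random.random() < 0.5:   # horizontal swap
            grid[i][j], grid[i][j + 1] = grid[i][j + 1], grid[i][j]
        else:                       # vertical swap
            grid[j][i], grid[j + 1][i] = grid[j + 1][i], grid[j][i]

random.seed(0)
n = 32
# t = 0: cream (1) on the left, coffee (0) on the right
grid = [[1 if j < n // 2 else 0 for j in range(n)] for _ in range(n)]
c_start = apparent_complexity(grid)
mix(grid, 50)                       # partial mixing
c_mid = apparent_complexity(grid)
mix(grid, 800)                      # effectively complete mixing
c_end = apparent_complexity(grid)
print(c_start, c_mid, c_end)        # complexity proxy at t = 0, mid, end
```

The separated and fully mixed states both coarse-grain to nearly uniform pictures and compress well; the partially mixed state has structure at the block scale and compresses poorly, so the proxy peaks in the middle.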
## Defining Complexity
Aaronson proposes **complextropy**: the length of the shortest *efficient* program that outputs a set (or distribution) from which the observed state appears random.

For a string $x$:

$$\mathrm{complextropy}(x) = \min_{S} \{\, K(S) : x \in S,\ x \text{ looks random within } S \,\}$$

where $K(S)$ is the Kolmogorov complexity of the set $S$, and "$x$ looks random within $S$" means roughly that $K(x \mid S) \geq \log_2 |S| - c$ for a small constant $c$. (Aaronson's full definition additionally imposes efficiency bounds on the programs that describe and sample from $S$.)
## Interactive Demo

Watch complexity rise and fall as a system evolves.
## Why Complexity Peaks
- At $t=0$: simple description ("all black on left, all white on right")
- At $t=\text{mid}$: complex description (must specify intricate patterns)
- At $t=\infty$: simple description ("random noise" or "uniform distribution")
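The contrast between the two measures can be made concrete with two compression proxies on 1-D snapshots (a sketch; the three synthetic states and the block size are illustrative choices): the fine-grained compressed size tracks entropy and only grows, while the coarse-grained size peaks at the intermediate state.

```python
import random
import zlib

random.seed(1)
n = 4096

# Three snapshots of a 1-D "coffee cup":
separated = [0] * (n // 2) + [1] * (n // 2)        # t = 0: two layers
swirly, state = [], 0                              # t = mid: correlated blobs
for _ in range(n):                                 # (flip with prob 1/64 ->
    if random.random() < 1 / 64:                   #  runs of mean length 64)
        state = 1 - state
    swirly.append(state)
uniform = [random.randrange(2) for _ in range(n)]  # t = inf: random mixture

def fine_grained(s):
    """Entropy proxy: compressed size of the exact microstate."""
    return len(zlib.compress(bytes(s)))

def coarse_grained(s, block=64, levels=2):
    """Complexity proxy: compressed size of a block-averaged, quantized view."""
    q = bytes(round(sum(s[i:i + block]) / block * levels)
              for i in range(0, len(s), block))
    return len(zlib.compress(q))

for name, s in [("separated", separated), ("swirly", swirly), ("uniform", uniform)]:
    print(name, fine_grained(s), coarse_grained(s))
```

Fine-grained sizes come out ordered `separated < swirly < uniform` (entropy rises monotonically), while the coarse-grained size is largest for the swirly middle state (complexity peaks).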
## The Sophistication Connection
Kolmogorov’s **sophistication** formalizes this: the sophistication of a string is the complexity of the simplest set containing it as a “typical” member,

$$\mathrm{soph}_c(x) = \min \{\, K(S) : x \in S,\ K(x \mid S) \geq \log_2 |S| - c \,\}$$
## Implications for AI
Deep learning loss landscapes might follow similar dynamics:
- Early training: Simple patterns (high loss, low complexity)
- Mid training: Complex intermediate features
- Late training: Simplified, generalizable representations
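This hypothesis is in principle measurable. Below is a purely illustrative probe, not anything from the post: the tiny XOR network, the architecture, the quantization scale, and the sampling interval are all hypothetical choices. It tracks the compressed size of coarsely quantized weights over training; whether real training curves show the rise-and-fall is an open empirical question.

```python
import math
import random
import zlib

def weight_complexity(params, scale=32):
    """Probe: zlib size of the weights quantized to 8-bit integers."""
    q = bytes(max(0, min(255, round(w * scale) + 128)) for w in params)
    return len(zlib.compress(q))

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # toy task: XOR

H, lr = 4, 0.5                       # hypothetical hidden width / learning rate
w1 = [[random.gauss(0, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.gauss(0, 1) for _ in range(H)]
b2 = 0.0

history = []
for epoch in range(2000):
    for x, y in data:               # plain SGD with manual backprop
        h = [sig(w1[i][0] * x[0] + w1[i][1] * x[1] + b1[i]) for i in range(H)]
        p = sig(sum(w2[i] * h[i] for i in range(H)) + b2)
        g = p - y                   # dLoss/dz for sigmoid + cross-entropy
        for i in range(H):
            gh = g * w2[i] * h[i] * (1 - h[i])
            w2[i] -= lr * g * h[i]
            w1[i][0] -= lr * gh * x[0]
            w1[i][1] -= lr * gh * x[1]
            b1[i] -= lr * gh
        b2 -= lr * g
    if epoch % 200 == 0:
        flat = [w for row in w1 for w in row] + b1 + w2 + [b2]
        history.append(weight_complexity(flat))
print(history)                       # complexity probe sampled every 200 epochs
```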
## Key Resource
- [The First Law of Complexodynamics](https://scottaaronson.blog/?p=762) — Scott Aaronson