It was no less than Albert Einstein who said, “Classical thermodynamics is the only physical theory of universal content concerning which I am convinced that, within the framework of applicability of its basic concepts, it will never be overthrown.” Decades later this statement still holds true; classical thermodynamics remains a cornerstone of physical law, with physicists still working to reconcile the predictions of quantum gravity with the laws of thermodynamics.
After more than a decade of teaching thermodynamics and system design, it’s time to more formally merge these disparate fields. If the laws of thermodynamics do indeed apply to and govern everything in the physical universe, then they also control the evolution of complex systems. What follows are core tenets (i.e. principles or beliefs), extended from thermodynamic law, for how complex systems evolve. Since humans are often forcing the evolution (and sometimes arbitrarily), we have to accept that these tenets, like the laws of thermodynamics, are statistical: while we can arbitrarily force a solution that disagrees with a tenet, the history of many attempts will show these tenets to hold true on average.
Generalized in this way, these tenets apply to a host of systems, both natural and human-made: from traditional energy systems to complex machinery, from social interactions to business and economics, from biological systems to Nature and the Universe.
1) Complex systems exist to sustain themselves
The first law of thermodynamics states that energy is conserved, neither created nor destroyed. Extended to complex systems, this most basic law of the universe defines not only what a complex system is, but what complex systems do. A complex system is something, really anything, that exists as a result of an energy cascade. An energy cascade is a gradient, or difference, in anything that can be measured or perceived. These gradients often occur in a cascade, or spectrum, of differing degrees, resulting in many interconnected complex systems. We can define this something as a complex system because it persists and sustains itself in an energy cascade. Said simply, complex systems exist at many scales because of an energy cascade, and they can be identified because they persist and sustain themselves.
2) Intelligent systems maximize future freedoms
A leading definition of intelligence is doing whatever maximizes future freedoms. When I introduce the second law of thermodynamics, which states that entropy (a.k.a. disorder) must always increase, I use the example of 1 Watt of heat: this Watt could come from rubbing my hands together or from a fusion energy device. The question is, which Watt is more valuable? The Watt that’s keeping my hands warm? Or the Watt that could pyrolyze water, propel an interstellar spacecraft, or transmute elements? Obviously, the Watt from a fusion reactor is far more useful, in many more ways, should we be intelligent about how we use it (we could always waste it on boiling water or hand warming). What should also be noted is that the temperature gradient for the fusion Watt is MUCH larger than for the hand-warming Watt. This larger temperature gradient enables a much higher Carnot, or limiting, efficiency of the system. The higher efficiency lets you do the work you want done before you lose the energy to simple warmth and entropy. Said simply, not all energy is created equal, and not all intelligence is equal. The entropy of the universe always increases, sometimes in more ways than others. But when we’re smart about it, we leave ourselves more options going forward.
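A back-of-the-envelope sketch in Python makes the point with the Carnot limit, eta = 1 - T_cold/T_hot. The temperatures below are rough illustrative assumptions, not measurements of any particular device.

```python
# Carnot (limiting) efficiency: eta = 1 - T_cold / T_hot
# Temperatures are illustrative assumptions, not measured values.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Upper bound on the fraction of heat that can be converted to work."""
    return 1.0 - t_cold_k / t_hot_k

ambient = 295.0          # K, roughly room temperature
hands = 310.0            # K, skin temperature while rubbing your hands
fusion_loop = 900.0      # K, a plausible heat-extraction temperature for a fusion plant

print(f"Hand-warming Watt:   eta <= {carnot_efficiency(hands, ambient):.2%}")
print(f"Fusion-sourced Watt: eta <= {carnot_efficiency(fusion_loop, ambient):.2%}")
# The same 1 W of heat is far more useful when it arrives across a large
# temperature gradient: more of it can be turned into work before it
# degrades to low-grade warmth and entropy.
```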
3) Systems change when more things become connected, more simply
Combining the first and second laws of thermodynamics most simply yields the Gibbs energy, G = U + PV - TS (where G is Gibbs energy, U internal energy, P pressure, V volume, T temperature, and S entropy). The difference in Gibbs energy (G2 - G1) determines whether a system will spontaneously react or change (if negative) or stay the same (if positive). While there are multiple terms in this function, you can generally see that when the entropy (S, the number of ways something can be) goes up, the difference G2 - G1 is more likely to be negative, because the G2 term gets smaller, and a change will spontaneously occur. You can see this in all chemical reactions, but also in Miles Davis’ embrace of modal harmony on Kind of Blue, the Android operating system on your phone, Mike Leach’s Air Raid offense, and basically anything that has ever changed. Change occurs when more ways or things become interconnected, more simply.
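A minimal numerical sketch of this sign test, using made-up state values rather than data for any real reaction, might look like this:

```python
# The sign of the Gibbs energy change decides spontaneity: a change "goes" if G2 - G1 < 0.
# All numbers below are made-up illustrative state values, not data for a real reaction.

def gibbs(u: float, p: float, v: float, t: float, s: float) -> float:
    """G = U + P*V - T*S (energies in J, P in Pa, V in m^3, T in K, S in J/K)."""
    return u + p * v - t * s

T, P = 298.0, 101_325.0                                 # fixed temperature and pressure

state_1 = gibbs(u=5_000.0, p=P, v=0.010, t=T, s=10.0)   # fewer accessible arrangements
state_2 = gibbs(u=5_000.0, p=P, v=0.010, t=T, s=14.0)   # higher entropy, same U and V

dG = state_2 - state_1
print(f"G2 - G1 = {dG:.1f} J -> {'spontaneous change' if dG < 0 else 'no change'}")
# Raising S lowers G2 (through the -T*S term), pushing G2 - G1 negative:
# more interconnected "ways to be" make the change happen on its own.
```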
4) The speed of system change depends on the density, resistivity, and capacity
A classic statement about the third tenet is that classical thermodynamics will tell you whether something will change, but says nothing about how fast it will change. The fourth tenet, together with Non-Equilibrium Thermodynamics, does. Non-Equilibrium Thermodynamics has several rate constants for the different forcing mechanisms driving entropy generation. In the case of heat transfer, the rate is controlled by the thermal diffusivity, defined as the thermal conductivity divided by the product of the density and the heat capacity. This thermal diffusivity controls how quickly a change in the driving temperature differential propagates through a system. Similar relations can be developed for momentum, mass, and electrical diffusion. A key challenge is extending this framework to other systems like economics, biology, and information transfer.
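As a rough sketch, assuming textbook-order property values for copper and pine, the diffusivity and the time scale for a temperature change to cross a centimeter of material can be estimated like this:

```python
# Thermal diffusivity alpha = k / (rho * c_p) sets how fast a temperature change
# propagates; a characteristic diffusion time over length L is t ~ L**2 / alpha.
# Material properties below are rough textbook-order values, used only for illustration.

def thermal_diffusivity(k: float, rho: float, c_p: float) -> float:
    """alpha in m^2/s, from conductivity k (W/m-K), density rho (kg/m^3), heat capacity c_p (J/kg-K)."""
    return k / (rho * c_p)

def diffusion_time(length_m: float, alpha: float) -> float:
    """Rough time scale for a temperature change to cross a slab of thickness length_m."""
    return length_m ** 2 / alpha

materials = {
    "copper": thermal_diffusivity(k=400.0, rho=8960.0, c_p=385.0),
    "pine":   thermal_diffusivity(k=0.12,  rho=500.0,  c_p=1300.0),
}

for name, alpha in materials.items():
    print(f"{name:>6}: alpha = {alpha:.2e} m^2/s, "
          f"t(1 cm) ~ {diffusion_time(0.01, alpha):.0f} s")
# High conductivity and low storage (rho * c_p) mean fast change; the same logic
# carries over to momentum, mass, and electrical diffusion with their own constants.
```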
5) Robust systems are loosely coupled
This is a core tenet of computer programmers. Energy cascades create and necessitate the coexistence of many systems, systems of systems. Look at any complex system in Nature and you’ll see many mostly independent sub-systems existing in the harmonious synergy of the energy cascade: big trees, medium trees, small trees, large shrubs, small shrubs, plants and flowers, grasses, mosses and fungi, the bacterial microcosm, and so on, in a complex synergistic web of existence. Very few things in the physical world have to work around a single entropy generation mechanism. Most, if not all, are complex problems with many competing drivers of entropy generation; optimize for one and another will get you. This is how life and Nature evolve: in a random walk to fill gaps in the energy cascade. Also known as the Constructal Law of thermodynamics: complex systems, when left free to morph, will evolve to maximize flow through competing extremes. Nature knows that over-reliance on specific structures, or over-specialization, does not allow for maximization of flow in the face of an ever-changing universe. Non-Equilibrium Thermodynamics requires the transport of energy to generate entropy, leading to branching, tree-like system hierarchies. Loosely couple the systems so that if one fails, several are ready to fill the void. This is the reason our cellphones haven’t been replaced by neural links, that the US political system is so robust, and that AI won’t totally wipe us out. Loose coupling is the most robust in the long run. (Special thanks to Chuck Pezeshki, who pointed this tenet out to me years ago.)
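For what “loosely coupled” means to a programmer, here is a toy sketch (the providers and their failure modes are hypothetical, purely illustrative): several independent sub-systems sit behind the same narrow interface, so losing any one of them degrades the whole gracefully rather than catastrophically.

```python
# Toy illustration of loose coupling: several independent providers sit behind the
# same narrow interface, so the system degrades gracefully when any one of them fails.
# The providers and failure modes here are hypothetical, purely for illustration.

from typing import Callable, List

def solar() -> float:
    raise RuntimeError("cloudy today")        # this sub-system happens to be down

def wind() -> float:
    return 0.8                                # kW, made-up output

def battery() -> float:
    return 0.5                                # kW, made-up output

def draw_power(providers: List[Callable[[], float]]) -> float:
    """Take whatever each loosely coupled provider can give; skip the ones that fail."""
    total = 0.0
    for provider in providers:
        try:
            total += provider()
        except RuntimeError:
            continue                          # no shared state, so one failure stays local
    return total

print(f"Delivered: {draw_power([solar, wind, battery]):.1f} kW")
# A tightly coupled design (one big provider, or providers that reach into each
# other's internals) would lose everything when a single piece fails.
```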
Now what?
Ok. So that’s it (so far). That’s how just about everything we know of evolves and changes under thermodynamic law. The question really becomes: how many complex systems not traditionally associated with thermodynamic law can we apply this to? I also imagine you’re skeptical and not yet comfortable with the thermodynamics; getting comfortable with this framework is going to take some time. This will likely become the core theme of a book I plan to write during an upcoming sabbatical. The book will carefully teach the framework and then consider possible extensions. Let me know if you’d like to be involved or would like to see this framework applied to specific topics.