“The future belongs to those who can manipulate entropy; those who understand but energy will be only accountants. . . . The early industrial revolution involved energy, but the automatic factory of the future is an entropy revolution.” ~Frederic Keffer (~1900?)

Ludwig Boltzmann discovered the nature of entropy. Boltzmann was a physicist from Vienna. His first major contribution to science was a derivation of the ideal gas law, Pv = RT, from purely statistical arguments – no measurements involved, just free atoms modeled as billiard balls in a container with a statistical distribution of velocities. This is illustrated in Figure 1.
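To get a feel for what “purely statistical” means here, the sketch below (a rough Python illustration, not Boltzmann’s actual derivation) samples billiard-ball velocities from the Maxwell-Boltzmann distribution and recovers PV ≈ NkT from nothing but the average of the squared velocities. The temperature, particle mass, particle count, and box volume are arbitrary assumptions.

```python
import math
import random

# Rough statistical check of the ideal gas law PV = NkT.
# N point particles ("billiard balls") carry Maxwell-Boltzmann velocities;
# kinetic theory gives the wall pressure as P = N * m * <vx^2> / V
# (average momentum transfer per unit area and time).

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K (arbitrary choice)
m = 6.63e-26         # particle mass, kg (roughly an argon atom)
N = 100_000          # number of sampled particles
V = 1.0e-3           # box volume, m^3 (arbitrary choice)

random.seed(0)
sigma = math.sqrt(k_B * T / m)   # std. dev. of each velocity component

# Sample one velocity component per particle and average its square.
mean_vx2 = sum(random.gauss(0.0, sigma) ** 2 for _ in range(N)) / N

P = N * m * mean_vx2 / V         # kinetic-theory pressure
print(f"statistical  PV = {P * V:.4e} J")
print(f"ideal gas   NkT = {N * k_B * T:.4e} J")
```

No pressure gauge appears anywhere in the calculation; the gas law falls out of the statistics of the velocity distribution alone.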

Figure 1: Molecular ratchet for work extraction with atom distributions.

Boltzmann’s next discovery was even more earth-shattering. Boltzmann realized that the statistical distribution of molecular properties was important. A thought experiment similar to the one shown in Figure 1 was helpful with this. Which molecular ratchet do you think is more capable of extracting work from the atoms in the system? If you thought the one on the left, you are probably correct. Which system do you think is more ordered, with lower entropy? Again, the one on the left is probably the right answer.

This line of thinking led Boltzmann to realize that the microscopic entropy of a system is related to the statistical distribution of particle energies. The higher the number of ways an atom, molecule, or particle can occupy a space (different positions, velocities, vibrations, rotations, etc.), the higher the entropy of the system. This eventually led him to derive a mathematical expression for the entropy of a statistical system:
S = k log(W)
where W is the number of ways, or microstates, in which the system can be arranged.
In general, his conclusion for a theoretical or statistical system was that the entropy or disorder of the system is a constant multiplied by the logarithm of the number of ways, or the probability, that something can occur. The life of Boltzmann is a fascinating story in itself and worth a read. He believed in his equation so much that it is even engraved on his tombstone.

Figure 2: Boltzmann’s tombstone with his famous equation.
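A toy way to see the “number of ways” in action (my own example, not Boltzmann’s): take N coins and let the macrostate be just the number showing heads. W counts the arrangements that produce that macrostate, and S = k log(W) is largest for the even split, the most disordered case.

```python
import math

# Toy Boltzmann entropy: N two-state particles ("coins"), where the
# macrostate is the number of heads. W = C(N, n_heads) counts the ways
# (microstates) to realize each macrostate, and S = k * ln(W).

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 100              # number of coins (arbitrary)

for n_heads in (0, 10, 25, 50):
    W = math.comb(N, n_heads)        # number of ways
    S = k_B * math.log(W)            # Boltzmann entropy
    print(f"{n_heads:3d} heads: W = {float(W):.3e}, S = {S:.3e} J/K")
```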

Although heat and temperature were used to derive entropy from a classical perspective, starting from a statistical point of view allows entropy to be applied to many more things than heat and thermodynamics. Boltzmann’s equation continues to come up in many unexpected places. Claude Shannon found the same equation when studying the transfer of information during communication. Information entropy is behind the game everyone plays as a kid where you whisper a sentence into your neighbor’s ear, your neighbor whispers it into their neighbor’s ear, and so on down the line. Usually the end result is quite different, or disordered, from the beginning. The amount of disorder is directly related to the number of times the message was passed along and the number of ways each transmission can go wrong.
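Shannon’s version of the formula measures the uncertainty of a message in bits, H = -Σ p log2(p) summed over symbol probabilities. The sketch below is my own toy model of the whisper game, with an assumed 5% chance of mis-hearing each letter at each hop: the message drifts away from the original, and the entropy of its letter distribution creeps up toward that of random text.

```python
import math
import random
from collections import Counter

def shannon_entropy(text):
    """Shannon entropy, in bits per character, of the letter distribution."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy "whisper down the line": at each hop every letter is mis-heard
# (replaced by a random letter) with some small assumed probability.
ALPHABET = "abcdefghijklmnopqrstuvwxyz"
GARBLE_PROB = 0.05   # assumed 5% chance of mis-hearing each letter
random.seed(1)

def whisper(text):
    return "".join(random.choice(ALPHABET) if random.random() < GARBLE_PROB else ch
                   for ch in text)

original = "theraininspainstaysmainlyintheplain"
message = original
for hop in range(21):
    if hop % 5 == 0:
        intact = sum(a == b for a, b in zip(message, original)) / len(original)
        print(f"hop {hop:2d}: {intact:5.1%} intact, "
              f"{shannon_entropy(message):.2f} bits/char")
    message = whisper(message)
```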

Let’s try another example: the game of Hangman. I’m thinking of a five-letter word that starts with the letter “z”: z _ _ _ _.

Any guesses? How about now: z _ b _ _?

If you guessed “z e b r a” you got it.

Let’s try another five-letter word: _ e _ r _.

Having trouble?

This one is also “z e b r a”. So why is the first example easier to solve than the second? The answer lies in the number of ways, or the entropy, of the letters “z” and “b”. These letters are used in only a few ways in the English language; they have very low entropy, and only a small, orderly set of words can contain them. The letters “e” and “r” aRE all ovER the place. ThERE aRE a numbER of woRds that usE “e” and “r”. These letters have much higher entropy, and consequently it is much more difficult to guess the word they are associated with. Information entropy is partly behind the point values assigned to individual letters in the game of Scrabble.
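One rough way to put numbers on this (my own sketch; the letter frequencies below are approximate, commonly quoted figures rather than exact values): a letter that can be used in only a few ways is a rare letter, so revealing it carries a lot of information and eliminates most candidate words, which is roughly what Scrabble point values track.

```python
import math

# Surprisal (-log2 p) of a few letters under approximate English letter
# frequencies. The frequency values are rough assumptions for illustration.
# Rare letters carry more information per reveal, which is why
# "z _ _ _ _" pins down "zebra" so much faster than "_ e _ r _".
letter_freq = {"e": 0.127, "r": 0.060, "b": 0.015, "z": 0.0007}
scrabble_points = {"e": 1, "r": 1, "b": 3, "z": 10}

for letter, p in letter_freq.items():
    bits = -math.log2(p)
    print(f"'{letter}': ~{p:6.4f} of English text, "
          f"surprisal ~{bits:4.1f} bits, Scrabble value {scrabble_points[letter]}")
```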

Think about that the next time you come up with a password or code.
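The strength of a password is usually scored the same way: its entropy is the logarithm of the number of ways it could have been chosen. The schemes and character-set sizes below are illustrative assumptions of my own.

```python
import math

# Password entropy as log2 of the number of equally likely choices.
# The schemes below are illustrative assumptions, not recommendations.
schemes = {
    "4-digit PIN":                                    10 ** 4,
    "8 lowercase letters":                            26 ** 8,
    "8 chars, upper + lower + digits":                62 ** 8,
    "12 chars, upper + lower + digits":               62 ** 12,
    "4 random words from a 7776-word diceware list":  7776 ** 4,
}

for name, n_ways in schemes.items():
    print(f"{name:48s} {math.log2(n_ways):5.1f} bits")
```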

Statistical entropy comes up in a number of other fun situations that are fascinating to think about; try some for yourself:

  1. Entropy of a crosswalk or intersection as a predictor of accidents. Is a roundabout more entropic than a four-way stoplight?
  2. Entropy/diversity of a forest as a predictor of resilience to disease and natural disaster (one way to put a number on diversity is sketched after this list). Which is healthier: the old-growth forest or the replanted clear-cut? It’s a tough one, and the answer depends on context. My logging friends will give a very different answer than someone from the Sierra Club.
  3. Entropy/memetic diversity of a society as a predictor of resilience/robustness.
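For item 2, ecologists really do use Shannon’s formula as a diversity index, H′ = -Σ p ln(p) over species proportions. The stem counts below are made up purely to show how the index behaves for a mixed stand versus a near-monoculture.

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p * ln p) over species proportions."""
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values() if c)

# Made-up stem counts per hectare, purely illustrative.
old_growth = {"douglas fir": 40, "western hemlock": 30, "red cedar": 15,
              "bigleaf maple": 10, "red alder": 5}
replanted_clearcut = {"douglas fir": 95, "red alder": 5}

print(f"old growth       H' = {shannon_diversity(old_growth):.2f}")
print(f"replanted stand  H' = {shannon_diversity(replanted_clearcut):.2f}")
```

By this one crude measure the mixed stand is far more entropic; whether that translates into resilience is exactly the context-dependent argument above.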

Just like learning, entropy is a one-way street. Go ahead and try to get this out of your head after you’ve applied it.