This post was developed by 527 students Patrick Adam, Yueqi Hu, and Nathan Royer.
In 1697, the English engineer Thomas Savery constructed a rudimentary steam engine (see right figure); and some 80 years later, in 1781, the Scottish inventor James Watt patented an improved steam engine. Here, for the first time in history, was a contrivance that made it possible to use “heat” to produce “mechanical motion”. However, these early heat engines were inefficient.
In the years that followed, many efforts were made to improve the efficiency of these new-fangled machines. In 1824, a French engineer named Nicolas Léonard Sadi Carnot wrote a paper called “Reflections on the Motive Power of Fire” on the efficiency of steam engines. In this paper, Carnot identified that for any engine to repeatedly convert heat into mechanical work, a constant temperature differential must exist. Carnot subscribed to a prevailing theory of the time, which held that heat was the external manifestation of some form of hot fluid present inside bodies. Consequently, Carnot assumed that heat must flow in a fashion similar to how water flows, and declared that heat always spontaneously flows from hot to cold, and will continue to do so while any temperature differential exists.
In 1845, the English physicist James Prescott Joule showed how a certain amount of mechanical work, done by the vigorous stirring of a paddlewheel in water, could produce a directly proportional rise in the temperature of the water. With this work, Joule gave birth to the concept of the “Mechanical Equivalent of Heat”.
Later, at a meeting of the British Association for the Advancement of Science in Oxford in 1847, Joule communicated his work and ideas to the Belfast-born mathematician and physicist William Thomson (who was later given the peerage Baron Kelvin of Largs). Kelvin set to work on the theories of Carnot and Joule and understood that “heat” and “work” are simply different forms of energy. He realized that in the operation of an engine, heat is not conserved but is actually “converted” into work. Thus, Carnot’s principle regarding the conservation of heat was replaced by the principle of “The Conservation of Energy”. Although it is possible to completely convert work into heat (by friction), Kelvin found that it is impossible to completely convert heat into work without having some “useless” heat left over. Kelvin described this perceived inevitable loss of heat as the principle of “The Dissipation of Energy”.
In 1850, the German physicist Rudolf Clausius supposed that some change occurs while a working body, e.g. steam, converts heat into work. He approached this “change” mathematically by questioning the nature of the inherent loss of usable heat when work is done, describing it as the transformation-content, i.e. the dissipative energy use, of a thermodynamic system or working body during a change of state. In the same paper, he also formulated the first principle of thermodynamics, stating that in a physical system without energy exchanges with the external world the total energy is conserved, and he proposed the second principle of thermodynamics, stating that heat cannot flow from a cold to a warm body without the expenditure of energy.
In 1854, following a mathematical presentation of the first principle of thermodynamics, Clausius presented the mathematical formulation of entropy. He stated: “If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q from work at the temperature T has the equivalence-value Q/T.” The passage of the quantity of heat Q from the temperature T1 to the temperature T2 then has the equivalence-value

Q (1/T2 − 1/T1)
At this point in the development of his theories he still called this quantity the “equivalence-value”, perhaps referring to the concept of the mechanical equivalent of heat, which was developing at the time.
In the period 1857–1865, Clausius published his works on the kinetic theory of gases, including in the theoretical treatment the translational, rotational, and vibrational motions of molecules. In 1865 he introduced one of the fundamental concepts of classical thermodynamics, that of entropy, as the ratio ∆S = ∆Q/T between the quantity of heat ∆Q transferred in a process from a warm to a cold source (and therefore not utilized to do work) and the temperature T at which the process takes place. Clausius selected the word entropy for its resemblance to the word energy. Although he did not specify why he chose the symbol “S” to represent entropy, it is arguable that he chose “S” in honor of Sadi Carnot.
The Entropy / Disorder Debate
Entropy has often been mischaracterized as a measure of disorder in a system; however, this analogy is inaccurate and misleading when talking about thermodynamics. Information entropy, which is a measure of disorder, should not be conflated with thermodynamic entropy, which is essentially a measure of the spontaneous dispersal of energy (how widely spread out it becomes at a specific temperature). Stated differently, “It is simply a measure of the total amount of energy that had to be dispersed within the substance (from the surroundings) from 0 K to T, incrementally and reversibly and divided by T for each increment, so the substance could exist as a solid or (with additional reversible energy input for breaking intermolecular bonds in phase changes) as a liquid or as a gas at the designated temperature. Because the heat capacity at a given temperature is the energy dispersed in a substance per unit temperature, integration from 0 K to T of Cp/T dT (+ q/T for any phase change) yields the total entropy.”
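The Cp/T integration described in that quote can be sketched numerically. The heat-capacity table below is entirely made up for illustration (a real calculation would use measured Cp data extending down toward 0 K, plus q/T terms for any phase changes):

```python
# A minimal sketch of the Cp/T dT integration, assuming a hypothetical
# solid with no phase changes between 10 K and 298 K. The Cp values
# below are invented, not measured data.

def entropy_from_cp(temps, cps):
    """Trapezoidal integration of Cp/T dT over the tabulated range, J/(mol·K)."""
    s = 0.0
    for i in range(1, len(temps)):
        dT = temps[i] - temps[i - 1]
        integrand_avg = 0.5 * (cps[i] / temps[i] + cps[i - 1] / temps[i - 1])
        s += integrand_avg * dT
    return s

# Hypothetical data: Cp rising with T, roughly Debye-like at low T.
temps = [10, 50, 100, 150, 200, 250, 298]         # K
cps   = [0.3, 6.2, 14.1, 19.5, 22.8, 24.6, 25.4]  # J/(mol·K)

S = entropy_from_cp(temps, cps)
print(f"Entropy dispersed from 10 K to 298 K ≈ {S:.1f} J/(mol·K)")
```

Because the integrand is Cp/T, the low-temperature region contributes modestly even though Cp itself is small there; finer temperature steps near 0 K improve the estimate.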
Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis, which defines its physical meaning from a microscopic view, expressed by S = kB ln W, where S is entropy, kB is Boltzmann’s constant, and W is the number of microstates. A microstate is one arrangement of the energy of every particle in the whole system at one instant, and each microstate has an associated probability of occurring. The number of microstates accessible to a system thus indicates all the different ways energy can be arranged in the system; the larger the number of accessible microstates, the larger the system’s entropy will be at a given temperature. To give an idea of the sheer number of microstates that can exist in a substance: the total number of atoms in the universe is estimated to be about 10^80, while the total number of microstates accessible in crystalline ice at 273 K is about 10^1,299,000,000,000,000,000,000,000.
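A number like 10^1,299,000,000,000,000,000,000,000 cannot be stored directly, but S = kB ln W only ever needs the logarithm of W. A short sketch, using the ice figure from the text:

```python
import math

# Boltzmann's relation S = kB * ln(W). W is far too large to represent
# as a number, so we work with log10(W) directly.
kB = 1.380649e-23  # J/K

def entropy_from_log10_W(log10_W):
    """Entropy from the base-10 logarithm of the microstate count."""
    return kB * log10_W * math.log(10)

# The ice example from the text: W ≈ 10^(1.299e24) at 273 K.
S_ice = entropy_from_log10_W(1.299e24)
print(f"S ≈ {S_ice:.1f} J/K")  # prints S ≈ 41.3 J/K
```

The astronomically large microstate count collapses, through the logarithm, to a perfectly ordinary macroscopic entropy of a few tens of joules per kelvin.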
A Peek at Microscale Thermodynamics
For simple monoatomic systems there exists a distribution whose probability is orders of magnitude larger than that of all other distributions combined, known as the most probable distribution and denoted wmp, such that

S = kB ln wmp
Assuming low pressure and low to moderate temperature of the system of particles, Bose–Einstein (B–E) and Fermi–Dirac (F–D) statistics reduce to the same form. Using the method of corrected Boltzmann statistics, obtained by dividing the B–E statistic by N!, the thermodynamic probability for a statistical distribution is given by

w = ∏j (gj^Nj / Nj!)
where j is the energy level, gj is the number of quantum states (degeneracy) of level j, Nj is the number of particles occupying level j, and N = ∑j Nj is the total number of particles in the system. The degeneracy gj can be thought of as the statistical weight assigned to each energy level. A quantum state is a discrete energy state represented by four quantum numbers: (n) the principal quantum number (electron shell), (l) the azimuthal quantum number (shape of the orbital), (m) the magnetic quantum number (projection of angular momentum), and (s) the spin quantum number (intrinsic angular momentum of the electron).
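For a toy system this counting can be done directly. The following sketch evaluates the corrected-Boltzmann thermodynamic probability w = ∏j gj^Nj / Nj! for an invented three-level system (the degeneracies and occupation numbers are made up for illustration):

```python
from math import factorial, prod

def thermo_probability(g, N):
    """Corrected-Boltzmann thermodynamic probability
    w = prod_j ( g_j**N_j / N_j! ) for occupations N of levels with degeneracies g."""
    return prod(gj**Nj / factorial(Nj) for gj, Nj in zip(g, N))

g = [1, 3, 5]   # quantum states (degeneracy) per level -- illustrative values
N = [4, 2, 1]   # particles occupying each level       -- illustrative values

w = thermo_probability(g, N)
print(w)  # prints 0.9375
```

Note that because of the N! correction for indistinguishability, w need not be a large integer; what matters is comparing w across candidate distributions to find the most probable one, wmp.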
Using Stirling’s formula, the most probable distribution in logarithmic form is

ln wmp = ∑j Nj [ln(gj/Nj) + 1]
However, for the most probable distribution wmp, the ratio gj/Nj can be found from the equilibrium distribution, Nj/N = (gj/Z) e^(−εj/kBT), and the expression simplifies such that

ln wmp = N ln(Z/N) + (1/(kBT)) ∑j Nj εj + N
where Z = ∑j gj e^(−εj/kBT) is known as the partition function for a system of N independent particles.
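The partition function can be read as a count of the thermally accessible quantum states: near T = 0 only the ground level contributes, while at very high T every state does. A sketch for a made-up two-level system (degeneracies and energies are illustrative, not taken from the text):

```python
import math

kB = 1.380649e-23  # J/K

def partition_function(g, eps, T):
    """Single-particle partition function Z = sum_j g_j * exp(-eps_j / (kB*T))."""
    return sum(gj * math.exp(-ej / (kB * T)) for gj, ej in zip(g, eps))

g   = [1, 3]          # degeneracies of the two levels (illustrative)
eps = [0.0, 2.0e-21]  # level energies in J (illustrative; spacing ~ kB * 145 K)

Z_low  = partition_function(g, eps, 1.0)   # ~1.0: only the ground state counts
Z_high = partition_function(g, eps, 1e6)   # ~4.0: all four quantum states count
print(Z_low, Z_high)
```

Between these limits, Z interpolates smoothly, which is why it appears in both the entropy expression above and the internal energy below.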
How does the statistical microscopic derivation agree with the definition from classical thermodynamics? The change in entropy is defined as

dS = δQrev/T
and for a reversible process

δQrev = dU + δW
From the statistical thermodynamic definition of a reversible work process (a shift of the energy levels εj at fixed occupations), work can be defined as

δW = −∑j Nj dεj
and the internal energy by

U = ∑j Nj εj,  so that  dU = ∑j εj dNj + ∑j Nj dεj  and  δQrev = dU + δW = ∑j εj dNj
By substituting εj/(kBT) = ln(gj/Nj) − ln(Z/N) from the equilibrium distribution into dS above, and noting that ∑j dNj = 0 for a closed system, we arrive at

dS = (1/T) ∑j εj dNj = kB ∑j ln(gj/Nj) dNj = kB d(ln wmp)
which integrates to S = kB ln wmp and so matches the previous, classical definition of entropy, ∆S = ∆Q/T.
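This agreement can be checked numerically. The sketch below uses a single particle in a made-up two-level system (so the indistinguishability correction drops out and S = U/T + kB ln Z), holds the energy levels fixed (so δW = 0 and δQrev = dU), and compares the change in statistical entropy against the classical integral of dU/T:

```python
import math

kB = 1.380649e-23  # J/K
eps = 2.0e-21      # level spacing in J (illustrative; eps/kB ≈ 145 K)

def Z(T):
    """Partition function of a non-degenerate two-level system, levels 0 and eps."""
    return 1.0 + math.exp(-eps / (kB * T))

def U(T):
    """Mean energy of one particle."""
    return eps * math.exp(-eps / (kB * T)) / Z(T)

def S(T):
    """Statistical entropy for one particle: S = U/T + kB * ln(Z)."""
    return U(T) / T + kB * math.log(Z(T))

# Classical side: integrate dS = dQrev/T = dU/T from T1 to T2 (fixed levels).
T1, T2, steps = 50.0, 300.0, 50000
dT = (T2 - T1) / steps
dS_integral = sum((U(T1 + (i + 1) * dT) - U(T1 + i * dT)) / (T1 + (i + 0.5) * dT)
                  for i in range(steps))

print(S(T2) - S(T1), dS_integral)  # the two values agree closely
```

The two numbers match to within the discretization error of the integral, illustrating that the statistical S = kB ln wmp reproduces the classical ∆S = ∆Q/T.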
Rudolf Clausius, “On the Motive Power of Heat, and on the Laws which can be Deduced from it for the Theory of Heat”. Poggendorff’s Annalen der Physik, 1850.
Rudolf Clausius, “Ueber eine veränderte Form des zweiten Hauptsatzes der mechanischen Wärmetheorie” (“On a Modified Form of the Second Fundamental Theorem of the Mechanical Theory of Heat”). Annalen der Physik und Chemie 93(12): 481–506, 1854.
Rudolf Clausius, “On the Application of the Mechanical Theory of Heat to the Steam-Engine”, 1856.
Rudolf Clausius, The Mechanical Theory of Heat (book), 1865.
Milivoje M. Kostic, “The Elusive Nature of Entropy and Its Physical Meaning”. Entropy, 2014, Vol. 16(2): 953–967.
Frank Lambert, Entropy Is Simple. [Online] http://entropysite.oxy.edu/entropy_is_simple/index.html#microstate.
Frank Lambert, A Description of Microstates. [Online] http://entropysite.oxy.edu/microstate/.