Hydrogen Properties for Energy Research (HYPER) Lab Patrick Adam’s Posts

Finding Cryogenic Material Properties

Many people don’t consider from day to day how we know the properties of any given material for use in design. It seems to be common knowledge that water freezes at 0°C, and it’s easy enough to look up thermal conductivities or heat capacities of common metals, gases, and building materials. What happens, however, when your operating conditions are hundreds of degrees below room temperature? You can’t assume the same, easily found values anymore – you have to find someone who has taken the measurements at those extreme temperatures. So where do you go? Here’s a list of some good options we’ve used in the past to find data.

  1. Engineering Equation Solver (EES)
    • The lab uses EES for much of the thermodynamic calculations we do, and one reason is that it has standard curves for thermodynamic properties of many materials across a wide range of operating conditions. Through available function calls, you can get accurate thermodynamic properties for the most common real and ideal fluids, even at cryogenic temperatures. EES also has a selection of commonly used incompressible substances. Whenever using a material for the first time, make sure you look at the substance properties and references to ensure you understand the valid operating conditions and assumptions the substance is using.
  2. NIST Cryogenic Materials Database
    • NIST has data for several common structural materials at cryogenic temperatures. Some of these are referenced as incompressible substances in EES, some are not. The specific properties given vary from substance to substance.
  3. Researchmeasurments.com / Jack Ekin’s Experimental Techniques for Low-temperature Measurements: Cryostat Design, Materials Properties, and Superconductor Critical-Current Testing
    • This site provides supplemental information and updates for the book Experimental Techniques for Low-temperature Measurements: Cryostat Design, Materials Properties, and Superconductor Critical-Current Testing, published by Oxford University Press in 2006, 2007, and 2011. The book is a handy guide we often use for reference in building our cryogenic systems. The site has many figures and data tables from the book, including many on the various properties of materials commonly used in cryogenic design. The book provides much further insight into design that is not available on the website, and the lab owns several copies for reference.

These are the sources we’ve used the most in the lab – please let us know if you have a favorite we haven’t listed!
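If you want to embed these data in your own scripts, note that the NIST cryogenic material property data are typically published as curve fits of the form log10(property) = Σ ai·(log10 T)^i. Below is a minimal Python sketch of evaluating that form; the coefficient lists in the demo are placeholders to exercise the math, not real NIST values, so pull the actual coefficients for your material and property from the NIST tables.

```python
import math

def nist_curve_fit(T, coeffs):
    """Evaluate a NIST-style cryogenic property curve fit:
    log10(property) = sum_i a_i * (log10 T)**i,
    with T in kelvin and coeffs = [a0, a1, ...] from the NIST table.
    Only valid inside the temperature range quoted with the fit."""
    logT = math.log10(T)
    log_y = sum(a * logT**i for i, a in enumerate(coeffs))
    return 10.0 ** log_y

# Placeholder coefficients (NOT real NIST data) to check the evaluation:
# [1.0] gives 10**1 = 10 at any T; [0.0, 1.0] gives 10**(log10 T) = T.
print(nist_curve_fit(77.0, [1.0]))       # → 10.0
print(nist_curve_fit(100.0, [0.0, 1.0])) # → 100.0
```

EES and the NIST pages remain the authoritative sources; this is just a convenient way to carry a published fit into your own calculations.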

ME 527 Lesson 18: Entropy

This post was developed by 527 students Patrick Adam, Yueqi Hu, and Nathan Royer.

History

In 1697, the English engineer Thomas Savery constructed a rudimentary steam engine; some 80 years later, in 1781, the Scottish inventor James Watt patented an improved steam engine. Here for the first time in history was a contrivance that made possible the use of “heat” to produce “mechanical motion”. However, these early heat engines were inefficient.


In the years following, many efforts were made to improve the efficiency of these new-fangled machines. In 1824 a French engineer named Nicolas Léonard Sadi Carnot wrote a paper called “Reflections on the Motive Power of Fire” on the efficiencies of steam engines. In this paper, Carnot identified that for any engine to repeatedly convert heat into mechanical work, a constant temperature differential must exist. Carnot subscribed to a prevailing theory of the time: that heat was actually the external manifestation of the internal presence of some form of hot fluid. Consequently, Carnot assumed that heat must flow in a similar fashion to how water flows. Carnot declared that heat always spontaneously flows in the direction from hot to cold and will continue to do so while any temperature differential exists.

In 1845, the English physicist James Prescott Joule showed how a certain amount of mechanical work, done by the vigorous stirring of a paddlewheel in water, could produce a directly proportional rise in the temperature of the water. With this work, Joule gave birth to the concept of the “Mechanical Equivalence of Heat”.

Later, at a meeting of the British Association for the Advancement of Science in Oxford in 1847, Joule communicated his work and ideas to the Belfast-born mathematician and physicist William Thomson (who was later given the peerage Baron Kelvin of Largs). Kelvin set to work on the theories of Carnot and Joule and understood that “heat” and “work” are just different forms of energy. He realized that in the operation of an engine, heat is not being conserved, but actually being “converted” into work. Thus, Carnot’s principle regarding the conservation of heat was replaced with the principle of “The Conservation of Energy”. Although it is possible to completely convert work into heat (by friction), Kelvin found that it is impossible to completely convert heat into work without having some “useless” heat left over. Kelvin described this perceived inevitable loss of heat as the principle of “The Dissipation of Energy”.

In 1850, German physicist Rudolf Clausius supposed that some change occurs while a working body, e.g. steam, converts heat into work. He approached a mathematical interpretation of this “change” by questioning the nature of the inherent loss of usable heat when work is done. Clausius described it as the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body during a change of state [1]. In this paper, he also formulated the first principle of thermodynamics, stating that in a physical system without energy exchanges with the external world, the total energy is conserved. And he proposed the second principle of thermodynamics, stating that heat cannot flow from a cold to a warm body without the expense of energy.


In 1854, following a mathematical presentation of the first principle of thermodynamics, Clausius presented the mathematical formulation of entropy. He stated [2]: If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q from work at the temperature T has the equivalence-value Q/T, and the passage of the quantity of heat Q from the temperature T1 to the temperature T2 has the equivalence-value:

Q(1/T2 − 1/T1)

At this point in the development of his theories he still called it “equivalence-value”, perhaps a reference to the concept of the mechanical equivalent of heat, which was developing at the time.

In the period 1857–1865, Clausius published his works on the kinetic theory of gases, including in the theoretical treatment the translational, rotational, and vibrational motions of the molecules [3]. In 1865 he introduced the fundamental concept of classical thermodynamics, that of entropy [4], as the ratio ∆S = ∆Q/T between the quantity ∆Q of heat transferred in a process from a warm to a cold source (and therefore not utilized to do work) and the temperature T at which the process takes place. Clausius selected the word entropy due to its resemblance to the word energy. Although Clausius did not specify why he chose the symbol “S” to represent entropy, it is arguable that he chose “S” in honor of Sadi Carnot.

The Entropy / Disorder Debate

Entropy has often been mischaracterized as a measure of disorder in a system; however, this analogy is inaccurate and misleading when talking about thermodynamics. Information entropy, which IS a measure of disorder, should not be conflated with thermodynamic entropy, which is essentially a measure of the spontaneous dispersal of energy (how widely spread out it becomes at a specific temperature) [5]. Stated differently, “It is simply a measure of the total amount of energy that had to be dispersed within the substance (from the surroundings) from 0 K to T, incrementally and reversibly and divided by T for each increment, so the substance could exist as a solid or (with additional reversible energy input for breaking intermolecular bonds in phase changes) as a liquid or as a gas at the designated temperature. Because the heat capacity at a given temperature is the energy dispersed in a substance per unit temperature, integration from 0 K to T of Cp/T dT (+ q/T for any phase change) yields the total entropy.” [6]
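That integral is easy to check numerically. As a sketch, take a Debye-law heat capacity Cp = aT³ (a reasonable low-temperature form for simple crystals; the coefficient a here is an illustrative number, not a real material’s): integrating Cp/T from 0 K up to T should give exactly Cp(T)/3.

```python
import math

# Numerical check of S(T) = integral of Cp/T dT with a Debye-law heat
# capacity Cp = a*T**3. The analytic result is a*T**3/3 = Cp(T)/3, so
# the quadrature is easy to verify. The coefficient a is illustrative.
a = 2.0e-4  # J/(mol*K^4), made-up value for demonstration

def Cp(T):
    return a * T**3

def entropy(T_final, n=100000):
    """Trapezoidal integration of Cp/T from ~0 K to T_final."""
    dT = T_final / n
    total = 0.0
    for i in range(n):
        T1, T2 = i * dT, (i + 1) * dT
        f1 = Cp(T1) / T1 if T1 > 0 else 0.0  # Cp/T = a*T**2 → 0 as T → 0
        f2 = Cp(T2) / T2
        total += 0.5 * (f1 + f2) * dT
    return total

S = entropy(20.0)
print(S, Cp(20.0) / 3.0)  # the two values agree closely
```

The same quadrature works with tabulated Cp data for a real material, with q/T terms added at each phase change.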

Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis, which defines its physical meaning from a microscopic view through S = kB ln W, where S is entropy, kB is Boltzmann’s constant, and W is the number of microstates. A microstate is one arrangement of the energy of every particle in the whole system at one instant, and each has an associated probability of occurring. The number of microstates accessible to a system thus counts all the different ways energy can be arranged in the system, so the larger the number of accessible microstates, the larger a system’s entropy will be at a given temperature. To give you an idea of the sheer number of microstates that can exist in a substance, the total number of atoms in the universe is estimated to be about 10^80, while the total number of microstates accessible in crystalline ice at 273 K is 10^1,299,000,000,000,000,000,000,000 [7].
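A number that large cannot even be stored as an ordinary floating-point value, but S = kB ln W tames it: working with the exponent directly, the quoted microstate count collapses to an everyday-sized entropy.

```python
import math

# S = kB * ln(W) for the microstate count quoted above for ice at 273 K:
# W = 10**(1.299e24), so ln W = 1.299e24 * ln(10). Working with the
# exponent avoids ever forming W itself, which would overflow a float.
kB = 1.380649e-23      # Boltzmann constant, J/K
exponent = 1.299e24    # W = 10**exponent
S = kB * exponent * math.log(10.0)
print(S)  # ≈ 41 J/K
```

The logarithm is what makes the statistical definition usable: an astronomically large W maps to an entropy on the ordinary per-mole scale.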

A Peek at Microscale Thermodynamics

For simple monoatomic systems there exists a distribution of particles over the energy levels whose probability is orders of magnitude larger than that of all other distributions combined. It is known as the most probable distribution and denoted wmp, such that

S = kB ln wmp

Assuming low pressure and low to moderate temperature of the system of particles, Bose-Einstein (B-E) and Fermi-Dirac (F-D) statistics reduce to the same form. Using the method of corrected Boltzmann statistics, dividing the B-E expression by N!, the thermodynamic probability for a statistical distribution is given by

wmp = Πj gj^Nj / Nj!

where j is the energy level, gj is the number of quantum states (degeneracy) of level j, and Nj is the number of particles occupying level j, with N = Σj Nj the total number of particles in the system. The quantum state can be thought of as the statistical weight assigned to each energy level. A quantum state is a discrete energy state represented by four quantum numbers: (n) – principal quantum number (number of electron shells), (l) – azimuthal quantum number (shape of orbital), (m) – magnetic quantum number (projection of angular momentum), and (s) – spin quantum number (angular momentum of electron).


Using Stirling’s approximation, the most probable distribution in logarithmic form is

ln w = Σj Nj [ln(gj/Nj) + 1]
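A quick numerical sanity check of the Stirling step, using illustrative degeneracies and occupations in the dilute limit (gj ≫ Nj) where corrected Boltzmann statistics apply:

```python
import math

# Exact ln w for the corrected-Boltzmann count w = prod_j g_j**N_j / N_j!,
# compared with the Stirling form ln w ≈ sum_j N_j * [ln(g_j/N_j) + 1].
# The degeneracies g and occupations N below are made-up numbers chosen
# so that g_j >> N_j, the dilute limit where this statistic is valid.
g = [1000, 2000, 4000]   # quantum states per energy level (illustrative)
N = [50, 30, 20]         # particles per energy level (illustrative)

ln_w_exact = sum(Nj * math.log(gj) - math.log(math.factorial(Nj))
                 for gj, Nj in zip(g, N))
ln_w_stirling = sum(Nj * (math.log(gj / Nj) + 1) for gj, Nj in zip(g, N))

print(ln_w_exact, ln_w_stirling)  # agree to within a couple of percent
```

Even at occupations of only a few tens of particles the approximation is already close; for thermodynamic particle counts (~10^23) the error is utterly negligible.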

However, for the most probable distribution wmp, the ratio gj/Nj can be found from the equilibrium distribution and simplified such that

ln wmp = N ln(Z/N) + N + U/(kBT)

and the entropy becomes

S = kB ln wmp = U/T + N kB [ln(Z/N) + 1]

where Z = Σj gj e^(−εj/(kBT)) is known as the partition function for a system of N independent particles.
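To make the partition-function bookkeeping concrete, here is a sketch that evaluates Z, the level populations, U, and S for a made-up two-level system (the level spacing and particle count are illustrative, not a real substance):

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def entropy_stat(T, g, eps, N):
    """Entropy of N independent particles from the partition function,
    S = U/T + N kB [ln(Z/N) + 1], using corrected Boltzmann statistics."""
    Z = sum(gj * math.exp(-ej / (kB * T)) for gj, ej in zip(g, eps))
    # equilibrium populations: expected particles in each level
    Nj = [N * gj * math.exp(-ej / (kB * T)) / Z for gj, ej in zip(g, eps)]
    U = sum(nj * ej for nj, ej in zip(Nj, eps))  # internal energy
    return U / T + N * kB * (math.log(Z / N) + 1)

# Illustrative two-level system: nondegenerate levels 1e-21 J apart,
# a single particle, at 300 K (numbers chosen purely for demonstration).
S = entropy_stat(300.0, g=[1, 1], eps=[0.0, 1.0e-21], N=1)
print(S)  # a few times 1e-23 J/K
```

The same function handles any finite level scheme; only the sums over j grow.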

How does the statistical microscopic derivation agree with the definition from classical thermodynamics? From S = kB ln wmp, the change in entropy is

dS = kB d(ln wmp)

and for a reversible process with N fixed, substituting ln wmp = N ln(Z/N) + N + U/(kBT) gives

dS = dU/T − (U/T²) dT + N kB d(ln Z)


where the equilibrium populations follow the Boltzmann distribution

Nj/N = gj e^(−εj/(kBT)) / Z

From the statistical thermodynamic definition of a reversible work process, work can be defined as

δWrev = −Σj Nj dεj

and the internal energy by

U = Σj Nj εj

such that

N kB d(ln Z) = (U/T²) dT + δWrev/T

By substituting d(ln Z) into dS above we arrive at

dS = (dU + δWrev)/T = δQrev/T

which matches the previous definition of entropy ∆S = ∆Q/T .
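This agreement can be checked numerically. In the sketch below, the level energies of an illustrative two-level system are held fixed, so no work is done (dεj = 0) and δQ = dU; a small temperature step should then change the statistical entropy by exactly dU/T:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

# Two-level system with made-up level energies (illustrative, not a real
# substance). Energies fixed → δW = 0 → δQ = dU, so the statistical
# entropy S = U/T + N kB [ln(Z/N) + 1] should change by dU/T.
eps = [0.0, 1.0e-21]  # J
N = 1.0

def U_and_S(T):
    Z = sum(math.exp(-e / (kB * T)) for e in eps)              # partition function
    U = N * sum(e * math.exp(-e / (kB * T)) for e in eps) / Z  # internal energy
    S = U / T + N * kB * (math.log(Z / N) + 1)                 # statistical entropy
    return U, S

T, dT = 300.0, 0.1
U1, S1 = U_and_S(T)
U2, S2 = U_and_S(T + dT)
dS = S2 - S1
dQ_over_T = (U2 - U1) / (T + 0.5 * dT)  # δQ/T at the midpoint temperature
print(dS, dQ_over_T)  # the two agree to within the discretization error
```

The finite-difference dS and δQ/T match to many digits, which is the classical and statistical definitions agreeing on a concrete system.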


References

[1] Rudolf Clausius, On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat. Poggendorff’s Annalen der Physik. 1850.

[2] Rudolf Clausius, Ueber eine veränderte Form des zweiten Hauptsatzes der mechanischen Wärmetheorie. Annalen der Physik und Chemie 93(12): 481–506. 1854.

[3] Rudolf Clausius, On the Application of the Mechanical Theory of Heat to the Steam-Engine. 1856.

[4] Rudolf Clausius, The Mechanical Theory of Heat. 1865.

[5] Milivoje M. Kostic, The Elusive Nature of Entropy and Its Physical Meaning. Entropy, Vol. 16(2): 953–967. 2014.

[6] Frank Lambert, entropysite. [Online] http://entropysite.oxy.edu/entropy_is_simple/index.html#microstate.

[7] Frank Lambert, A Description of Microstates. [Online] http://entropysite.oxy.edu/microstate/.

Washington State University