Third Law of Thermodynamics

The third law of thermodynamics states as follows, regarding the properties of closed systems in thermodynamic equilibrium:

The entropy of a system approaches a constant value as its temperature approaches absolute zero.

This constant value cannot depend on any other parameters characterizing the closed system, such as pressure or applied magnetic field. At absolute zero (zero kelvin) the system must be in a state with the minimum possible energy. Entropy is related to the number of accessible microstates, and there is typically one unique state (called the ground state) with minimum energy.[1] In such a case, the entropy at absolute zero will be exactly zero. If the system does not have a well-defined order (if its order is glassy, for example), then there may remain some finite entropy as the system is brought to very low temperatures, either because the system becomes locked into a configuration with non-minimal energy or because the minimum energy state is non-unique. The constant value is called the residual entropy of the system.[2] Because entropy is a state function, its well-defined value near 0 K serves as a reference point from which the absolute entropies of substances can be determined. The Nernst-Simon statement of the third law of thermodynamics concerns thermodynamic processes at a fixed, low temperature:

The entropy change associated with any condensed system undergoing a reversible isothermal process approaches zero as the temperature at which it is performed approaches 0 K.

Here a condensed system refers to liquids and solids. A classical formulation by Nernst (actually a consequence of the Third Law) is:

It is impossible for any process, no matter how idealized, to reduce the entropy of a system to its absolute-zero value in a finite number of operations.[3]

There also exists a formulation of the third law which approaches the subject by postulating a specific energy behavior:

If the composite of two thermodynamic systems constitutes an isolated system, then any energy exchange in any form between those two systems is bounded.[4]

History

The third law was developed by chemist Walther Nernst during the years 1906-12, and is therefore often referred to as Nernst's theorem or Nernst's postulate. The third law of thermodynamics states that the entropy of a system at absolute zero is a well-defined constant. This is because a system at zero temperature exists in its ground state, so that its entropy is determined only by the degeneracy of the ground state.

In 1912 Nernst stated the law thus: "It is impossible for any procedure to lead to the isotherm T = 0 in a finite number of steps."[5]

An alternative version of the third law of thermodynamics as stated by Gilbert N. Lewis and Merle Randall in 1923:

If the entropy of each element in some (perfect) crystalline state be taken as zero at the absolute zero of temperature, every substance has a finite positive entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances.

This version states not only that ΔS will reach zero at 0 K, but that S itself will also reach zero, as long as the crystal has a ground state with only one configuration. Some crystals form defects which cause a residual entropy. This residual entropy disappears when the kinetic barriers to transitioning to one ground state are overcome.[6]

With the development of statistical mechanics, the third law of thermodynamics (like the other laws) changed from a fundamental law (justified by experiments) to a derived law (derived from even more basic laws). The basic law from which it is primarily derived is the statistical-mechanics definition of entropy for a large system:

${\displaystyle S-S_{0}=k_{\text{B}}\ln \,\Omega }$

where S is the entropy, kB is the Boltzmann constant, and Ω is the number of microstates consistent with the macroscopic configuration. The counting of states is from the reference state of absolute zero, which corresponds to the entropy S0.

Explanation

In simple terms, the third law states that the entropy of a perfect crystal of a pure substance approaches zero as the temperature approaches zero. The alignment of a perfect crystal leaves no ambiguity as to the location and orientation of each part of the crystal. As the energy of the crystal is reduced, the vibrations of the individual atoms are reduced to nothing, and the crystal becomes the same everywhere.

a) Single possible configuration for a system at absolute zero, i.e., only one microstate is accessible. Thus S = k ln W = 0. b) At temperatures greater than absolute zero, multiple microstates are accessible due to atomic vibration (exaggerated in the figure). Since the number of accessible microstates is greater than 1, S = k ln W > 0.

The third law provides an absolute reference point for the determination of entropy at any other temperature. The entropy of a closed system, determined relative to this zero point, is then the absolute entropy of that system. Mathematically, the absolute entropy of any system at zero temperature is the natural log of the number of ground states times the Boltzmann constant kB = 1.38 × 10^-23 J K^-1.

The entropy of a perfect crystal lattice as defined by Nernst's theorem is zero provided that its ground state is unique, because ln(1) = 0. If the system is composed of one billion atoms, all alike and lying within the matrix of a perfect crystal, the number of combinations of one billion identical things taken one billion at a time is Ω = 1. Hence:

${\displaystyle S-S_{0}=k_{\text{B}}\ln \Omega =k_{\text{B}}\ln {1}=0}$

The difference is zero, so the initial entropy S0 can be any selected value, as long as all other such calculations use the same value. As a result, the value S0 = 0 is selected for convenience.

${\displaystyle S-S_{0}=S-0=0}$
${\displaystyle S=0}$

Example: Entropy change of a crystal lattice heated by an incoming photon

Consider a system consisting of a crystal lattice with volume V containing N identical atoms at T = 0 K, and an incoming photon of wavelength λ and energy ε.

Initially, there is only one accessible microstate:

${\displaystyle S_{0}=k_{\text{B}}\ln \Omega =k_{\text{B}}\ln {1}=0.}$

Let's assume the crystal lattice absorbs the incoming photon. A unique atom in the lattice interacts with and absorbs this photon. So after absorption, there are N possible microstates accessible by the system, each corresponding to one excited atom while the other atoms remain in the ground state.

The entropy, energy, and temperature of the closed system rise and can be calculated. The entropy change is:

${\displaystyle \Delta S=S-S_{0}=k_{\text{B}}\ln {\Omega }}$

From the second law of thermodynamics:

${\displaystyle \Delta S=S-S_{0}={\frac {\delta Q}{T}}}$

Hence:

${\displaystyle \Delta S=S-S_{0}=k_{\text{B}}\ln(\Omega )={\frac {\delta Q}{T}}}$

Calculating entropy change:

${\displaystyle S-0=k_{\text{B}}\ln {N}=1.38\times 10^{-23}\times \ln {\left(3\times 10^{22}\right)}=70\times 10^{-23}\,\mathrm {J\,K^{-1}} }$

We assume N = 3 × 10^22 and λ = 0.01 m. The energy change of the system as a result of absorbing the single photon whose energy is ε:

${\displaystyle \delta Q=\varepsilon ={\frac {hc}{\lambda }}={\frac {6.62\times 10^{-34}\,\mathrm {J\cdot s} \times 3\times 10^{8}\,\mathrm {m\,s^{-1}} }{0.01\,\mathrm {m} }}=2\times 10^{-23}\,\mathrm {J} }$

The temperature of the closed system rises by:

${\displaystyle T={\frac {\varepsilon }{\Delta S}}={\frac {2\times 10^{-23}\,\mathrm {J} }{70\times 10^{-23}\,\mathrm {J\,K^{-1}} }}=0.02857\,\mathrm {K} }$

This can be interpreted as the average temperature of the system over the range from 0 to 0.02857 K.[7] A single atom was assumed to absorb the photon, but the temperature and entropy change characterize the entire system.
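The arithmetic of this example can be reproduced with the standard values of the physical constants; a quick check (the text above rounds intermediate results, so the last digits differ slightly):

```python
import math

# Physical constants (SI units)
k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s

N = 3e22             # number of atoms in the lattice
wavelength = 0.01    # photon wavelength, m (as in the example)

# Entropy change: N accessible microstates after absorption
delta_S = k_B * math.log(N)    # ~7.1e-22 J/K (rounded above to 70e-23)

# Energy delivered by the single photon
delta_Q = h * c / wavelength   # ~2e-23 J

# Average temperature over the heating range
T = delta_Q / delta_S          # ~0.028 K

print(f"dS = {delta_S:.3g} J/K, dQ = {delta_Q:.3g} J, T = {T:.3g} K")
```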

Systems with non-zero entropy at absolute zero

An example of a system which does not have a unique ground state is one whose net spin is a half-integer, for which time-reversal symmetry gives two degenerate ground states. For such systems, the entropy at zero temperature is at least kB ln(2) (which is negligible on a macroscopic scale). Some crystalline systems exhibit geometrical frustration, where the structure of the crystal lattice prevents the emergence of a unique ground state. Ground-state helium (unless under pressure) remains liquid.

In addition, glasses and solid solutions retain large entropy at 0 K, because they are large collections of nearly degenerate states, in which they become trapped out of equilibrium. Another example of a solid with many nearly-degenerate ground states, trapped out of equilibrium, is ice Ih, which has "proton disorder".
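For ice Ih, the proton-disorder residual entropy can be estimated with Pauling's classic counting argument, which gives R ln(3/2) per mole. This figure is a standard textbook value, not something stated above, and it agrees with calorimetric measurements to within a few percent:

```python
import math

R = 8.314462618  # molar gas constant, J/(mol*K)

# Pauling's estimate: each water molecule can orient its two protons
# among four bond directions in 6 ways, but the ice rules allow only
# 1/4 of these to be compatible with the neighbours, leaving an
# effective 6/4 = 3/2 configurations per molecule.
S_residual = R * math.log(3 / 2)

print(f"Residual entropy of ice Ih ~ {S_residual:.2f} J/(mol K)")
```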

For the entropy at absolute zero to be zero, the magnetic moments of a perfectly ordered crystal must themselves be perfectly ordered; from an entropic perspective, this can be considered to be part of the definition of a "perfect crystal". Only ferromagnetic, antiferromagnetic, and diamagnetic materials can satisfy this condition. However, ferromagnetic materials do not, in fact, have zero entropy at zero temperature, because the spins of the unpaired electrons are all aligned and this gives a ground-state spin degeneracy. Materials that remain paramagnetic at 0 K, by contrast, may have many nearly-degenerate ground states (for example, in a spin glass), or may retain dynamic disorder (a quantum spin liquid).

Consequences

Fig. 1 Left side: Absolute zero can be reached in a finite number of steps if S(0, X1) ≠ S(0, X2). Right: An infinite number of steps is needed, since S(0, X1) = S(0, X2).

Absolute zero

The third law is equivalent to the statement that

It is impossible by any procedure, no matter how idealized, to reduce the temperature of any closed system to zero temperature in a finite number of finite operations.[8]

The reason that T = 0 cannot be reached according to the third law is explained as follows: Suppose that the temperature of a substance can be reduced in an isentropic process by changing the parameter X from X2 to X1. One can think of a multistage nuclear demagnetization setup where a magnetic field is switched on and off in a controlled way.[9] If there were an entropy difference at absolute zero, T = 0 could be reached in a finite number of steps. However, at T = 0 there is no entropy difference so an infinite number of steps would be needed. The process is illustrated in Fig. 1.
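The argument can be illustrated with a toy model (an assumption for illustration, not a real material): take entropy curves linear in temperature, S(T, X) = S0 + a(X)T, sharing the same S0 at T = 0 as the third law requires. Each isentropic demagnetization step then multiplies the temperature by the fixed ratio a(X1)/a(X2) < 1, so the temperature decreases geometrically but never reaches zero:

```python
# Toy model of multistage demagnetization cooling, assuming linear
# entropy curves S(T, X) = S0 + a(X) * T with the same S0 for both
# parameter values (as the third law requires).
a1, a2 = 0.5, 1.0   # a(X1) < a(X2): the high-field curve lies lower

T = 1.0             # starting temperature (arbitrary units)
for step in range(100):
    # Isentropic step X1 -> X2 at constant entropy: a1 * T = a2 * T_new
    T *= a1 / a2

print(f"After 100 cycles T = {T:.3e}, still greater than zero")
```

If instead S0(X1) < S0(X2), the entropy curves would intersect the T = 0 axis at different heights and the staircase of steps would terminate after finitely many cycles, which is exactly what the third law forbids.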

Specific heat

A non-quantitative description of his third law that Nernst gave at the very beginning was simply that the specific heat can always be made zero by cooling the material down far enough.[10] A modern, quantitative analysis follows.

Suppose that the heat capacity of a sample in the low-temperature region has the form of a power law ${\displaystyle C(T,X)=C_{0}T^{\alpha }}$ asymptotically as T → 0, and we wish to find which values of α are compatible with the third law. We have

${\displaystyle \Delta S=\int _{T_{0}}^{T}{\frac {C(T',X)}{T'}}\,dT'={\frac {C_{0}}{\alpha }}\left(T^{\alpha }-T_{0}^{\alpha }\right)}$

By the discussion of the third law above, this integral must be bounded as T0 → 0, which is only possible if α > 0. So the heat capacity must go to zero at absolute zero,

${\displaystyle \lim _{T\to 0}C(T,X)=0}$

if it has the form of a power law. The same argument shows that it cannot be bounded below by a positive constant, even if we drop the power-law assumption.
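The boundedness claim is easy to check numerically: for a power-law heat capacity C = C0 T^α the integral from T0 to T stays finite as T0 → 0 whenever α > 0, while a constant heat capacity (α = 0) makes it grow without bound. A quick sketch using the closed-form antiderivatives:

```python
import math

def entropy_change(alpha, T0, T=1.0, C0=1.0):
    """Evaluate dS = integral of C0 * t**alpha / t from T0 to T."""
    if alpha == 0:
        return C0 * math.log(T / T0)              # diverges as T0 -> 0
    return (C0 / alpha) * (T**alpha - T0**alpha)  # bounded by C0/alpha

for T0 in (1e-3, 1e-6, 1e-9):
    print(f"T0={T0:.0e}:  alpha=1 -> {entropy_change(1, T0):.4f},"
          f"  alpha=0 -> {entropy_change(0, T0):.2f}")
```

The α = 1 column converges to C0/α = 1 while the α = 0 column grows like ln(1/T0).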

On the other hand, the molar specific heat at constant volume of a monatomic classical ideal gas, such as helium at room temperature, is given by CV = (3/2)R, with R the molar ideal gas constant. But a constant heat capacity clearly does not satisfy this condition: a gas with a constant heat capacity all the way to absolute zero violates the third law of thermodynamics. We can verify this more fundamentally by substituting CV into the entropy integral above, which yields

${\displaystyle \Delta S=\int _{T_{0}}^{T}{\frac {C_{V}}{T'}}\,dT'={\frac {3}{2}}R\ln {\frac {T}{T_{0}}}}$

In the limit T0 → 0 this expression diverges, again contradicting the third law of thermodynamics.

The conflict is resolved as follows: At a certain temperature the quantum nature of matter starts to dominate the behavior. Fermi particles follow Fermi-Dirac statistics and Bose particles follow Bose-Einstein statistics. In both cases the heat capacity at low temperatures is no longer temperature independent, even for ideal gases. For Fermi gases

${\displaystyle C_{V}={\frac {\pi ^{2}}{2}}R{\frac {T}{T_{F}}}}$

with the Fermi temperature TF given by

${\displaystyle T_{F}={\frac {\hbar ^{2}N_{A}}{2Mk_{\text{B}}}}\left({\frac {3\pi ^{2}N_{A}}{V_{m}}}\right)^{2/3}}$

Here NA is the Avogadro constant, Vm the molar volume, and M the molar mass.

For Bose gases

${\displaystyle C_{V}\approx 1.93\,R\left({\frac {T}{T_{B}}}\right)^{3/2}}$

with TB given by

${\displaystyle T_{B}={\frac {2\pi \hbar ^{2}N_{A}}{Mk_{\text{B}}}}\left({\frac {N_{A}}{2.612\,V_{m}}}\right)^{2/3}}$

Both quantum specific heats satisfy the boundedness condition above: they are power laws with α = 1 and α = 3/2 respectively, and so go to zero at absolute zero.

Even within a purely classical setting, the density of a classical ideal gas held at fixed pressure becomes arbitrarily high as T goes to zero (since V = nRT/p → 0), so the interparticle spacing goes to zero. The assumption of non-interacting particles presumably breaks down when they are sufficiently close together, so the value of CV gets modified away from its ideal constant value.

Vapor pressure

The only liquids near absolute zero are 3He and 4He. Their heat of evaporation has a limiting value given by

${\displaystyle L=L_{0}+C_{p}T}$

with L0 and Cp constant. If we consider a container partly filled with liquid and partly with gas, the entropy of the liquid-gas mixture is

${\displaystyle S=S_{l}(T)+x\left({\frac {L_{0}}{T}}+C_{p}\right)}$

where Sl(T) is the entropy of the liquid and x is the gas fraction. Clearly the entropy change during the liquid-gas transition (x from 0 to 1) diverges in the limit of T → 0, violating the third law's requirement that the entropy at T = 0 be independent of such parameters. Nature resolves this paradox as follows: at temperatures below about 50 mK the vapor pressure is so low that the gas density is lower than the best vacuum in the universe. In other words, below 50 mK there is simply no gas above the liquid.

Latent heat of melting

The melting curves of 3He and 4He both extend down to absolute zero at finite pressure. At the melting pressure, liquid and solid are in equilibrium. The third law demands that the entropies of the solid and liquid are equal at T = 0. As a result, the latent heat of melting is zero and the slope of the melting curve extrapolates to zero as a result of the Clausius-Clapeyron equation.
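The vanishing slope follows from the Clausius-Clapeyron equation for the melting curve; with equal molar entropies of liquid and solid at T = 0, the numerator vanishes:

${\displaystyle {\frac {dp_{\text{melting}}}{dT}}={\frac {S_{\text{liquid}}-S_{\text{solid}}}{V_{\text{liquid}}-V_{\text{solid}}}}\to 0\quad {\text{as }}T\to 0}$

(The molar volumes of liquid and solid helium remain different at T = 0, so the denominator stays finite.)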

Thermal expansion coefficient

The thermal expansion coefficient is defined as

${\displaystyle \alpha _{V}={\frac {1}{V_{m}}}\left({\frac {\partial V_{m}}{\partial T}}\right)_{p}}$

With the Maxwell relation

${\displaystyle \left({\frac {\partial V_{m}}{\partial T}}\right)_{p}=-\left({\frac {\partial S_{m}}{\partial p}}\right)_{T}}$

and the fact that the entropy at T = 0 does not depend on pressure, it follows that

${\displaystyle \lim _{T\to 0}\alpha _{V}=-{\frac {1}{V_{m}}}\lim _{T\to 0}\left({\frac {\partial S_{m}}{\partial p}}\right)_{T}=0}$

So the thermal expansion coefficient of all materials must go to zero at zero kelvin.

References

1. ^ J. Wilks The Third Law of Thermodynamics Oxford University Press (1961).[page needed]
2. ^ Kittel and Kroemer, Thermal Physics (2nd ed.), page 49.
3. ^ Wilks, J. (1971). The Third Law of Thermodynamics, Chapter 6 in Thermodynamics, volume 1, ed. W. Jost, of H. Eyring, D. Henderson, W. Jost, Physical Chemistry. An Advanced Treatise, Academic Press, New York, page 477.
4. ^ Heidrich, M. (2016). "Bounded energy exchange as an alternative to the third law of thermodynamics". Annals of Physics. 373: 665-681. Bibcode:2016AnPhy.373..665H. doi:10.1016/j.aop.2016.07.031.
5. ^ Bailyn, M. (1994). A Survey of Thermodynamics, American Institute of Physics, New York, ISBN 0-88318-797-3, page 342.
6. ^ Kozliak, Evguenii; Lambert, Frank L. (2008). "Residual Entropy, the Third Law and Latent Heat". Entropy. 10 (3): 274-84. Bibcode:2008Entrp..10..274K. doi:10.3390/e10030274.
7. ^ Reynolds and Perkins (1977). Engineering Thermodynamics. McGraw Hill. pp. 438. ISBN 978-0-07-052046-2.
8. ^ Guggenheim, E.A. (1967). Thermodynamics. An Advanced Treatment for Chemists and Physicists, fifth revised edition, North-Holland Publishing Company, Amsterdam, page 157.
9. ^ F. Pobell, Matter and Methods at Low Temperatures, (Springer-Verlag, Berlin, 2007)[page needed]
10. ^ Einstein and the Quantum, A. Douglas Stone, Princeton University Press, 2013.