Boltzmann constant
The Boltzmann constant (kB or k), named after the Austrian physicist Ludwig Boltzmann, is a physical constant that relates the average relative kinetic energy of particles in a gas to the temperature of the gas.[4] It occurs in the definitions of the kelvin and the gas constant, in Planck's law of black-body radiation, and in Boltzmann's entropy formula. The Boltzmann constant has the dimension of energy divided by temperature, the same as entropy.
As part of the 2019 redefinition of SI base units, the Boltzmann constant is one of the seven "defining constants" that have been given exact definitions. They are used in various combinations to define the seven SI base units. The Boltzmann constant is defined to be exactly 1.380649×10−23 J/K.[5][6]
This definition allows the kelvin to be defined in terms of the Boltzmann constant, the metre, the second, and the kilogram. Before 2019, the value of the Boltzmann constant in SI units was a measured quantity, and the measurements depended on the definition of the kelvin in terms of the triple point of water. The measured values were used to fix the quantity used in the 2019 definition, so that the redefined kelvin agrees with the old one to within the limits of experimental accuracy at the time of the redefinition.
Bridge from macroscopic to microscopic physics
The Boltzmann constant, k, is a scaling factor between macroscopic (thermodynamic temperature) and microscopic (thermal energy) physics. Macroscopically, the ideal gas law states that, for an ideal gas, the product of pressure p and volume V is proportional to the product of amount of substance n (in moles) and absolute temperature T:

pV = nRT,

where R is the gas constant (8.31446261815324 J⋅K−1⋅mol−1).[7] Introducing the Boltzmann constant transforms the ideal gas law into an alternative form:

pV = NkT,

where N is the number of molecules of gas. For n = 1 mol, N is equal to the number of particles in one mole (the Avogadro number).
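The equivalence of the two forms can be checked numerically. The following is a minimal Python sketch (not part of the original article); the chosen values of n, T, and V are arbitrary illustrative inputs:

```python
# Check that pV = nRT and pV = NkT agree, with R = N_A * k.

k = 1.380649e-23        # Boltzmann constant, J/K (exact since 2019)
N_A = 6.02214076e23     # Avogadro constant, 1/mol (exact since 2019)
R = N_A * k             # molar gas constant, J/(mol*K)

n = 1.0                 # amount of substance, mol (assumed example value)
T = 273.15              # temperature, K
V = 0.0224              # volume, m^3 (roughly one mole of gas near STP)

p_macroscopic = n * R * T / V          # pV = nRT
p_microscopic = (n * N_A) * k * T / V  # pV = NkT, with N = n * N_A

print(f"R = {R:.11f} J/(mol*K)")       # ~8.31446261815 J/(mol*K)
print(f"p (macroscopic form): {p_macroscopic:.1f} Pa")
print(f"p (microscopic form): {p_microscopic:.1f} Pa")  # identical by construction
```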
Role in the equipartition of energy
Given a thermodynamic system at an absolute temperature T, the average thermal energy carried by each microscopic degree of freedom in the system is 1/2kT (i.e., about 2.07×10−21 J, or 0.013 eV, at room temperature).
In classical statistical mechanics, this average is predicted to hold exactly for homogeneous ideal gases. Monatomic ideal gases (the six noble gases) possess three degrees of freedom per atom, corresponding to the three spatial directions, which means a thermal energy of 3/2kT per atom. This corresponds very well with experimental data. The thermal energy can be used to calculate the root-mean-square speed of the atoms, which turns out to be inversely proportional to the square root of the atomic mass. The root mean square speeds found at room temperature accurately reflect this, ranging from 1370 m/s for helium, down to 240 m/s for xenon.
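The quoted speeds can be reproduced from the thermal energy, since (1/2)m⟨v²⟩ = (3/2)kT implies v_rms = sqrt(3kT/m). A minimal Python sketch (not part of the original article); the room temperature of 298 K and the standard atomic masses are assumed illustrative inputs:

```python
# Root-mean-square speeds of monatomic gases from v_rms = sqrt(3kT/m).
from math import sqrt

k = 1.380649e-23        # Boltzmann constant, J/K
u = 1.66053906660e-27   # atomic mass unit, kg
T = 298.0               # assumed room temperature, K

atomic_masses = {"He": 4.0026, "Ar": 39.948, "Xe": 131.29}  # in u

for gas, m_u in atomic_masses.items():
    m = m_u * u
    v_rms = sqrt(3 * k * T / m)
    print(f"{gas}: v_rms ≈ {v_rms:.0f} m/s")
# Helium comes out near 1360-1370 m/s and xenon near 240 m/s, matching the text.
```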
Kinetic theory gives the average pressure p for an ideal gas as

p = (1/3)(N/V) m ⟨v²⟩.

Combination with the ideal gas law

pV = NkT

shows that the average translational kinetic energy is

(1/2) m ⟨v²⟩ = (3/2) kT.

Considering that the translational motion velocity vector v has three degrees of freedom (one for each dimension), the average energy per degree of freedom is one third of that, i.e. (1/2)kT.
The ideal gas equation is also obeyed closely by molecular gases; but the form for the heat capacity is more complicated, because the molecules possess additional internal degrees of freedom, as well as the three degrees of freedom for movement of the molecule as a whole. Diatomic gases, for example, possess a total of six simple degrees of freedom per molecule that are related to atomic motion (three translational, two rotational, and one vibrational). At lower temperatures, not all these degrees of freedom may fully participate in the gas heat capacity, due to quantum mechanical limits on the availability of excited states at the relevant thermal energy per molecule.
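The classical equipartition estimate of the molar heat capacity is C_V = (f/2)R, where f counts the active quadratic degrees of freedom. The following is a minimal Python sketch (not part of the original article); the mode counting is the standard textbook assumption that each vibrational mode contributes two quadratic terms (kinetic and potential):

```python
# Classical equipartition estimate of molar heat capacity: C_V = (f/2) * R.

k = 1.380649e-23        # Boltzmann constant, J/K
N_A = 6.02214076e23     # Avogadro constant, 1/mol
R = N_A * k             # molar gas constant, J/(mol*K)

cases = {
    "monatomic (3 translational)": 3,
    "diatomic, room temperature (3 translational + 2 rotational)": 5,
    "diatomic, vibration fully excited (adds kinetic + potential terms)": 7,
}
for label, f in cases.items():
    print(f"{label}: C_V ≈ {f / 2 * R:.2f} J/(mol*K)")
# At low temperature, quantum effects freeze out rotational and vibrational modes,
# so measured heat capacities fall below these classical values.
```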
Role in Boltzmann factors
More generally, systems in equilibrium at temperature T have probability Pi of occupying a state i with energy Ei weighted by the corresponding Boltzmann factor:

Pi = exp(−Ei / kT) / Z,

where Z is the partition function. Again, it is the energy-like quantity kT that takes central importance.
Consequences of this include (in addition to the results for ideal gases above) the Arrhenius equation in chemical kinetics.
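As an illustration, the occupation probabilities of a hypothetical two-level system can be computed from the Boltzmann factors. A minimal Python sketch (not part of the original article); the 0.1 eV energy gap and 300 K temperature are assumed example values:

```python
# Boltzmann-factor weighting for a two-level system with energy gap dE.
from math import exp

k = 1.380649e-23          # Boltzmann constant, J/K
eV = 1.602176634e-19      # J per electronvolt
T = 300.0                 # K
dE = 0.1 * eV             # assumed energy gap of the excited state

# P_i = exp(-E_i / kT) / Z, with Z the partition function
weights = [exp(-0.0 / (k * T)), exp(-dE / (k * T))]   # ground state at E = 0
Z = sum(weights)
p_ground, p_excited = (w / Z for w in weights)

print(f"kT ≈ {k * T / eV * 1000:.1f} meV")            # ~25.9 meV at 300 K
print(f"P(ground)  ≈ {p_ground:.4f}")
print(f"P(excited) ≈ {p_excited:.4f}")                # small, since dE >> kT
```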
Role in the statistical definition of entropy
In statistical mechanics, the entropy S of an isolated system at thermodynamic equilibrium is defined as the natural logarithm of W, the number of distinct microscopic states available to the system given the macroscopic constraints (such as a fixed total energy E), multiplied by the Boltzmann constant:

S = k ln W.
This equation, which relates the microscopic details, or microstates, of the system (via W) to its macroscopic state (via the entropy S), is the central idea of statistical mechanics. Such is its importance that it is inscribed on Boltzmann's tombstone.
The constant of proportionality k serves to make the statistical mechanical entropy equal to the classical thermodynamic entropy of Clausius:

ΔS = ∫ dQ / T.
One could choose instead a rescaled dimensionless entropy in microscopic terms such that

S′ = ln W,   ΔS′ = ∫ dQ / (kT).
This is a more natural form and this rescaled entropy exactly corresponds to Shannon's subsequent information entropy.
The characteristic energy kT is thus the energy required to increase the rescaled entropy by one nat.
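A toy numerical illustration, as a minimal Python sketch (not part of the original article); the microstate count W is an arbitrary assumed value:

```python
# Statistical entropy S = k ln W for a toy system, and the rescaled entropy in nats.
from math import log

k = 1.380649e-23   # Boltzmann constant, J/K
W = 10**23         # assumed number of accessible microstates (illustrative only)

S = k * log(W)     # thermodynamic entropy, J/K
S_nats = log(W)    # rescaled dimensionless entropy, nats

print(f"S  = {S:.3e} J/K")
print(f"S' = {S_nats:.2f} nat")

# Increasing the rescaled entropy by one nat at temperature T requires energy kT:
T = 300.0
print(f"kT = {k * T:.3e} J per nat at {T} K")
```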
Role in semiconductor physics: the thermal voltage
In semiconductors, the Shockley diode equation—the relationship between the flow of electric current and the electrostatic potential across a p–n junction—depends on a characteristic voltage called the thermal voltage, denoted VT. The thermal voltage depends on absolute temperature T as

VT = kT / q,

where q is the magnitude of the electrical charge on the electron, with a value of 1.6021766208(98)×10−19 C.[2] Equivalently,

VT = RT / F,

where R is the gas constant and F is the Faraday constant. At room temperature (about 300 K), VT is approximately 25.85 mV.
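A minimal Python sketch (not part of the original article) of the thermal voltage and the ideal Shockley diode equation I = I_S (exp(V/VT) − 1); the saturation current I_S and the applied voltages are hypothetical example values:

```python
# Thermal voltage V_T = kT/q and the ideal Shockley diode equation.
from math import exp

k = 1.380649e-23          # Boltzmann constant, J/K
q = 1.602176634e-19       # elementary charge, C
T = 300.0                 # K

V_T = k * T / q           # ~25.85 mV at 300 K
print(f"V_T ≈ {V_T * 1000:.2f} mV")

I_S = 1e-12               # assumed saturation current, A (hypothetical diode)
for V in (0.3, 0.6, 0.7): # assumed forward voltages, V
    I = I_S * (exp(V / V_T) - 1.0)
    print(f"V = {V:.1f} V -> I ≈ {I:.3e} A")
```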
History
Although Boltzmann first linked entropy and probability in 1877, the relation was never expressed with a specific constant until Max Planck first introduced k, and gave a precise value for it (1.346×10−23 J/K, about 2.5% lower than today's figure), in his derivation of the law of black body radiation in 1900–1901.[12] Before 1900, equations involving Boltzmann factors were not written using the energies per molecule and the Boltzmann constant, but rather using a form of the gas constant R, and macroscopic energies for macroscopic quantities of the substance. The iconic terse form of the equation S = k ln W on Boltzmann's tombstone is in fact due to Planck, not Boltzmann. Planck actually introduced it in the same work as his eponymous h.[13]
In 1920, Planck wrote in his Nobel Prize lecture:[14]
This constant is often referred to as Boltzmann's constant, although, to my knowledge, Boltzmann himself never introduced it — a peculiar state of affairs, which can be explained by the fact that Boltzmann, as appears from his occasional utterances, never gave thought to the possibility of carrying out an exact measurement of the constant.
This "peculiar state of affairs" is illustrated by reference to one of the great scientific debates of the time. There was considerable disagreement in the second half of the nineteenth century as to whether atoms and molecules were real or whether they were simply a heuristic tool for solving problems. There was no agreement whether chemical molecules, as measured by atomic weights, were the same as physical molecules, as measured by kinetic theory. Planck's 1920 lecture continued:[14]
Nothing can better illustrate the positive and hectic pace of progress which the art of experimenters has made over the past twenty years, than the fact that since that time, not only one, but a great number of methods have been discovered for measuring the mass of a molecule with practically the same accuracy as that attained for a planet.
In 2017, the most accurate measurements of the Boltzmann constant were obtained by acoustic gas thermometry, which determines the speed of sound of a monatomic gas in a triaxial ellipsoid chamber using microwave and acoustic resonances.[15][16] This decade-long effort was undertaken with different techniques by several laboratories;[1] it is one of the cornerstones of the 2019 redefinition of SI base units. Based on these measurements, CODATA recommended 1.380649×10−23 J⋅K−1 as the final fixed value of the Boltzmann constant to be used for the International System of Units.[17]
Value in different units
Values of k | Units | Comments
---|---|---
1.380649×10−23 | J/K | SI by definition; J/K = m2⋅kg/(s2⋅K) in SI base units
8.617333262145×10−5 | eV/K | 1 electronvolt = 1.602176634×10−19 J; 1/k = 11604.518121550 K/eV
2.0836612(12)×1010 | Hz/K | 2014 CODATA value;[2] 1 Hz⋅h = 6.626070040(81)×10−34 J[2]
3.1668114(29)×10−6 | EH/K | EH = 2R∞hc = 4.359744650(54)×10−18 J[2] = 6.579683920729(33) Hz⋅h[2]
1.0 | atomic units | by definition
1.38064852(79)×10−16 | erg/K | CGS system; 1 erg = 1×10−7 J
3.2976230(30)×10−24 | cal/K | 1 steam table calorie = 4.1868 J
1.8320128(17)×10−24 | cal/°R | 1 degree Rankine = 5/9 K
5.6573016(51)×10−24 | ft lb/°R | 1 foot-pound force = 1.3558179483314004 J
0.69503476(63) | cm−1/K | 2010 CODATA value;[2] 1 cm−1⋅hc = 1.986445683(87)×10−23 J
0.0019872041(18) | kcal/(mol⋅K) | the molar gas constant R, often written kB in statistical mechanics; uses the thermochemical calorie = 4.184 J
0.0083144621(75) | kJ/(mol⋅K) | the molar gas constant R, often written kB in statistical mechanics
4.10 | pN⋅nm | kT in piconewton-nanometres at 24 °C; used in biophysics
−228.5991678(40) | dBW/(K⋅Hz) | in decibel watts; used in telecommunications (see Johnson–Nyquist noise)
1.442695041... | Sh | in shannons (logarithm base 2); used in information entropy (exact value 1/ln 2)
1 | nat | in nats (logarithm base e); used in information entropy (see § Planck units, below)
Since k is a physical constant of proportionality between temperature and energy, its numerical value depends on the choice of units for energy and temperature. The small numerical value of the Boltzmann constant in SI units means a change in temperature by 1 K only changes a particle's energy by a small amount. A change of 1 °C is defined to be the same as a change of 1 K. The characteristic energy kT is a term encountered in many physical relationships.
The Boltzmann constant sets up a relationship between wavelength and temperature (dividing hc/k by a wavelength gives a temperature), with one micrometre corresponding to 14387.770 K, and also a relationship between voltage and temperature (dividing the corresponding energy per elementary charge, in eV, by k in units of eV/K gives a temperature), with one volt corresponding to 11604.519 K. The ratio of these two temperatures, 14387.770 K / 11604.519 K ≈ 1.239842, is the numerical value of hc in units of eV⋅μm.
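These conversions can be reproduced numerically. A minimal Python sketch (not part of the original article), using only the exactly defined SI constants:

```python
# Wavelength-temperature and voltage-temperature relationships set by k.
h = 6.62607015e-34        # Planck constant, J*s (exact)
c = 299792458.0           # speed of light, m/s (exact)
k = 1.380649e-23          # Boltzmann constant, J/K (exact)
q = 1.602176634e-19       # elementary charge, C (exact)

wavelength = 1e-6                         # 1 micrometre
T_from_wavelength = h * c / (k * wavelength)
print(f"1 um  <-> {T_from_wavelength:.3f} K")   # ~14387.770 K

voltage = 1.0                             # 1 volt
T_from_voltage = q * voltage / k
print(f"1 V   <-> {T_from_voltage:.3f} K")      # ~11604.518 K

print(f"ratio <-> {T_from_wavelength / T_from_voltage:.6f}  (= hc in eV*um)")
```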
Planck units
The Boltzmann constant provides a mapping from the characteristic microscopic energy E to the macroscopic temperature scale T = E/k. In fundamental physics research it is common to set k to unity, which yields the Planck units, or natural units, for temperature and energy. In this context, temperature is measured effectively in units of energy and the Boltzmann constant is not explicitly needed.[18]
The equipartition formula for the energy associated with each classical degree of freedom then becomes

E = (1/2) T per degree of freedom.
The use of natural units simplifies many physical relationships; in this form the definition of thermodynamic entropy coincides with the form of information entropy:

S = − Σi Pi ln Pi,

where Pi is the probability of each microstate.
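A minimal Python sketch (not part of the original article) showing this coincidence; the microstate probabilities are assumed illustrative values:

```python
# With k set to 1 (natural units), thermodynamic entropy S = -sum_i P_i ln P_i
# coincides with the Shannon entropy in nats; multiplying by k recovers SI units.
from math import log

k = 1.380649e-23                     # J/K, used only to convert back to SI
P = [0.5, 0.25, 0.125, 0.125]        # assumed microstate probabilities (sum to 1)

S_nats = -sum(p * log(p) for p in P)       # entropy in natural units (nats)
print(f"S = {S_nats:.4f} nat")             # 1.2130 nat for these probabilities
print(f"S = {S_nats / log(2):.4f} Sh")     # the same entropy in shannons (bits)
print(f"S = {k * S_nats:.3e} J/K")         # conventional units, via the factor k
```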
The value chosen for a unit of the Planck temperature is that corresponding to the energy of the Planck mass or 1.416808(33)×1032 K.[2]
See also
CODATA 2018
Thermodynamic beta