A thermodynamic free entropy is an entropic thermodynamic potential analogous to the free energy. It is also known as a Massieu, Planck, or Massieu–Planck potential (or function), or (rarely) as free information. In statistical mechanics, free entropies frequently appear as the logarithm of a partition function. The Onsager reciprocal relations, in particular, are developed in terms of entropic potentials. In mathematics, free entropy means something quite different: it is a generalization of entropy defined in the subject of free probability. A free entropy is generated by a Legendre transformation of the entropy. The different potentials correspond to different constraints to which the system may be subjected.
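The connection to the logarithm of a partition function can be illustrated with a minimal numerical sketch. The example below assumes a two-level system with an arbitrarily chosen energy gap; it computes the canonical probabilities, internal energy, and Gibbs entropy with NumPy and checks that [math]\displaystyle{ S - U/T = k \ln Z }[/math], which is the Massieu potential defined below.

```python
import numpy as np

k = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                        # temperature, K (illustrative value)
E = np.array([0.0, 1.0e-21])     # two-level energies, J (assumed example)

w = np.exp(-E / (k * T))         # Boltzmann factors
Z = w.sum()                      # canonical partition function
p = w / Z                        # occupation probabilities
U = (p * E).sum()                # internal energy
S = -k * (p * np.log(p)).sum()   # Gibbs entropy

Phi = S - U / T                  # Massieu potential (Helmholtz free entropy)
print(np.isclose(Phi, k * np.log(Z)))   # True: Phi = k ln Z = -A/T
```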
1. Examples
The most common examples are:
| Name | Function | Alt. function | Natural variables |
|------|----------|---------------|-------------------|
| Entropy | [math]\displaystyle{ S = \frac {1}{T} U + \frac {P}{T} V - \sum_{i=1}^s \frac {\mu_i}{T} N_i \, }[/math] | | [math]\displaystyle{ U,V,\{N_i\}\, }[/math] |
| Massieu potential / Helmholtz free entropy | [math]\displaystyle{ \Phi = S - \frac{1}{T} U }[/math] | [math]\displaystyle{ = - \frac {A}{T} }[/math] | [math]\displaystyle{ \frac {1}{T},V,\{N_i\}\, }[/math] |
| Planck potential / Gibbs free entropy | [math]\displaystyle{ \Xi = \Phi - \frac{P}{T} V }[/math] | [math]\displaystyle{ = - \frac{G}{T} }[/math] | [math]\displaystyle{ \frac{1}{T},\frac{P}{T},\{N_i\}\, }[/math] |
where

- [math]\displaystyle{ S }[/math] is entropy
- [math]\displaystyle{ \Phi }[/math] is the Massieu potential[1][2]
- [math]\displaystyle{ \Xi }[/math] is the Planck potential[1]
- [math]\displaystyle{ U }[/math] is internal energy
- [math]\displaystyle{ T }[/math] is temperature
- [math]\displaystyle{ P }[/math] is pressure
- [math]\displaystyle{ V }[/math] is volume
- [math]\displaystyle{ A }[/math] is Helmholtz free energy
- [math]\displaystyle{ G }[/math] is Gibbs free energy
- [math]\displaystyle{ N_i }[/math] is the number of particles (or number of moles) of the [math]\displaystyle{ i }[/math]-th chemical component
- [math]\displaystyle{ \mu_i }[/math] is the chemical potential of the [math]\displaystyle{ i }[/math]-th chemical component
- [math]\displaystyle{ s }[/math] is the total number of components
- [math]\displaystyle{ i }[/math] is the index labeling the chemical components
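As a quick symbolic check of the table, the following sketch (a minimal SymPy example; the variable names simply mirror the symbols above) verifies that the two entropic potentials reduce to the alternative forms [math]\displaystyle{ -A/T }[/math] and [math]\displaystyle{ -G/T }[/math] when [math]\displaystyle{ A = U - TS }[/math] and [math]\displaystyle{ G = A + PV }[/math]:

```python
import sympy as sp

S, U, T, P, V = sp.symbols('S U T P V', positive=True)

A = U - T*S            # Helmholtz free energy
G = A + P*V            # Gibbs free energy

Phi = S - U/T          # Massieu potential (second table row)
Xi = Phi - P*V/T       # Planck potential (third table row)

print(sp.simplify(Phi + A/T))   # 0  ->  Phi = -A/T
print(sp.simplify(Xi + G/T))    # 0  ->  Xi  = -G/T
```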
Note that the use of the terms "Massieu" and "Planck" for explicit Massieu–Planck potentials is somewhat obscure and ambiguous; in particular, "Planck potential" has alternative meanings. The most standard notation for an entropic potential is [math]\displaystyle{ \psi }[/math], used by both Planck and Schrödinger. (Note that Gibbs used [math]\displaystyle{ \psi }[/math] to denote the free energy.) Free entropies were introduced by the French engineer François Massieu in 1869 and actually predate Gibbs's free energy (1875).
2. Dependence of the Potentials on the Natural Variables
2.1. Entropy
- [math]\displaystyle{ S = S(U,V,\{N_i\}) }[/math]
By the definition of a total differential,
- [math]\displaystyle{ d S = \frac {\partial S} {\partial U} d U + \frac {\partial S} {\partial V} d V + \sum_{i=1}^s \frac {\partial S} {\partial N_i} d N_i. }[/math]
From the equations of state,
- [math]\displaystyle{ d S = \frac{1}{T}dU+\frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i . }[/math]
The differentials in the above equation are all of extensive variables, and their coefficients [math]\displaystyle{ 1/T }[/math], [math]\displaystyle{ P/T }[/math] and [math]\displaystyle{ \mu_i/T }[/math] are intensive, so by Euler's theorem for homogeneous functions the equation may be integrated to yield
- [math]\displaystyle{ S = \frac{U}{T}+\frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right). }[/math]
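The integrated form above is an instance of Euler's theorem for the first-order homogeneous (extensive) function [math]\displaystyle{ S(U,V,\{N_i\}) }[/math], and it can be checked symbolically for a concrete model. The sketch below assumes a monatomic ideal gas described by the Sackur–Tetrode entropy; SymPy and this specific functional form are illustrative choices, not part of the general argument.

```python
import sympy as sp

U, V, N, m, h, k = sp.symbols('U V N m h k', positive=True)

# Sackur-Tetrode entropy of a monatomic ideal gas (assumed example form)
S = N*k*(sp.log(V/N * (4*sp.pi*m*U/(3*N*h**2))**sp.Rational(3, 2))
         + sp.Rational(5, 2))

# Euler's theorem for the first-order homogeneous function S(U, V, N):
#   U*(dS/dU) + V*(dS/dV) + N*(dS/dN) = S,
# which, with dS/dU = 1/T, dS/dV = P/T, dS/dN = -mu/T, is the relation above.
euler = U*sp.diff(S, U) + V*sp.diff(S, V) + N*sp.diff(S, N)
print(sp.simplify(euler - S))   # 0
```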
2.2. Massieu Potential / Helmholtz Free Entropy
- [math]\displaystyle{ \Phi = S - \frac {U}{T} }[/math]
- [math]\displaystyle{ \Phi = \frac{U}{T}+\frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) - \frac {U}{T} }[/math]
- [math]\displaystyle{ \Phi = \frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) }[/math]
Starting again from the definition of [math]\displaystyle{ \Phi }[/math] and taking the total differential, we have via a Legendre transform (and the product rule)
- [math]\displaystyle{ d \Phi = d S - \frac {1} {T} dU - U d \frac {1} {T} , }[/math]
- [math]\displaystyle{ d \Phi = \frac{1}{T}dU + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac {1} {T} dU - U d \frac {1} {T}, }[/math]
- [math]\displaystyle{ d \Phi = - U d \frac {1} {T}+\frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i. }[/math]
The above differentials are not all of extensive variables, so the equation may not be directly integrated. From [math]\displaystyle{ d \Phi }[/math] we see that
- [math]\displaystyle{ \Phi = \Phi(\frac {1}{T},V, \{N_i\}) . }[/math]
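To illustrate these natural variables, the sketch below evaluates the Massieu potential of a monatomic ideal gas written in terms of [math]\displaystyle{ 1/T }[/math], [math]\displaystyle{ V }[/math] and [math]\displaystyle{ N }[/math]. The explicit form is an assumed example, obtained from [math]\displaystyle{ \Phi = -A/T }[/math] with the standard ideal-gas Helmholtz free energy; the checks confirm the coefficients of [math]\displaystyle{ d(1/T) }[/math] and [math]\displaystyle{ dV }[/math] in the differential above.

```python
import sympy as sp

x, V, N, m, h, k = sp.symbols('x V N m h k', positive=True)   # x stands for 1/T

# Massieu potential of a monatomic ideal gas in its natural variables
# (assumed example form, Phi = -A/T with A = -N k T [ln(...) + 1])
Phi = N*k*(sp.log(V/N * (2*sp.pi*m*k/(h**2*x))**sp.Rational(3, 2)) + 1)

U = sp.Rational(3, 2)*N*k/x      # internal energy, U = (3/2) N k T
P_over_T = N*k/V                 # P/T from the ideal-gas law P V = N k T

print(sp.simplify(sp.diff(Phi, x) + U))           # 0 -> dPhi/d(1/T) = -U
print(sp.simplify(sp.diff(Phi, V) - P_over_T))    # 0 -> dPhi/dV = P/T
```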
If reciprocal variables are not desired,[3]:222
- [math]\displaystyle{ d \Phi = d S - \frac {T d U - U d T} {T^2} , }[/math]
- [math]\displaystyle{ d \Phi = d S - \frac {1} {T} d U + \frac {U} {T^2} d T , }[/math]
- [math]\displaystyle{ d \Phi = \frac{1}{T}dU + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac {1} {T} d U + \frac {U} {T^2} d T, }[/math]
- [math]\displaystyle{ d \Phi = \frac {U} {T^2} d T + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i , }[/math]
- [math]\displaystyle{ \Phi = \Phi(T,V,\{N_i\}) . }[/math]
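For the same assumed ideal-gas example, now written as a function of [math]\displaystyle{ T }[/math] instead of [math]\displaystyle{ 1/T }[/math], the coefficient of [math]\displaystyle{ dT }[/math] can be checked to be [math]\displaystyle{ U/T^2 }[/math]:

```python
import sympy as sp

T, V, N, m, h, k = sp.symbols('T V N m h k', positive=True)

# Same ideal-gas Massieu potential as above, expressed in (T, V, N)
Phi = N*k*(sp.log(V/N * (2*sp.pi*m*k*T/h**2)**sp.Rational(3, 2)) + 1)

U = sp.Rational(3, 2)*N*k*T      # internal energy of the monatomic gas

print(sp.simplify(sp.diff(Phi, T) - U/T**2))   # 0 -> dPhi/dT = U/T^2
```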
2.3. Planck Potential / Gibbs Free Entropy
- [math]\displaystyle{ \Xi = \Phi -\frac{P V}{T} }[/math]
- [math]\displaystyle{ \Xi = \frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) -\frac{P V}{T} }[/math]
- [math]\displaystyle{ \Xi = \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) }[/math]
Starting again from the definition of [math]\displaystyle{ \Xi }[/math] and taking the total differential, we have via a Legendre transform (and the product rule)
- [math]\displaystyle{ d \Xi = d \Phi - \frac{P}{T} d V - V d \frac{P}{T} }[/math]
- [math]\displaystyle{ d \Xi = - U d \frac {1} {T} + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac{P}{T} d V - V d \frac{P}{T} }[/math]
- [math]\displaystyle{ d \Xi = - U d \frac {1} {T} - V d \frac{P}{T} + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i. }[/math]
The above differentials are not all of extensive variables, so the equation may not be directly integrated. From [math]\displaystyle{ d \Xi }[/math] we see that
- [math]\displaystyle{ \Xi = \Xi \left(\frac {1}{T}, \frac {P}{T}, \{N_i\} \right) . }[/math]
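Again for the assumed monatomic ideal gas, the Planck potential can be written in its natural variables [math]\displaystyle{ (1/T, P/T, \{N_i\}) }[/math] and the coefficients in the differential above checked symbolically:

```python
import sympy as sp

x, y, N, m, h, k = sp.symbols('x y N m h k', positive=True)   # x = 1/T, y = P/T

# Planck potential of a monatomic ideal gas in its natural variables
# (assumed example form, from Xi = Phi - (P/T) V with V = N k / (P/T))
Xi = N*k*sp.log((k/y) * (2*sp.pi*m*k/(h**2*x))**sp.Rational(3, 2))

U = sp.Rational(3, 2)*N*k/x      # U = (3/2) N k T
V = N*k/y                        # V = N k T / P

print(sp.simplify(sp.diff(Xi, x) + U))   # 0 -> dXi/d(1/T) = -U
print(sp.simplify(sp.diff(Xi, y) + V))   # 0 -> dXi/d(P/T) = -V
```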
If reciprocal variables are not desired,[3]:222
- [math]\displaystyle{ d \Xi = d \Phi - \frac{T (P d V + V d P) - P V d T}{T^2} , }[/math]
- [math]\displaystyle{ d \Xi = d \Phi - \frac{P}{T} d V - \frac {V}{T} d P + \frac {P V}{T^2} d T , }[/math]
- [math]\displaystyle{ d \Xi = \frac {U} {T^2} d T + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac{P}{T} d V - \frac {V}{T} d P + \frac {P V}{T^2} d T , }[/math]
- [math]\displaystyle{ d \Xi = \frac {U + P V} {T^2} d T - \frac {V}{T} d P + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i , }[/math]
- [math]\displaystyle{ \Xi = \Xi(T,P,\{N_i\}) . }[/math]
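Finally, the same assumed example written as a function of [math]\displaystyle{ (T, P, N) }[/math] reproduces the coefficients [math]\displaystyle{ (U + PV)/T^2 }[/math] and [math]\displaystyle{ -V/T }[/math]:

```python
import sympy as sp

T, P, N, m, h, k = sp.symbols('T P N m h k', positive=True)

# Same ideal-gas Planck potential as above, expressed in (T, P, N)
# (assumed example form, Xi = -G/T for the monatomic ideal gas)
Xi = N*k*sp.log((k*T/P) * (2*sp.pi*m*k*T/h**2)**sp.Rational(3, 2))

U = sp.Rational(3, 2)*N*k*T      # internal energy
V = N*k*T/P                      # ideal-gas volume

print(sp.simplify(sp.diff(Xi, T) - (U + P*V)/T**2))   # 0 -> dXi/dT = (U + P V)/T^2
print(sp.simplify(sp.diff(Xi, P) + V/T))              # 0 -> dXi/dP = -V/T
```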