The aim of statistical mechanics is the derivation of the laws of classical thermodynamics
for macroscopic systems from the properties of their atomic particles.
In addition to classical thermodynamics, the statistical approach provides information
on the nature of statistical errors and fluctuations of thermodynamic parameters.
| macro state | micro state |
| e.g. characterized by \(T, p, V, N, ...\) | e.g. characterized by the states of all individual particles |
Question: What weight does a micro state have within a macro state?
Principle:
Maximize the "degree of uncertainty"
within the restrictions of the macro state.
"Degree of uncertainty" = "thermodynamic entropy"
An event \(i\) occurs with probability \(p_i\), where
| \begin{equation*} 1\geq p_i \geq 0 \qquad , \end{equation*} | (2.1) |
and
| \begin{equation*} \sum_i p_i = 1 \qquad . \end{equation*} | (2.2) |
The degree of uncertainty is defined via the information content of a statement.
For a function \(I(p_i)\) to qualify as the information content of a statement with probability \(p_i\), we require several properties:
\(I(1) = 0\) (a statement that is certain to be fulfilled provides no new information)
\(I(p_i) \qquad\) monotonically increasing with \(1/p_i\)
\(I(p_i p_j) = I(p_i)+I(p_j) \qquad\) for two independent events the information simply adds up
These three properties are fulfilled by the function
| \begin{equation*} I(p_i) = -k \ln(p_i) \qquad . \end{equation*} | (2.3) |
with \(k\): an (arbitrary) unit of measurement for information.
For our information function we find:
\(I(0) = \infty\) (This is at least plausible)
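These properties are easy to check numerically. The following is a minimal sketch in Python (our own illustration, not part of the notes), with the unit \(k\) set to 1 and the helper name `info` chosen freely:

```python
import math

def info(p, k=1.0):
    """Information content I(p) = -k ln(p) of a statement with probability p."""
    return -k * math.log(p)

print(info(1.0))              # I(1) = 0: a certain statement carries no new information
print(info(0.5), info(0.25))  # I(p) increases monotonically with 1/p
# Additivity for independent events: I(p_i * p_j) = I(p_i) + I(p_j)
p_i, p_j = 0.5, 0.25
print(math.isclose(info(p_i * p_j), info(p_i) + info(p_j)))  # True
```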
To calculate the average information we must weight the information of each statement
by its probability of occurrence.
Thus we get
| \begin{equation*} S' = \sum_i p_i I(p_i) = -k \sum_i p_i \ln(p_i) \qquad . \label{inf_avg} \end{equation*} | (2.4) |
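As a quick illustration (again our own sketch, with \(k = 1\)), Eq. (2.4) can be evaluated directly for any discrete distribution:

```python
import math

def avg_information(probs, k=1.0):
    """Average information S' = -k * sum_i p_i ln(p_i), cf. Eq. (2.4).
    States with p_i = 0 contribute nothing, since x ln(x) -> 0 for x -> 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Example: a non-uniform distribution over four states
print(avg_information([0.5, 0.25, 0.125, 0.125]))  # ~1.2130 (in units of k)
```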
For the equilibrium of a physical system the degree of uncertainty \(S'\) must
be maximized:
The mathematical task is to find
| \begin{equation*} S = \max S' \label{S_max} \end{equation*} | (2.5) |
within the restrictions of the macro state.
The entropy \(S\)
is therefore just the maximum average information.
Justification of this principle:
The results describe all experiments correctly: macro states are dominated by the micro states with the largest probabilities.
As we will learn in the next sections, for classical particles in an isolated system the maximum of \(S'\)
(cf. Eq. (2.4)) is found if all \(W\) states \(i\)
are occupied with the same probability \(p_i = p\), i.e.
| \begin{equation*} 1 = \sum\limits_{i=1}^{W} p_i = W p \qquad \mbox{i.e. } \qquad p = \frac{1}{W} \qquad . \end{equation*} | (2.6) |
Inserting this result into Eq. (2.4) we find the famous equation
| \begin{equation*} S = \max S' = - k \sum\limits_{i=1}^{W} p \ln(p) = - k \, W p \ln(p) = - k \ln\left(\frac{1}{W}\right) = k \ln(W) \qquad . \label{S_W} \end{equation*} | (2.7) |
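This maximization can be checked numerically. The sketch below (our own illustration, reusing the `avg_information` helper from above with \(k = 1\)) confirms that the uniform distribution reaches \(\ln W\), while any other normalized distribution over the same states stays below it:

```python
import math
import random

def avg_information(probs, k=1.0):
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 8
print(avg_information([1.0 / W] * W), math.log(W))  # both ln(8) ~ 2.0794

# Any other normalized distribution over the same W states stays below k ln(W):
weights = [random.random() for _ in range(W)]
probs = [w / sum(weights) for w in weights]
assert avg_information(probs) <= math.log(W) + 1e-12
```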
The relation between "average information" and "degree of uncertainty" may be somewhat counterintuitive, so we will discuss it with an example:
Let us assume a set of classical particles. All particles shall occupy state 1, i.e. \(p_1 = 1\) and \(p_i = 0\) for \(i > 1\).
Since
| \begin{equation*} \lim_{x \rightarrow 0} [x \ln(x)] = 0 \qquad \mbox{ and } \qquad [1 \ln(1)] = 0 \end{equation*} | (2.8) |
we find for the average information \(S'=0\). We know "everything" about the occupation of the states \(i\); therefore the degree of uncertainty is 0. Maximizing the information for one state therefore minimizes the average information of the ensemble. Since \(p_i\) and \(I(p_i)\) are non-negative numbers, \(S' = 0\) is in fact the global minimum of \(S'\).
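In code (again our own sketch with \(k = 1\)), the limiting convention of Eq. (2.8) makes the degenerate distribution come out at exactly zero:

```python
import math

def avg_information(probs, k=1.0):
    # Terms with p_i = 0 are skipped: x ln(x) -> 0 for x -> 0, cf. Eq. (2.8)
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# All particles occupy state 1: p_1 = 1, all other p_i = 0
print(avg_information([1.0, 0.0, 0.0, 0.0]))  # S' = 0, the global minimum
```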
A second possible misunderstanding concerns the relation between \(W\) and the particle number \(N\): \(W\) is incomparably larger than \(N\). Let us discuss this for a (most) simple example with only two possible states per particle, e.g. the left half and the right half of a box. Each particle therefore has two possibilities for occupying states, leading to \(W = 2^N\) possible arrangements, i.e. micro states. Inserting this into Eq. (2.7) we get
| \begin{equation*} S = k \ln\left(2^N\right) = k N \ln(2) \qquad . \label{S_ln2} \end{equation*} | (2.9) |
Eq. (2.9) demonstrates
that the entropy is, of course, an extensive parameter; it scales with the size of the system. For a thermodynamic system
\(N\) is already a large number, but \(W\) is vastly larger.
For typical thermodynamic
systems each particle can occupy many different states, so the factor of 2 in the above example must typically be replaced
by numbers on the order of \(10^{20}\), and thus \(W \approx 10^{20\,N}\), which is indeed a huge
number.
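Because \(W\) itself is far too large to ever write down, one always works with \(\ln W\) directly. The following sketch is our own illustration; choosing \(k\) as Boltzmann's constant and \(N\) as one mole of particles is just one possible convention:

```python
import math

k = 1.380649e-23   # one possible unit choice: Boltzmann's constant in J/K
N = 6.022e23       # e.g. one mole of particles

# Two states per particle: W = 2**N is hopelessly large to evaluate,
# but ln(W) = N ln(2) is trivial, cf. Eq. (2.9):
print(k * N * math.log(2))        # ~5.76 J/K

# ~10**20 states per particle: ln(W) = N * ln(10**20) = 20 N ln(10)
print(k * N * 20 * math.log(10))  # ~383 J/K
```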