Boltzmann Entropy

In my junior year, I was taught thermodynamics and statistical physics in a one-semester course. Thermodynamics has a well-known equation that summarizes its key concepts: \begin{equation}dE=T\,dS-P\,dV+\mu\,dN\,,\tag{1}\end{equation} where $E,T,S,P,V,\mu,N$ are a system's energy, temperature, entropy, pressure, volume, chemical potential and particle number, respectively. Statistical physics has another famous equation giving the microscopic interpretation of entropy: \begin{equation}S = k_B\,\log \Omega\,,\tag{2}\end{equation} where $k_B$ is the Boltzmann constant and $\Omega$ is the number of microstates corresponding to a system's macrostate.

The question is: how does one prove that the entropy in thermodynamics (1) is the same as the Boltzmann entropy (2)? My instructor may well have presented the proof in class, but I have no memory of it. Recently, after work, I browsed some textbooks on thermodynamics and statistical physics looking for such a proof, and this post is a summary of what I have learned.

Thermodynamics

The key concepts and ideas of thermodynamics can be summarized in less than a page. Here it is.

Zeroth Law

The zeroth law of thermodynamics states that two bodies, each separately in thermal equilibrium with the same third body, are also in thermal equilibrium with each other.

The zeroth law suggests that thermal equilibrium forms an equivalence relation. As a result, all systems that are in thermal equilibrium with one another should be in the same equivalence class, tagged by some "label".

Empirically, such a "label" for thermal equilibrium is temperature. We then introduce a simple model, ideal gas, as a thermometer to measure the temperature in physics. The temperature $T$ in ideal gas can be read by \begin{equation} T=\frac{P\,V}{Nk_B}\,,\tag{3}\end{equation} where $k_B$ is Boltzmann's constant and $P, V, N$ are pressure, volume, particle number of the ideal gas.
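As a quick numerical sanity check of Eq. (3), here is a small Python sketch (the function name and the sample numbers, one mole of gas at roughly standard conditions, are my own illustration):

```python
# Ideal-gas thermometer: T = P V / (N k_B), Eq. (3).
k_B = 1.380649e-23  # Boltzmann constant, J/K

def ideal_gas_temperature(P, V, N):
    """Temperature read off an ideal-gas thermometer."""
    return P * V / (N * k_B)

# One mole of gas at atmospheric pressure occupying 22.4 L:
P = 101325.0        # Pa
V = 22.4e-3         # m^3
N = 6.02214076e23   # particle number (Avogadro's number)
print(ideal_gas_temperature(P, V, N))  # ~273 K
```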

First Law

The first law of thermodynamics is just energy conservation: \begin{equation}dE = dQ-dW\,.\end{equation} Heat is a form of energy rather than some hypothesized caloric fluid, and the increase of a system's internal energy equals the heat it absorbs minus the work it performs on the surroundings.

Second Law

The second law of thermodynamics is highly nontrivial, within thermodynamics and even within physics as a whole. Sadi Carnot discovered this law in his theorem on the efficiency of heat engines.

Besides the ideal gas, the heat engine is another important theoretical model in thermodynamics. A heat engine absorbs heat $Q_1$ from a hot reservoir at high temperature $T_1$, emits heat $Q_2$ to a cold reservoir at low temperature $T_2$, and thus outputs work $W=Q_1-Q_2$ to its surroundings. The efficiency of the heat engine is defined by \begin{equation}\eta \equiv\frac{W}{Q_1}=1-\frac{Q_2}{Q_1}\,.\end{equation} The key insight is to divide all heat engines into two classes: irreversible engines (IRE for short) and reversible engines (also called Carnot engines, CE for short).
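The bookkeeping above can be sketched in a few lines of Python (the function names and sample numbers are illustrative):

```python
def engine_efficiency(Q1, Q2):
    """Efficiency of an engine absorbing Q1 from the hot reservoir
    and emitting Q2 to the cold one; W = Q1 - Q2 by the first law."""
    return (Q1 - Q2) / Q1            # eta = 1 - Q2/Q1

def carnot_efficiency(T1, T2):
    """Carnot bound 1 - T2/T1 for reservoirs at T1 > T2."""
    return 1.0 - T2 / T1

# An engine taking 1000 J at 500 K and dumping 700 J at 300 K:
eta = engine_efficiency(1000.0, 700.0)     # 0.3
eta_max = carnot_efficiency(500.0, 300.0)  # 0.4
print(eta, eta_max)   # 0.3 0.4 -- consistent with Carnot's theorem below
```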

Statement 1: The efficiency of an IRE cannot be higher than the efficiency of a CE. 
Proof: Let the IRE absorb heat $Q_1$ from the hot reservoir, emit heat $Q'_2$ to the cold reservoir, and output work $W'=Q_1-Q'_2$. Suppose the efficiency of the IRE were higher. Then we could use part of the output work $W(<W')$ from the IRE to drive the CE in reverse: the reversed CE absorbs heat $Q_2(>Q'_2)$ from the cold reservoir and, with the input work $W$, brings the same heat $Q_1$ back to the hot reservoir. As a result, the whole system absorbs the net heat $Q_2-Q'_2$ from a single (cold) reservoir and fully converts it to the net output work $W'-W=Q_2-Q'_2$, which is forbidden by the second law in its Kelvin form: no cycle can do nothing but convert heat from a single reservoir entirely into work.

Statement 2: All CEs have the same efficiency.
Proof: Suppose we have two CEs, say A and B. By Statement 1, we can use A to drive B in reverse and conclude that $\eta_A\leq \eta_B$. Similarly, we can use B to drive A in reverse and obtain $\eta_B\leq \eta_A$. As a result, $\eta_A = \eta_B$.

Statement 3: The efficiency of all CEs is $1-\frac{T_2}{T_1}$.
Proof: Since all CEs have the same efficiency, it is enough to compute the efficiency of one particular CE, made of ideal gas. The ideal gas is simple enough to allow us to do exact calculations. This CE runs a Carnot cycle consisting of two isothermal processes and two adiabatic processes. The calculation shows that $\frac{Q_2}{Q_1}=\frac{T_2}{T_1}$. Note: this also shows that the temperature defined by the ideal gas (3) agrees with the temperature defined by the CE.
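For completeness, here is a sketch of that calculation, done in one standard way. Label the corners of the cycle $A\to B$ (isothermal expansion at $T_1$), $B\to C$ (adiabatic expansion), $C\to D$ (isothermal compression at $T_2$), $D\to A$ (adiabatic compression). On an isotherm the internal energy of the ideal gas is constant, so the heat absorbed equals the work done: \begin{equation}Q_1=\int_{V_A}^{V_B}P\,dV=Nk_BT_1\log\frac{V_B}{V_A}\,,\qquad Q_2=Nk_BT_2\log\frac{V_C}{V_D}\,.\end{equation} On the adiabats $TV^{\gamma-1}$ is constant, so $T_1V_B^{\gamma-1}=T_2V_C^{\gamma-1}$ and $T_1V_A^{\gamma-1}=T_2V_D^{\gamma-1}$; dividing the two relations gives $V_B/V_A=V_C/V_D$, hence $Q_2/Q_1=T_2/T_1$.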

With the above three statements, we have proved Carnot's theorem: no heat engine can work more efficiently than $1-\frac{T_2}{T_1}$. As a result, we obtain the relation $\frac{Q_1}{T_1}-\frac{Q_2}{T_2}\leq 0$ and, counting emitted heat as negative absorbed heat, generalize it to the Clausius inequality $\oint \frac{dQ}{T}\leq 0$, which leads to the definition of entropy in thermodynamics (1).
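The last step can be spelled out as follows. For a reversible cycle the inequality holds in both directions, so $\oint \frac{dQ}{T}=0$; the integral $\int_A^B \frac{dQ_{\rm rev}}{T}$ is therefore path-independent, and we may define a state function \begin{equation}S(B)-S(A)\equiv\int_A^B \frac{dQ_{\rm rev}}{T}\,.\end{equation} Substituting $dQ_{\rm rev}=T\,dS$ into the first law, with mechanical work $P\,dV$ and chemical work $-\mu\,dN$, then yields Eq. (1).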

Microcanonical Ensemble

To prove the Boltzmann entropy (2) from the thermodynamics (1), we need to consider the microcanonical ensemble in statistical physics. In microcanonical ensemble, the number of microstates of a macrostate is a function of the system's energy $E$, volume $V$ and particle number $N$, denoted by $\Omega(E, V, N)$.

Now we consider two systems $\Omega_1(E_1, V_1, N_1)$ and $\Omega_2(E_2, V_2, N_2)$. Both are isolated from the rest of the world and are only allowed to exchange heat with each other, so the total energy $E=E_1+E_2$ is conserved; $V_1, N_1, V_2, N_2$ are fixed and only $E_1$ can change. The probability that the first system has energy $E_1$ is \begin{equation}\mathbb{P}(E_1)\propto\Omega_1(E_1)\Omega_2(E-E_1)\,,\end{equation} up to the normalization constant $\int_0^E \Omega_1(\omega)\Omega_2(E-\omega)\, d\omega$.

When the two systems are in thermal equilibrium, $E_1$ takes the value that maximizes $\mathbb{P}(E_1)$. In other words, as time passes the two systems eventually reach thermal equilibrium, and such an equilibrium state should be the most probable one in order to be picked by nature (this already hints at the microscopic explanation of entropy). So in thermal equilibrium, we have \begin{equation} \frac{\partial }{\partial E_1}\Omega_1(E_1)\Omega_2(E-E_1)=0\,,\end{equation} which can be reduced to \begin{equation}\left(\frac{\partial \log\Omega_1(E_1, V_1, N_1)}{\partial E_1}\right)_{V_1,N_1}=\left(\frac{\partial \log\Omega_2(E_2, V_2, N_2)}{\partial E_2}\right)_{V_2,N_2}\,.\tag{4}\end{equation}
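We can see this maximization numerically. The Python sketch below takes two ideal-gas subsystems with $\Omega_i(E_i)\propto E_i^{3N_i/2}$ (anticipating the calculation below; the particle numbers and total energy are arbitrary choices of mine) and scans for the most probable $E_1$:

```python
import numpy as np

# Two ideal-gas subsystems exchanging energy: Omega_i(E_i) ~ E_i^(3 N_i / 2).
# Work with log P(E_1) to avoid overflow for large particle numbers.
N1, N2 = 40, 60              # particle numbers (illustrative)
E = 1.0                      # fixed total energy, arbitrary units

E1 = np.linspace(1e-6, E - 1e-6, 100001)
log_P = 1.5 * N1 * np.log(E1) + 1.5 * N2 * np.log(E - E1)  # log Omega_1 + log Omega_2

E1_star = E1[np.argmax(log_P)]
# The peak sits at E1 = E * N1 / (N1 + N2), exactly where the two
# log-derivatives (3 N_i / 2) / E_i coincide, i.e. equal "temperatures".
print(E1_star)   # ~ 0.4
```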

Now we claim that the quantity $\left(\frac{\partial \log\Omega(E, V, N)}{\partial E}\right)_{V, N}$ is related to the temperature. As discussed under the zeroth law above, a quantity qualifying as the temperature must satisfy two requirements:
  • It takes the same value for all systems in thermal equilibrium; 
  • When the system is ideal gas, it gives the same result as that in Eq. (3). 
The first requirement is satisfied, as shown in Eq. (4). We just need some calculations to verify the second requirement. For the ideal gas, we can compute \begin{equation}\Omega(E,V,N)=\frac{V^N}{N!}\int d^{3N}p\,\delta\left(\sum_{i=1}^N \frac{p_i^2}{2m}-E\right)=\frac{V^N}{N!}\frac{(2mE)^{3N/2}}{E}S_{3N-1}\,,\end{equation} where $S_{3N-1}\equiv\int d^{3N}x\,\delta\left(\sum_{i=1}^{3N}x^2_i - 1\right)$ is (up to a constant factor) the area of a unit $(3N-1)$-dimensional sphere. As a result, \begin{equation}\left(\frac{\partial \log\Omega(E, V, N)}{\partial E}\right)_{V, N}=\left(\frac{3N}{2}-1\right)\frac{1}{E}\approx \frac{3N}{2E}\tag{5}\end{equation} when $N\rightarrow \infty$. On the other hand, in the kinetic theory of the ideal gas, the pressure is the force per unit area exerted by Newtonian particles randomly hitting and rebounding from the container's surface, from which we can derive the relation \begin{equation}PV=\frac{2}{3}E\,.\tag{6}\end{equation} As a result, combining Eqs. (3), (4), (5), (6), we conclude that in the microcanonical ensemble, the quantity \begin{equation} k_B\left(\frac{\partial \log \Omega(E, V, N)}{\partial E}\right)_{V,N}=\frac{1}{T}\tag{7}\end{equation} defines the inverse of the temperature.
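A quick finite-difference check of Eq. (5) in Python, keeping only the $E$-dependent part of $\log\Omega$ (the values of $N$, $m$, $E$ are arbitrary):

```python
import math

# Check Eq. (5): d(log Omega)/dE = (3N/2 - 1)/E for the ideal gas.
N, m = 1000, 1.0             # illustrative particle number and mass

def log_omega(E):
    # E-dependent part of log Omega: (3N/2) log(2mE) - log E
    return 1.5 * N * math.log(2 * m * E) - math.log(E)

E, h = 2.0, 1e-6
numeric = (log_omega(E + h) - log_omega(E - h)) / (2 * h)   # central difference
exact = (1.5 * N - 1) / E                                   # (3N/2 - 1)/E = 749.5
print(numeric, exact)        # both close to 3N/(2E) = 750
```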

Finally, from the thermodynamic law (1), we know that the temperature is $T=\left(\frac{\partial E}{\partial S}\right)_{V,N}$. Comparing this to Eq. (7) leads to the Boltzmann entropy (2).
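Explicitly: Eq. (1) at fixed $V, N$ gives $\left(\frac{\partial S}{\partial E}\right)_{V,N}=\frac{1}{T}$, and Eq. (7) says the same holds for $k_B\log\Omega$, so the two functions can differ only by a term independent of $E$: \begin{equation}S(E,V,N)=k_B\log\Omega(E,V,N)+f(V,N)\,.\end{equation} Repeating the equilibrium argument with walls that also exchange volume and particles shows that $\partial f/\partial V=\partial f/\partial N=0$, so $f$ is a constant, conventionally set to zero.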

Remarks

In thermodynamics, we start from empirical or intuitive objects like temperature, the ideal gas, and the heat engine, and eventually reveal hidden but fundamental quantities like entropy. This presentation is more common in college physics and friendlier to first-time learners, but the logic is less concise. For example, entropy is a state function of a system regardless of whether the system is in thermal equilibrium, yet the thermodynamic introduction of entropy conveys the illusion that entropy relies on thermal equilibrium and temperature.

In statistical physics, one can instead start directly with the Boltzmann entropy (2) and the microcanonical ensemble. After showing Eq. (4), we take Eq. (7) as the definition of temperature. In this way, we do not need to introduce any specific system like the ideal gas, and we can logically present the other thermodynamic results in a much simpler way. See Landau and Lifshitz's Statistical Physics for more details.
