Boltzmann Entropy
In my junior year, I was taught thermodynamics and statistical physics in a one-semester course. In thermodynamics, there is a well-known equation that relates all the key quantities of the subject: \begin{equation}dE=T\,dS-P\,dV+\mu\,dN\,,\tag{1}\end{equation} where $E,T,S,P,V,\mu,N$ are a system's energy, temperature, entropy, pressure, volume, chemical potential, and particle number, respectively. In statistical physics, there is another famous equation giving the microscopic interpretation of entropy: \begin{equation}S = k_B\,\log \Omega\,,\tag{2}\end{equation} where $k_B$ is the Boltzmann constant and $\Omega$ is the number of microstates corresponding to a system's macrostate. The question is: how do we prove that the entropy in thermodynamics, Eq. (1), is the same as the Boltzmann entropy of Eq. (2)? My course instructor may have presented the proof in class, but I have no recollection of it. In recent days after work, I browsed some textbooks on thermodynamics and statistical physics
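To get a concrete feel for Eq. (2), here is a minimal Python sketch (the names and the toy system are my own choices, not anything from the course) that counts microstates for a macrostate of $N$ two-level spins with $n$ excited and evaluates $S = k_B \ln \Omega$; note the logarithm in Eq. (2) is the natural log.

```python
import math

# Boltzmann constant in J/K (CODATA 2018 exact value)
k_B = 1.380649e-23

def boltzmann_entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(Omega) for Omega microstates."""
    return k_B * math.log(omega)

# Toy macrostate: N two-level spins, exactly n of them excited.
# The number of microstates is the binomial coefficient C(N, n).
N, n = 100, 50
omega = math.comb(N, n)
S = boltzmann_entropy(omega)
print(f"Omega = {omega:.3e}, S = {S:.3e} J/K")
```

A single unique configuration ($\Omega = 1$) gives $S = 0$, and $S$ grows only logarithmically in $\Omega$, which is why macroscopic entropies stay modest even though microstate counts are astronomically large.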