More on Entropy
I reviewed some basics of thermodynamics and the Boltzmann entropy in this post. Here, I continue with the Gibbs entropy, the information entropy, and the cross-entropy that is widely used in machine learning. In the modern view, entropy is a measure of an observer's ignorance about the system.

Boltzmann Entropy

Physics students first meet the concept of entropy through the Carnot engine and the Boltzmann entropy. The Boltzmann entropy applies to a single system with fixed energy: \begin{equation}S=k_B\log \Omega\,,\tag{1}\end{equation}where $k_B$ is the Boltzmann constant and $\Omega$ is the number of microstates corresponding to the system's macrostate. Boltzmann also derived another form of entropy that is mathematically very close to the modern form. Consider a single system of $N$ particles, and partition them into $n$ groups. Let $N_i$ be the number of particles in group $i$; then the number of microstates is \begin{equation}\Omega=\frac{N!}{\prod_{i=1}^{n}N_i!}\,.\tag{2}\end{equation}
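As a quick numerical check of this counting argument, the sketch below (my own illustration, not from the original derivation) computes $\log\Omega$ for the multinomial count $\Omega = N!/\prod_i N_i!$ and compares $S = k_B\log\Omega$ with the Stirling-approximated form $S \approx -N k_B \sum_i p_i \log p_i$, where $p_i = N_i/N$; the function names and the example occupation numbers are mine.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def log_multinomial(counts):
    """log Omega for Omega = N! / (N_1! ... N_n!).

    Works in log space via lgamma to avoid overflow for large N.
    """
    n_total = sum(counts)
    return math.lgamma(n_total + 1) - sum(math.lgamma(c + 1) for c in counts)

def boltzmann_entropy(counts):
    """S = k_B log Omega, with Omega the exact multinomial count."""
    return K_B * log_multinomial(counts)

def stirling_entropy(counts):
    """Stirling approximation: S ~ -N k_B sum_i p_i log p_i."""
    n_total = sum(counts)
    return -K_B * sum(c * math.log(c / n_total) for c in counts if c > 0)

# Hypothetical example: N = 1000 particles split into n = 4 groups.
counts = [400, 300, 200, 100]
exact = boltzmann_entropy(counts)
approx = stirling_entropy(counts)
```

Already at $N = 1000$ the two values agree to within about one percent, and the agreement improves as $N$ grows, which is why the two forms are used interchangeably in the thermodynamic limit.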