PlanetPhysics/Statistical Entropy

Statistical entropy is a definition of entropy based on statistical thermodynamics. The definition is

$$S = k_B \ln \Omega$$

where $$k_B$$ is Boltzmann's constant, $$1.380649 \times 10^{-23}\ \mathrm{J\,K^{-1}}$$, and $$\Omega$$ is the number of microstates corresponding to the observed thermodynamic macrostate.

A microstate of the thermodynamic system is one possible complete microscopic description of the system. For example, for an ideal gas, one microstate specifies one possible set of values for the positions and velocities of all the particles in the gas.

A macrostate of the thermodynamic system is one possible set of values for the externally measurable information about the system, such as the temperature, pressure, and volume.
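The distinction can be made concrete with a toy model. The sketch below (a hypothetical illustration, not part of the original entry) takes a system of $$N$$ two-state spins whose macrostate is the number of "up" spins $$n$$; the microstate count for that macrostate is the binomial coefficient $$\binom{N}{n}$$, and the entropy follows from $$S = k_B \ln \Omega$$:

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann's constant in J/K

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega) for a macrostate with omega microstates."""
    return K_B * log(omega)

# Toy system: N two-state spins, macrostate = number of "up" spins n.
# Each distinct arrangement of which spins point up is one microstate,
# so the microstate count is C(N, n).
N, n = 100, 50
omega = comb(N, n)
S = boltzmann_entropy(omega)

# A macrostate realized by a single microstate (all spins up) has S = 0,
# since ln(1) = 0 -- the minimum possible statistical entropy.
assert boltzmann_entropy(comb(N, N)) == 0.0
```

Note that the most mixed macrostate ($$n = N/2$$) has the largest $$\Omega$$ and therefore the largest entropy, which is why isolated systems are overwhelmingly likely to be found in such macrostates.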

The definition above assumes that all the microstates are equally probable. If they are not, the equation is

$$S = - k_B \sum_i {p_i \ln p_i} $$

where the microstates are indexed by $$i$$ and $$p_i$$ is the probability that the system is in microstate $$i$$.
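A quick numerical check (a sketch added for illustration, not part of the original entry) shows that when all $$\Omega$$ microstates are equally probable, $$p_i = 1/\Omega$$, the general formula reduces to the Boltzmann form $$S = k_B \ln \Omega$$:

```python
from math import log, isclose

K_B = 1.380649e-23  # Boltzmann's constant in J/K

def gibbs_entropy(probs):
    """S = -k_B * sum_i p_i ln(p_i); terms with p_i = 0 contribute nothing."""
    return -K_B * sum(p * log(p) for p in probs if p > 0)

# Uniform distribution over Omega microstates: p_i = 1/Omega for all i.
omega = 8
uniform = [1 / omega] * omega

# The general formula agrees with S = k_B ln(Omega) in this case.
assert isclose(gibbs_entropy(uniform), K_B * log(omega))

# Any non-uniform distribution over the same microstates has lower entropy.
skewed = [0.5, 0.3, 0.1, 0.05, 0.05, 0.0, 0.0, 0.0]
assert gibbs_entropy(skewed) < gibbs_entropy(uniform)
```

The uniform distribution maximizes this sum, so the equal-probability assumption behind $$S = k_B \ln \Omega$$ corresponds to the macrostate of maximum entropy.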

The relation between entropy and the number of microstates was first introduced by Ludwig Boltzmann in 1877.