Gibbs Entropy
#Physics
$\displaystyle S=-k_{B}\sum_{s}p_{s}\ln p_{s}\geq0$
- Measures uncertainty of the system
- $\displaystyle p_{s}=P(\psi_{s})$ is the probability that the system is in microstate $\displaystyle s$
- By the equiprobability postulate, $\displaystyle p_{s}=P(\psi_{s})=\frac{1}{\Omega}$, where $\displaystyle \Omega$ is the multiplicity (the number of accessible microstates)
- Reduces to the Boltzmann entropy under this condition: $\displaystyle S=-k_{B}\sum_{s}\frac{1}{\Omega}\ln\frac{1}{\Omega}=k_{B}\ln\Omega$
- A uniform $\displaystyle P(\psi_{s})$ maximizes uncertainty, and therefore maximizes the Gibbs entropy as well
- If we were 100% certain of the state being, for example, $\displaystyle \psi_{1}$, then $\displaystyle p_{1}=1$ while $\displaystyle p_{s}=0$ for $\displaystyle s\neq1$. The zero-probability terms vanish since $\displaystyle \lim_{p\to0}p\ln p=0$, so $\displaystyle S=-k_{B}\cdot1\cdot\ln(1)=0$
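
The limiting cases above can be checked numerically. This is a minimal sketch with $k_{B}$ set to 1 (natural units), so the uniform case returns $\ln\Omega$; the function name `gibbs_entropy` is my own choice, not from any library:

```python
import math

def gibbs_entropy(probs, k_B=1.0):
    """S = -k_B * sum_s p_s ln p_s; terms with p_s = 0 are dropped
    since p ln p -> 0 as p -> 0."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Equiprobability postulate: p_s = 1/Omega reduces S to k_B ln(Omega)
Omega = 4
uniform = [1 / Omega] * Omega
print(gibbs_entropy(uniform))   # ln(4) ~ 1.386

# Complete certainty (p_1 = 1, all other p_s = 0) gives S = 0
certain = [1.0, 0.0, 0.0, 0.0]
print(gibbs_entropy(certain))   # 0.0

# Any non-uniform distribution has lower entropy than the uniform one
skewed = [0.5, 0.3, 0.1, 0.1]
print(gibbs_entropy(skewed) < gibbs_entropy(uniform))   # True
```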