Divergence
#Math
$\displaystyle \nabla\cdot \vec{F}: \text{vector}\rightarrow \text{scalar}$
$\displaystyle \nabla\cdot \vec{a}=\partial_{i}a_{i}$
$\displaystyle (\nabla \cdot \mathbf{A})_{j}=\partial_{i}A_{ij}$
- For a matrix $\displaystyle \mathbf{A}$
- Final dimension of $\displaystyle \nabla \cdot \mathbf{A}$ is $\displaystyle n\times 1$
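The index definition $\partial_{i}a_{i}$ above can be checked numerically. A minimal sketch, using central differences and a hypothetical test field $\vec{a}=(xy,\,yz,\,zx)$ (not from the note, chosen so the divergence is easy to verify by hand):

```python
import numpy as np

def divergence(f, x, h=1e-6):
    """div a = d_i a_i, approximated by central differences."""
    n = len(x)
    total = 0.0
    for i in range(n):
        e = np.zeros(n); e[i] = h
        total += (f(x + e)[i] - f(x - e)[i]) / (2 * h)
    return total

# a = (xy, yz, zx): div a = y + z + x, so 6 at the point (1, 2, 3)
a = lambda p: np.array([p[0]*p[1], p[1]*p[2], p[2]*p[0]])
x = np.array([1.0, 2.0, 3.0])
print(divergence(a, x))  # ≈ 6.0
```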
$\displaystyle \nabla \cdot \vec{F}=\frac{1}{r^{2}}\partial_{r}(r^{2}F_{r})+\frac{1}{r\sin \theta}\partial_{\theta}(\sin \theta\, F_{\theta})+\frac{1}{r\sin \theta}\partial_{\phi}F_{\phi}$
- In spherical coordinates $(r,\theta ,\phi)$
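The spherical-coordinate formula can be sanity-checked on a field whose divergence is known. A sketch, assuming the purely radial field $\vec{F}=r\,\hat{r}$ (not from the note), whose divergence is 3 everywhere since $\frac{1}{r^{2}}\partial_{r}(r^{3})=3$:

```python
import numpy as np

def div_spherical(Fr, Fth, Fph, r, th, ph, h=1e-6):
    """Central-difference evaluation of the spherical divergence formula:
    (1/r^2) d_r(r^2 Fr) + (1/(r sin th)) d_th(sin th Fth)
    + (1/(r sin th)) d_ph Fph."""
    d_r  = ((r+h)**2*Fr(r+h, th, ph) - (r-h)**2*Fr(r-h, th, ph)) / (2*h)
    d_th = (np.sin(th+h)*Fth(r, th+h, ph)
            - np.sin(th-h)*Fth(r, th-h, ph)) / (2*h)
    d_ph = (Fph(r, th, ph+h) - Fph(r, th, ph-h)) / (2*h)
    return d_r/r**2 + d_th/(r*np.sin(th)) + d_ph/(r*np.sin(th))

# F = r r_hat: only the radial component is nonzero, div F = 3 everywhere
Fr   = lambda r, th, ph: r
zero = lambda r, th, ph: 0.0
print(div_spherical(Fr, zero, zero, 2.0, 0.7, 1.1))  # ≈ 3.0
```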
Properties
$\displaystyle \nabla\cdot(k\vec{a})=k(\nabla\cdot \vec{a})$
- $\displaystyle k$ is a constant
$\displaystyle \nabla \cdot (f\vec{a})=f(\nabla \cdot \vec{a})+\vec{a}\cdot (\nabla f)$
- Effectively a product rule
- $\displaystyle f$ is a scalar function
- Mnemonic: every term has to come out a scalar, and the expansion still follows the usual product-rule pattern (differentiate one factor at a time)
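The product rule above, $\nabla\cdot(f\vec{a})=f(\nabla\cdot\vec{a})+\vec{a}\cdot(\nabla f)$, can also be verified numerically. A sketch with hypothetical test fields $f=x^{2}+y$ and $\vec{a}=(y,x,z)$ (chosen only for illustration):

```python
import numpy as np

def divergence(f, x, h=1e-6):
    """Central-difference div: d_i f_i."""
    n = len(x)
    out = 0.0
    for i in range(n):
        e = np.zeros(n); e[i] = h
        out += (f(x + e)[i] - f(x - e)[i]) / (2 * h)
    return out

def gradient(g, x, h=1e-6):
    """Central-difference gradient of a scalar function g."""
    n = len(x)
    eye = np.eye(n)
    return np.array([(g(x + h*eye[i]) - g(x - h*eye[i])) / (2*h)
                     for i in range(n)])

f = lambda p: p[0]**2 + p[1]              # scalar function
a = lambda p: np.array([p[1], p[0], p[2]])  # vector field
p = np.array([1.0, 2.0, 3.0])

lhs = divergence(lambda q: f(q) * a(q), p)             # div(f a)
rhs = f(p) * divergence(a, p) + a(p) @ gradient(f, p)  # f div a + a . grad f
print(lhs, rhs)  # both ≈ 8.0
```

Both sides agree: at $(1,2,3)$, $f\,(\nabla\cdot\vec{a})=3\cdot 1=3$ and $\vec{a}\cdot\nabla f=(2,1,3)\cdot(2,1,0)=5$, so each side is $8$.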