divergence

operates on a vector field, producing a scalar field that gives the quantity of the vector field's source at each point.

represents the volume density of the outward flux of a vector field from an infinitesimal volume around a given point.

definition

the divergence of a vector field $\mathbf{F}(\mathbf{x})$ at a point $\mathbf{x}_0$ is defined as the limit of the ratio of the surface integral of $\mathbf{F}$ out of the closed surface of a volume $V$ enclosing $\mathbf{x}_0$ to the volume of $V$, as $V$ shrinks to zero:

$$\operatorname{div} \mathbf{F} \big|_{\mathbf{x}_0} = \lim_{V \to 0} \frac{1}{|V|} \oiint_{S(V)} \mathbf{F} \cdot \hat{\mathbf{n}} \, dS$$

where $|V|$ is the volume of $V$, $S(V)$ is the boundary of $V$, and $\hat{\mathbf{n}}$ is the outward unit normal to that surface.
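
as a quick numerical sketch (my own illustration, not part of the notes above): for a made-up field $\mathbf{F}(x, y, z) = (\sin x,\ y^2,\ z)$, the outward flux through a small cube around a point, divided by the cube's volume, settles to a fixed value as the cube shrinks

```python
import numpy as np

# hypothetical example field F(x, y, z) = (sin x, y^2, z)
def F(x, y, z):
    return np.array([np.sin(x), y**2, z])

def flux_over_volume(F, p, h):
    """Outward flux of F through a cube of edge h centred at p, divided by h^3."""
    x, y, z = p
    a = h / 2
    # approximate each face integral by the field value at the face centre
    # times the face area h*h; outward normals are +/- e_x, e_y, e_z
    flux  = (F(x + a, y, z)[0] - F(x - a, y, z)[0]) * h * h
    flux += (F(x, y + a, z)[1] - F(x, y - a, z)[1]) * h * h
    flux += (F(x, y, z + a)[2] - F(x, y, z - a)[2]) * h * h
    return flux / h**3

p = (0.5, 1.0, 2.0)
for h in (1.0, 0.1, 0.01):
    print(h, flux_over_volume(F, p, h))
# 1.0   3.8415...
# 0.1   3.8772...
# 0.01  3.8776...   -> the divergence of F at p
```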

Cartesian coordinates

for a continuously differentiable vector field $\mathbf{F} = F_x \mathbf{i} + F_y \mathbf{j} + F_z \mathbf{k}$, the divergence is defined as the scalar-valued function:

$$\begin{aligned} \operatorname{div} \mathbf{F} = \nabla \cdot \mathbf{F} &= \left( \frac{\partial}{\partial x}, \frac{\partial}{\partial y}, \frac{\partial}{\partial z} \right) \cdot \left( F_x, F_y, F_z \right) \\ &= \frac{\partial F_x}{\partial x} + \frac{\partial F_y}{\partial y} + \frac{\partial F_z}{\partial z} \end{aligned}$$
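
continuing the numeric sketch above, the same value falls out of the partial derivatives directly: for $\mathbf{F} = (\sin x,\ y^2,\ z)$,

$$\operatorname{div} \mathbf{F} = \frac{\partial}{\partial x}\sin x + \frac{\partial}{\partial y}y^2 + \frac{\partial}{\partial z}z = \cos x + 2y + 1,$$

which at $(0.5, 1, 2)$ gives $\cos 0.5 + 3 \approx 3.878$, the limit the shrinking-cube estimate approached.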

Jacobian matrix

Suppose $\mathbf{f}: \mathbf{R}^n \to \mathbf{R}^m$ is a function such that each of its first-order partial derivatives exists on $\mathbf{R}^n$; then the Jacobian matrix of $\mathbf{f}$ is defined as follows:

$$\begin{aligned} \mathbf{J}_{\mathbf{f}} &= \begin{bmatrix} \frac{\partial \mathbf{f}}{\partial x_1} & \cdots & \frac{\partial \mathbf{f}}{\partial x_n} \end{bmatrix} \\ &= \begin{bmatrix} \nabla^T f_1 \\ \vdots \\ \nabla^T f_m \end{bmatrix} \\ &= \begin{bmatrix} \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \frac{\partial f_m}{\partial x_1} & \cdots & \frac{\partial f_m}{\partial x_n} \end{bmatrix} \end{aligned}$$
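
as an illustration (my own, with a made-up map): a forward-difference estimate of the Jacobian of $\mathbf{f}(x, y) = (x^2 y,\ 5x + \sin y)$, whose exact Jacobian is $\begin{bmatrix} 2xy & x^2 \\ 5 & \cos y \end{bmatrix}$

```python
import numpy as np

# hypothetical example map f(x, y) = (x^2 * y, 5x + sin y)
# exact Jacobian: [[2xy, x^2], [5, cos y]]
def f(p):
    x, y = p
    return np.array([x**2 * y, 5 * x + np.sin(y)])

def jacobian_fd(f, p, h=1e-6):
    """Forward-difference estimate of the m x n Jacobian of f at p."""
    p = np.asarray(p, dtype=float)
    f0 = f(p)
    J = np.zeros((f0.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = h
        J[:, j] = (f(p + dp) - f0) / h   # column j holds d f_i / d x_j
    return J

print(jacobian_fd(f, [1.0, np.pi / 2]))
# approx [[3.1416, 1.0000],
#         [5.0000, 0.0000]]
```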

Jacobian determinant

When $m = n$, the Jacobian matrix is square, so its determinant is a well-defined function of $\mathbf{x}$.¹

When $m = 1$, that is, when $f: \mathbf{R}^n \to \mathbf{R}$ is a scalar-valued function, the Jacobian matrix reduces to the row vector $\nabla^T f$; this row vector of all first-order partial derivatives of $f$ is the transpose of the gradient of $f$, i.e. $\mathbf{J}_f = \nabla^T f$.
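
for instance (continuing my made-up example): for the scalar function $g(x, y) = x^2 y$ the Jacobian is the single row $\mathbf{J}_g = \begin{bmatrix} 2xy & x^2 \end{bmatrix} = \nabla^T g$, and for the $2 \times 2$ map above the Jacobian determinant is $2xy \cos y - 5x^2$.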

gradient

a vector field $\nabla f$ whose value at a point $p$ gives the direction and the rate of fastest increase.

In coordinate-free terms, the gradient of a function $f(\mathbf{r})$ may be defined by:

$$df = \nabla f \cdot d\mathbf{r}$$

where $df$ is the infinitesimal change in $f$ for an infinitesimal displacement $d\mathbf{r}$, and is seen to be maximal when $d\mathbf{r}$ is in the direction of the gradient $\nabla f$.
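
a small numerical check of this (again my own sketch, with an invented scalar field $f(x, y) = x^2 + 3y$): the actual change $f(\mathbf{r} + d\mathbf{r}) - f(\mathbf{r})$ tracks $\nabla f \cdot d\mathbf{r}$ for every small displacement, and is largest when $d\mathbf{r}$ points along $\nabla f$

```python
import numpy as np

# hypothetical scalar field f(x, y) = x^2 + 3y with gradient (2x, 3)
def f(p):
    return p[0]**2 + 3 * p[1]

def grad_f(p):
    return np.array([2 * p[0], 3.0])

p = np.array([1.0, 2.0])
g = grad_f(p)
eps = 1e-4

for theta in np.linspace(0.0, 2 * np.pi, 8, endpoint=False):
    dr = eps * np.array([np.cos(theta), np.sin(theta)])   # small displacement
    df = f(p + dr) - f(p)                                  # actual change in f
    print(f"theta={theta:4.2f}  df={df: .3e}  grad.dr={g @ dr: .3e}")

# df is maximal near theta = atan2(3, 2) ~ 0.98, i.e. dr parallel to grad f
```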

definition

the gradient of $f$ ($\nabla f$) is defined as the unique vector field whose dot product with any vector $\mathbf{v}$ at each point $x$ is the directional derivative of $f$ along $\mathbf{v}$, such that:²

$$(\nabla f(x)) \cdot \mathbf{v} = D_v f(x)$$
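
as a concrete instance (continuing the invented field $f(x, y) = x^2 + 3y$): along $\mathbf{v} = (v_1, v_2)$,

$$D_v f(x, y) = (\nabla f) \cdot \mathbf{v} = 2x\, v_1 + 3 v_2,$$

so at $(1, 2)$ with $\mathbf{v} = (1, 0)$ the directional derivative is $2$, just the $x$-partial.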
"\\usepackage{pgfplots}\n\\usepackage{tikz-3dplot}\n\\pgfplotsset{compat=1.16}\n\n\\begin{document}\n\\begin{tikzpicture}\n\n\\begin{axis}[\n view={25}{30},\n xlabel=$x$,\n ylabel=$y$,\n zlabel=$z$,\n xmin=-80, xmax=80,\n ymin=-80, ymax=80,\n zmin=-4, zmax=0,\n grid=major\n]\n\\addplot3[\n surf,\n domain=-80:80,\n y domain=-80:80,\n samples=40,\n samples y=40,\n faceted color=orange,\n fill opacity=0.7,\n mesh/interior colormap={autumn}{color=(yellow) color=(orange)},\n shader=flat\n] {-(cos(x)^2 + cos(y)^2)^2};\n\n\\addplot3[\n ->,\n blue,\n quiver={\n u={4*cos(x)*sin(x)*(cos(x)^2 + cos(y)^2)},\n v={4*cos(y)*sin(y)*(cos(x)^2 + cos(y)^2)},\n w=0\n },\n samples=15,\n samples y=15,\n domain=-80:80,\n y domain=-80:80,\n] {-4};\n\n\\end{axis}\n\\end{tikzpicture}\n\\end{document}"¡50050¡50050¡4¡20xyz
source code

Remarks

  1. See also Jacobian conjecture

  2. Another notation often used in machine learning is grad(f). See also autograd