Recall that we have three rules for vector multiplication (not including the outer product):
$$a\vec{B}, \qquad \vec{A} \cdot \vec{B}, \qquad \vec{A} \times \vec{B},$$
where $a$ is a scalar, and $\vec{A}$ and $\vec{B}$ are vectors as usual. We evidently must have three similar rules for the gradient operator $\vec{\nabla}$ treated as if it is a vector (operator):
$$\vec{\nabla} f, \qquad \vec{\nabla} \cdot \vec{A}, \qquad \vec{\nabla} \times \vec{A},$$
where $f$ is a multivariate scalar function, and $\vec{A}$ is a multivariate vector function. We call these, respectively, the gradient of a scalar function, the divergence of a vector function, and the curl of a vector function.

The gradient is the directed slope of $f$ at a point. The divergence is a measure of the in/outflow of a vector field relative to a point. The curl is a measure of the rotation of a vector field about a point. All three are defined at (in the neighborhood of) a point in space by means of the limiting process of partial differentiation and presume that the objects they act on are well-behaved enough to permit limits to be taken.
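These interpretations can be checked numerically. The sketch below (an illustration, not from the text; the sample fields $\vec{A}$ and $\vec{B}$ are arbitrary choices) approximates partial derivatives by central differences and evaluates the divergence of a radially outflowing field and the curl of a field that circulates about the $z$-axis.

```python
# Numerical illustration of divergence (outflow) and curl (rotation),
# using central differences. Sample fields are illustrative choices.

def partial(f, p, i, h=1e-5):
    """Central-difference partial derivative of scalar f at point p, w.r.t. coordinate i."""
    q, r = list(p), list(p)
    q[i] += h
    r[i] -= h
    return (f(q) - f(r)) / (2 * h)

def divergence(F, p):
    # div F = dF_x/dx + dF_y/dy + dF_z/dz
    return sum(partial(lambda q, i=i: F(q)[i], p, i) for i in range(3))

def curl(F, p):
    # (curl F)_x = dF_z/dy - dF_y/dz, and cyclic permutations
    d = lambda comp, i: partial(lambda q: F(q)[comp], p, i)
    return [d(2, 1) - d(1, 2), d(0, 2) - d(2, 0), d(1, 0) - d(0, 1)]

A = lambda p: [p[0], p[1], p[2]]    # points radially outward: divergence 3, curl 0
B = lambda p: [-p[1], p[0], 0.0]    # circulates about the z-axis: divergence 0, curl (0, 0, 2)

print(divergence(A, [1.0, 2.0, 3.0]))  # ≈ 3
print(curl(B, [1.0, 2.0, 3.0]))        # ≈ [0, 0, 2]
```

The outflowing field has positive divergence and no curl; the rotating field has zero divergence and a curl pointing along the rotation axis, matching the descriptions above.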

In Cartesian components, the gradient of a scalar function is:
$$\vec{\nabla} f = \hat{x}\,\frac{\partial f}{\partial x} + \hat{y}\,\frac{\partial f}{\partial y} + \hat{z}\,\frac{\partial f}{\partial z},$$
the divergence of a vector function is:
$$\vec{\nabla} \cdot \vec{A} = \frac{\partial A_x}{\partial x} + \frac{\partial A_y}{\partial y} + \frac{\partial A_z}{\partial z},$$
and the curl is:
$$\vec{\nabla} \times \vec{A} = \hat{x}\left(\frac{\partial A_z}{\partial y} - \frac{\partial A_y}{\partial z}\right) + \hat{y}\left(\frac{\partial A_x}{\partial z} - \frac{\partial A_z}{\partial x}\right) + \hat{z}\left(\frac{\partial A_y}{\partial x} - \frac{\partial A_x}{\partial y}\right).$$
What are the analogues of the scalar differentiation rules we listed above? We now have three versions of each of them: one for the gradient, one for the divergence, and one for the curl. The chain rule is formed by composing the rule for the total differential with the rules for the component differentials, and we won't have much use for it. The sum rule, however, is important (in all three forms), if obvious:
$$\vec{\nabla}(f + g) = \vec{\nabla} f + \vec{\nabla} g, \qquad \vec{\nabla} \cdot (\vec{A} + \vec{B}) = \vec{\nabla} \cdot \vec{A} + \vec{\nabla} \cdot \vec{B}, \qquad \vec{\nabla} \times (\vec{A} + \vec{B}) = \vec{\nabla} \times \vec{A} + \vec{\nabla} \times \vec{B}.$$
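The sum rule for the divergence can be verified numerically; the sketch below (sample fields are arbitrary choices, not from the text) compares $\vec{\nabla} \cdot (\vec{A} + \vec{B})$ against $\vec{\nabla} \cdot \vec{A} + \vec{\nabla} \cdot \vec{B}$ by central differences.

```python
# Numerical check of the sum rule for the divergence (one of the three forms),
# with two arbitrary polynomial sample fields.

def divergence(F, p, h=1e-5):
    """Central-difference divergence of vector field F at point p."""
    total = 0.0
    for i in range(3):
        q, r = list(p), list(p)
        q[i] += h
        r[i] -= h
        total += (F(q)[i] - F(r)[i]) / (2 * h)
    return total

A = lambda p: [p[1] * p[2], p[0] ** 2, p[2]]
B = lambda p: [p[0], -p[1], p[0] * p[1]]
S = lambda p: [a + b for a, b in zip(A(p), B(p))]  # the field A + B

p = [1.0, 2.0, 3.0]
lhs = divergence(S, p)
rhs = divergence(A, p) + divergence(B, p)
print(abs(lhs - rhs) < 1e-8)  # True: div(A + B) = div A + div B
```

The gradient and curl versions follow the same pattern, since all three operators are built from partial derivatives, which are linear.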