
Tensors in 4 Dimensions

Let us now consider the specific nature of tensors on four-dimensional space-time. Tensors of rank $k$ are categorized (for each coordinate index) by their transformation properties relative to a transformation of the underlying coordinate system $x \to x'$ as defined above. This transformation is implicit in all of the discussion below.

A scalar (tensor of rank zero) is unchanged by such a transformation. This is not a trivial statement! It is trivial for scalar numbers like $ \pi$ , no doubt, but in physics the interesting part of this requirement occurs when discussing the scalars that result algebraically from fully contracting products of tensors over all of their indices using the metric tensor. This will be made quite clear below.

For a vector (tensor of rank one) we have two possibilities. Either it transforms like the coordinate itself and we have a

contravariant vector
$(A^0,A^1,A^2,A^3)$ such that
\[
A'^\alpha = \frac{\partial x'^\alpha}{\partial x^\beta}\, A^\beta
\]
(noting that all the indices are on top, along with the new primed coordinate). This makes the differential transformation relationship to the underlying ordinary (contravariant) coordinates explicit and is obviously an identity for those coordinates.

Alternatively, we have a

covariant vector
$(B_0,B_1,B_2,B_3)$ such that
\[
B'_\alpha = \frac{\partial x^\beta}{\partial x'^\alpha}\, B_\beta
\]
(with the coordinate indices on top and the new primed coordinate on the bottom). Again, note that this is precisely what we expect - the transformation is in the opposite sense of that of the underlying coordinates. We need in both cases, of course, to figure out the matrix of e.g. $\partial x^\beta/\partial x'^\alpha$ explicitly.
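These two transformation laws are easy to verify numerically. The following NumPy sketch (the boost velocity and the component values are arbitrary illustrative choices, not anything fixed by the text) transforms a contravariant and a covariant four-vector using the Jacobian of a Lorentz boost along $x$:

```python
import numpy as np

# Jacobian d x'^alpha / d x^beta for a Lorentz boost along x
# (an illustrative choice; any invertible Jacobian works the same way)
beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)          # gamma = 1.25
L = np.array([[gamma, -gamma * beta, 0.0, 0.0],
              [-gamma * beta, gamma, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
Linv = np.linalg.inv(L)                       # d x^beta / d x'^alpha

A = np.array([1.0, 2.0, 3.0, 4.0])            # contravariant components A^beta
B = np.array([5.0, 6.0, 7.0, 8.0])            # covariant components B_beta

# A'^alpha = (d x'^alpha / d x^beta) A^beta -- transforms with L
A_prime = np.einsum('ab,b->a', L, A)

# B'_alpha = (d x^beta / d x'^alpha) B_beta -- transforms with the
# inverse Jacobian, i.e. in the opposite sense of the coordinates
B_prime = np.einsum('ba,b->a', Linv, B)
```

Note that the covariant components are hit with the inverse Jacobian (summed over its first index), which is exactly the "opposite sense" transformation described above.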

In a moment we will see exactly what the difference is between these two types of first rank tensors. First, however, we should note that

contravariant tensors of rank 2
transform like
\[
F'^{\alpha\beta} = \frac{\partial x'^\alpha}{\partial x^\gamma}\,
\frac{\partial x'^\beta}{\partial x^\delta}\, F^{\gamma\delta} .
\]
Similarly, we have
covariant tensors of rank 2
\[
G'_{\alpha\beta} = \frac{\partial x^\gamma}{\partial x'^\alpha}\,
\frac{\partial x^\delta}{\partial x'^\beta}\, G_{\gamma\delta}
\]
and
mixed tensors of rank 2
\[
H'^\alpha{}_\beta = \frac{\partial x'^\alpha}{\partial x^\gamma}\,
\frac{\partial x^\delta}{\partial x'^\beta}\, H^\gamma{}_\delta .
\]

It is clearly a trivial exercise to determine the co/contravariant transformation properties of higher rank tensors. We can form higher rank tensors by means of an outer (dyadic) product, where we simply take two tensors of some rank and multiply them out componentwise, preserving products of any underlying basis vectors as they occur. For example, we can construct a second rank tensor by
\[
F^{\alpha\beta} = A^\alpha B^\beta
\]
where $\alpha$ and $\beta$ run over the full range of index values. Note well that this defines a square matrix, in this case of basis-vector dyads, as objects such as $\hat{x}\hat{x}$, $\hat{x}\hat{y}$, ... occur.
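At the component level the dyadic product is just an outer product of arrays, and the resulting object automatically transforms as a contravariant rank 2 tensor. A sketch (the Jacobian is an assumed boost, the components arbitrary; basis dyads are left implicit):

```python
import numpy as np

# Dyadic (outer) product of two contravariant four-vectors:
# F^{alpha beta} = A^alpha B^beta
A = np.array([1.0, 2.0, 3.0, 4.0])
B = np.array([5.0, 6.0, 7.0, 8.0])
F = np.outer(A, B)            # F[a, b] = A[a] * B[b], a 4x4 component array

# Transforming F with two Jacobian factors (one per index) is the
# same as transforming A and B separately and then taking the outer
# product -- exactly the rank-2 contravariant transformation law.
gamma, gb = 1.25, 0.75        # illustrative boost Jacobian (beta = 0.6)
L = np.array([[gamma, -gb, 0.0, 0.0],
              [-gb, gamma, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
F_prime = np.einsum('ag,bd,gd->ab', L, L, F)
```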

One important question is whether all e.g. second rank tensors can be written as products of first rank tensors. This is not possible in general, but it will be in many of our uses of these ideas in physics. In this case the generalized product forms a division algebra where we can factor e.g. second rank tensors into first rank tensors in various ways. Division algebras are discussed in the Mathematical Physics section as well, and interested students should return there to read about geometric algebras, the result of fully generalizing the notion of complex numbers to complex spaces of arbitrary dimension while preserving the factorizability of the algebraic objects.

In addition to extending the rank of tensor objects by forming dyadic, triadic, or n-adic products of tensors, we can reduce the rank of tensors by means of a process called contraction. A contraction of two tensors is the result of setting two of the indices (typically they must be a covariant/contravariant pair) to be equal and performing the Einstein summation over the shared range. Each such contraction removes one upper and one lower index, reducing the rank of the full expression by two; hence the term ``contraction''. An expression can be contracted over several pairs of indices at a time when doing algebra, so second rank tensors can be contracted to form a 4-scalar, for example, or third rank tensors can be contracted to first rank.
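In component terms a contraction is a sum over a repeated index, which NumPy's `einsum` expresses directly. A minimal sketch (the components are assumed random values for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A mixed rank-2 tensor H^a_b contracted over its upper/lower pair
# gives a scalar, H^a_a -- the trace of the component matrix.
H = rng.standard_normal((4, 4))
scalar = np.einsum('aa->', H)          # rank 2 -> rank 0

# A rank-3 tensor contracted over one up/down pair of its indices
# drops to rank 1 (a first rank tensor).
T = rng.standard_normal((4, 4, 4))
vector = np.einsum('abb->a', T)        # rank 3 -> rank 1
```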

Our familiar notion of multiplying a vector by a matrix to produce a vector, in proper tensor language, is to form the outer product of the matrix (second rank tensor) and the vector (first rank tensor), set the rightmost index of the matrix equal to the index of the vector, and sum over that shared index to produce the resulting first rank tensor.
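This identification of matrix-vector multiplication with "outer product followed by contraction" is one line in index notation (the matrix and vector components below are arbitrary illustrative values):

```python
import numpy as np

M = np.arange(16.0).reshape(4, 4)   # "matrix": second rank tensor M^a_b
v = np.array([1.0, 2.0, 3.0, 4.0])  # "vector": contravariant v^b

# The outer product M^a_b v^c has rank 3; setting c = b and summing
# over the shared index (contracting) yields the familiar product:
w = np.einsum('ab,b->a', M, v)      # identical to M @ v
```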

Hence we define our scalar product to be the contraction of a covariant and a contravariant vector:
\[
B \cdot A = B_\alpha A^\alpha .
\]
Note that I've introduced a sort of ``sloppy'' convention that a single quantity like $B$ or $A$ can be a four-vector in context. Clearly the expression on the right side is less ambiguous!

Then:
\begin{eqnarray*}
B' \cdot A' & = & \frac{\partial x^\gamma}{\partial x'^\alpha}\, B_\gamma\,
\frac{\partial x'^\alpha}{\partial x^\delta}\, A^\delta \\
& = & \frac{\partial x^\gamma}{\partial x^\delta}\, B_\gamma A^\delta \\
& = & \delta^\gamma_{\ \delta}\, B_\gamma A^\delta \\
& = & B_\delta A^\delta = B \cdot A
\end{eqnarray*}
and the desired invariance property is proved. Hmmm, that was pretty easy! Maybe there is something to this notation thing after all!
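The same chain of steps can be checked numerically. Here is a sketch assuming an illustrative boost Jacobian and arbitrary four-vector components; the middle step, where the two Jacobians contract to the Kronecker delta, is checked explicitly:

```python
import numpy as np

# An assumed boost Jacobian L = dx'/dx (beta = 0.8, gamma = 5/3)
beta = 0.8
gamma = 1.0 / np.sqrt(1.0 - beta**2)
L = np.array([[gamma, -gamma * beta, 0.0, 0.0],
              [-gamma * beta, gamma, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
Linv = np.linalg.inv(L)               # dx/dx'

# The key step of the proof: the two Jacobians contract to the
# Kronecker delta, (dx^g/dx'^a)(dx'^a/dx^d) = delta^g_d
delta = np.einsum('ga,ad->gd', Linv, L)

# And the scalar product is therefore unchanged by the transformation
A = np.array([1.0, -2.0, 0.5, 3.0])   # contravariant components
B = np.array([4.0, 1.0, -1.0, 2.0])   # covariant components
A_prime = np.einsum('ab,b->a', L, A)
B_prime = np.einsum('ba,b->a', Linv, B)
```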


Robert G. Brown 2017-07-11