## Thursday, 22 November 2018

### Rules for tensors, matrices and indices

This all started because I was confused about the difference between co- and contra-variant vectors and how to turn tensors into matrices. There seemed to be conflicting definitions, most of which I have now resolved. The exploration took almost two months, when I had intended a few days. I have learnt a lot.

During the two months I created some Word macros to help write 211 equations. I also heavily revised Exercise 1.07 Tensors and Vectors and Commentary on Appendix A: Mapping S2 and R3, and had a lively discussion on Physics Forums: Question on co- / contra-variant coordinates.

I have now found about 25 useful rules for tensors, vectors, indices and matrices. The rules come first, then many notes and examples, and finally a contents at the end.

#### The rules

1. A tensor of type (or rank) (k, l) has k upper indices and l lower indices.
2. If we just say a tensor has rank z, we are being a bit vague: z = k + l.
3. A scalar is a type (0,0) tensor. Scalars are invariant under coordinate changes.
4. A vector (or contravariant vector) is a type (1,0) tensor, e.g. V^i. Each V^i is a component of the vector V.
5. A dual vector (or one-form or covariant vector or covector) is a type (0,1) tensor, e.g. ω_i.
6. Upper indices are co↑travariant (contravariant), lower indices are co↓ariant (covariant).
7. Both kinds of vectors can be written as column or row matrices.
8. When we say a vector is covariant or contravariant we really mean its components are covariant or contravariant. A vector is invariant under coordinate changes. Its components are not. Therefore length and velocity are not real vectors in Minkowski spacetime.
9. A rank 2 tensor can be written as a two dimensional matrix. Components must be written so that the first index indicates row components and the second index column components. Whether the indices are up or down is irrelevant for this rule.
10. By 'contracting' two tensors T^i_l, R^k_j we mean setting an upper index equal to a lower index (e.g. l = k), then multiplying and summing components. We would write this T^i_l R^l_j, which produces X^i_j. This is most easily done by multiplying the matrices T and R. Subject to these rules:
• When converting tensors to matrices for multiplication, the summed indices must be last and first (i.e. adjacent): T^i_k R^k_j, not T^i_k R^j_k. T^i_k R^j_k can be calculated without matrices, but it's horrid.
• If you know the matrix T for a rank 2 tensor T^i_k, then the matrix of the tensor with indices reversed, T^k_i, is the transpose of T, or Tᵀ.
• You can only contract upper and lower indices. Contracting two upper or two lower indices would give something that is not a proper tensor.
• The 3 bullets above even apply to the partial derivative matrix ∂_j a^i.
• The metric η_{μν} is a type (0,2) tensor and it lowers an index.
The inverse metric η^{μν} raises an index.
η^{μν} η_{νκ} = δ^μ_κ, the Kronecker delta (identity matrix).
Replace η by g when in curved spacetime.
• The order of the tensors does not matter (unlike matrices): T^i_k R^k_j = R^k_j T^i_k always; TR = RT rarely. The same applies for tensors of any rank, including vectors.
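The bullets above can be checked numerically. A minimal NumPy sketch (my own, not from the commentary; the tensor components are random placeholders):

```python
# Sketch: contracting rank-2 tensors as matrix products with NumPy.
import numpy as np

rng = np.random.default_rng(0)
T = rng.integers(-3, 4, size=(4, 4)).astype(float)  # components T^i_k
R = rng.integers(-3, 4, size=(4, 4)).astype(float)  # components R^k_j

# Summed indices adjacent: X^i_j = T^i_k R^k_j is plain matrix multiplication.
X = np.einsum('ik,kj->ij', T, R)
assert np.allclose(X, T @ R)

# Summed indices not adjacent: T^i_k R^j_k needs a transpose first.
Y = np.einsum('ik,jk->ij', T, R)
assert np.allclose(Y, T @ R.T)

# Order of the tensors does not matter (components are plain numbers),
# even though the matrix products T @ R and R @ T generally differ.
assert np.allclose(np.einsum('kj,ik->ij', R, T), X)

# The metric lowers an index, the inverse metric raises one, and their
# product is the Kronecker delta (identity matrix).
eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # eta_{mu nu}
eta_inv = np.linalg.inv(eta)           # eta^{mu nu} (equals eta here)
assert np.allclose(eta_inv @ eta, np.eye(4))
```

The `einsum` subscript strings mirror the index notation directly, which makes the adjacency rule easy to see: `'ik,kj->ij'` is an ordinary matrix product, `'ik,jk->ij'` is not.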
11. Co-/contra- variant vector components combined give correct magnitude (length) and dot products.
• Norm (= length²): |v|² = η_{μν} V^ν V^μ = V_μ V^μ.
• Dot product: v∙w = η_{μν} V^ν W^μ = V_μ W^μ.
• The 2 bullets above only work on flat manifolds (all η_{μν} are constant).
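The two formulas can be verified with a small NumPy sketch (my own example; the component values of V and W are arbitrary assumptions):

```python
# Sketch: norm and dot product in flat Minkowski spacetime,
# signature (-, +, +, +).
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # eta_{mu nu}

V = np.array([2.0, 1.0, 0.0, 0.0])    # contravariant components V^mu
W = np.array([3.0, 0.0, 2.0, 0.0])    # contravariant components W^mu

V_lower = eta @ V                     # V_mu = eta_{mu nu} V^nu

# Norm: |v|^2 = eta_{mu nu} V^nu V^mu = V_mu V^mu
norm_sq = V @ eta @ V
assert np.isclose(norm_sq, V_lower @ V)
print(norm_sq)   # -> -3.0 (negative, so V is timelike)

# Dot product: v.w = eta_{mu nu} V^nu W^mu = V_mu W^mu
dot = V @ eta @ W
assert np.isclose(dot, V_lower @ W)
print(dot)       # -> -6.0
```

Note the contraction must pair one upper with one lower index: `V @ V` (two upper indices) would give 5.0, which is not the invariant norm.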
12. More tensor rules
• Raising or lowering an index does not change its left-right order. Free indices must be the same on both sides of the equation. Dummy indices appear on only one side of the equation; they are being summed over.
Correct: T^{αβμ}_δ = η^{μγ} T^{αβ}_{γδ}.
Wrong: T^{αβ}_δ^μ = η^{μγ} T^{αβ}_{γδ}. Indices changed order.
Wrong: T^{αβμ}_θ = η^{μγ} T^{αβ}_{γδ}. Free indices not the same.
Wrong: T^{αβμ}_γ = η^{μγ} T^{αβ}_{γδ}. Dummy index on both sides.
• You may raise and lower dummy indices simultaneously: A_λ B^λ = A^σ B_σ.
• There are yet more rules.
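The index rules in 12 can be spot-checked with `np.einsum`, which makes the index bookkeeping explicit. A sketch (my own, assuming a random rank-4 tensor and the Minkowski metric):

```python
# Sketch: checking index-raising rules with np.einsum.
import numpy as np

rng = np.random.default_rng(1)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # eta_{mu nu}
eta_inv = np.linalg.inv(eta)           # eta^{mu nu}

# Components T^{alpha beta}_{gamma delta}, axes in left-to-right index order.
T = rng.standard_normal((4, 4, 4, 4))

# Correct raising keeps the slot order:
# T^{alpha beta mu}_delta = eta^{mu gamma} T^{alpha beta}_{gamma delta}
T_raised = np.einsum('mg,abgd->abmd', eta_inv, T)
assert T_raised.shape == (4, 4, 4, 4)

# Raising and lowering a dummy pair simultaneously leaves the
# contraction unchanged: A_sigma B^sigma = A^sigma B_sigma
A = rng.standard_normal(4)             # A^mu
B = rng.standard_normal(4)             # B^mu
A_lower = eta @ A                      # A_mu
B_lower = eta @ B                      # B_mu
assert np.isclose(A_lower @ B, A @ B_lower)
```

Putting the summed index pair in the einsum string (`g` above) and the free indices on both sides (`abmd`) enforces the free-index and dummy-index rules mechanically.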
For these rules and over 34 pages of examples and notes see
Commentary 1.1 Tensors matrices and indexes.pdf.