Tensor Calculus

Tensors are the mathematical objects that allow us to express physical laws in a form that is independent of the choice of coordinates.

What is a Tensor?

A tensor is a mathematical object that transforms in a specific way under coordinate transformations. The key property is that tensor equations are covariant - they have the same form in all coordinate systems.

You're already familiar with some tensors:

  • Scalars (rank-0 tensors) - single numbers like temperature
  • Vectors (rank-1 tensors) - quantities with magnitude and direction
  • Matrices (rank-2 tensors) - 2D arrays of numbers
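
In a fixed coordinate system, the components of each of these can be stored as a NumPy array whose number of dimensions equals the tensor's rank. A minimal sketch (variable names and sample values are illustrative only); keep in mind that the array holds only components - it is the transformation law below that makes the object a tensor:

```python
import numpy as np

temperature = np.float64(293.15)          # rank 0: a single number, no indices
velocity = np.array([1.0, 0.0, 2.0])      # rank 1: one index
stress = np.eye(3)                        # rank 2: two indices

# ndim counts the indices needed to address one component
print(temperature.ndim, velocity.ndim, stress.ndim)  # 0 1 2
```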

Index Notation

In GR, we use index notation to work with tensors. Indices can be:

  • Superscripts (upper indices) - contravariant components: $V^\mu$
  • Subscripts (lower indices) - covariant components: $V_\mu$

Vector in contravariant form
$$V^\mu = (V^0, V^1, V^2, V^3)$$

In 4D spacetime, indices run from 0 to 3, where 0 is the time component and 1, 2, 3 are spatial components.

Einstein Summation Convention

Einstein introduced a powerful notational shortcut: when an index appears twice in a term, once as a superscript and once as a subscript, we sum over that index.

Summation Convention
$$A^\mu B_\mu = \sum_{\mu=0}^{3} A^\mu B_\mu = A^0 B_0 + A^1 B_1 + A^2 B_2 + A^3 B_3$$

A repeated index (one up, one down) is called a dummy index or contracted index. This makes tensor equations much more compact.
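
NumPy's einsum uses exactly this convention: a repeated index letter is summed over. A quick sketch with arbitrary sample components (indices run 0 to 3, with 0 the time component):

```python
import numpy as np

A_up = np.array([1.0, 2.0, 0.5, -1.0])    # A^mu
B_down = np.array([3.0, 0.0, 4.0, 2.0])   # B_mu

# Explicit sum over the dummy index mu
explicit = sum(A_up[mu] * B_down[mu] for mu in range(4))

# einsum: the repeated letter 'm' is contracted automatically
contracted = np.einsum('m,m->', A_up, B_down)

print(explicit, contracted)  # both give A^0 B_0 + A^1 B_1 + A^2 B_2 + A^3 B_3
```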

Transformation Laws

The defining property of tensors is how they transform. Under a coordinate transformation $x^\mu \to x'^\mu$:

Contravariant Vector Transformation
$$V'^\mu = \frac{\partial x'^\mu}{\partial x^\nu} V^\nu$$
Covariant Vector Transformation
$$V'_\mu = \frac{\partial x^\nu}{\partial x'^\mu} V_\nu$$

Notice that contravariant and covariant indices transform with mutually inverse Jacobian matrices. This is why we distinguish them with upper and lower index positions.
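
As a concrete check, here is a small numerical sketch (my own example, using a 2D Cartesian-to-polar change of coordinates): the contravariant vector transforms with the Jacobian, the covariant one with the transpose of the inverse Jacobian, and the contraction $V^\mu W_\mu$ comes out the same in both coordinate systems:

```python
import numpy as np

x, y = 1.0, 2.0
r = np.hypot(x, y)

# Jacobian J[mu, nu] = d x'^mu / d x^nu for x'^mu = (r, phi)
J = np.array([[x / r,     y / r],
              [-y / r**2, x / r**2]])

# Inverse Jacobian: (J_inv)[nu, mu] = d x^nu / d x'^mu
J_inv = np.linalg.inv(J)

V_contra = np.array([3.0, -1.0])   # contravariant components V^nu (sample values)
W_cov = np.array([0.5, 2.0])       # covariant components W_nu (sample values)

V_prime = J @ V_contra            # V'^mu = (dx'^mu/dx^nu) V^nu
W_prime = J_inv.T @ W_cov         # W'_mu = (dx^nu/dx'^mu) W_nu

print(V_contra @ W_cov)    # scalar in Cartesian coordinates
print(V_prime @ W_prime)   # same scalar in polar coordinates
```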

Higher Rank Tensors

A general tensor of type $(m, n)$ has $m$ contravariant and $n$ covariant indices:

Type (2,1) Tensor
$$T^{\mu\nu}_{\phantom{\mu\nu}\rho}$$

Each index transforms independently according to its position:

General Transformation
$$T'^{\mu\nu}_{\phantom{\mu\nu}\rho} = \frac{\partial x'^\mu}{\partial x^\alpha} \frac{\partial x'^\nu}{\partial x^\beta} \frac{\partial x^\gamma}{\partial x'^\rho} T^{\alpha\beta}_{\phantom{\alpha\beta}\gamma}$$
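
Numerically, this is a single einsum with one Jacobian factor per index. A sketch using a random invertible linear coordinate change (the matrices and tensor components are arbitrary sample values):

```python
import numpy as np

rng = np.random.default_rng(0)

# L[mu, alpha] = dx'^mu / dx^alpha for a linear coordinate change,
# Linv[gamma, rho] = dx^gamma / dx'^rho is its inverse
L = rng.normal(size=(4, 4))
Linv = np.linalg.inv(L)

T = rng.normal(size=(4, 4, 4))   # T^{alpha beta}_gamma

# Each index picks up its own factor: upper indices use L, lower use Linv
T_prime = np.einsum('ma,nb,gr,abg->mnr', L, L, Linv, T)
print(T_prime.shape)  # (4, 4, 4)
```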

Tensor Operations

Contraction

Setting an upper and lower index equal and summing reduces the rank by 2:

$$T^\mu_{\phantom{\mu}\mu} = T^0_{\phantom{0}0} + T^1_{\phantom{1}1} + T^2_{\phantom{2}2} + T^3_{\phantom{3}3}$$
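
In einsum, contraction is just a repeated index letter; for a type (1,1) tensor it reduces to the ordinary matrix trace. A quick sketch with sample values:

```python
import numpy as np

T = np.arange(16.0).reshape(4, 4)   # mixed tensor T^mu_nu (sample values)

trace = np.einsum('mm->', T)        # set upper index = lower index and sum
print(trace, np.trace(T))           # rank (1,1) -> rank 0 scalar
```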

Outer Product

Multiplying the components of two tensors produces a tensor whose rank is the sum of the individual ranks:

$$C^{\mu\nu} = A^\mu B^\nu$$
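
The outer product also maps directly onto einsum, this time with no repeated index (sample components, chosen arbitrarily):

```python
import numpy as np

A = np.array([1.0, 2.0, 0.0, -1.0])   # A^mu
B = np.array([0.5, 3.0, 1.0, 2.0])    # B^nu

# No index is repeated, so nothing is summed: rank 1 + rank 1 -> rank 2
C = np.einsum('m,n->mn', A, B)
print(C.shape)  # (4, 4)
```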

Parallel Transport on Curved Surfaces

Vectors change direction when parallel transported around a closed loop on a curved surface. This is how curvature is detected!

The direction of the deflection reveals the sign of the curvature: positive curvature = bowl or sphere-like shape, negative = saddle shape, zero = flat.
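
To make this concrete, here is a numerical sketch (my own construction, not from the text): integrating the parallel-transport equation $dV^\mu/dt + \Gamma^\mu_{\alpha\beta} V^\alpha \, dx^\beta/dt = 0$ once around a circle of latitude on the unit sphere. The transported vector returns rotated by $2\pi(1 - \cos\theta_0)$, exactly the solid angle enclosed by the loop:

```python
import numpy as np

# Colatitude of the loop and step count are arbitrary choices
theta0 = np.pi / 4
cos0, sin0 = np.cos(theta0), np.sin(theta0)

def rhs(V):
    # Along the curve theta = theta0, phi = t, the only surviving
    # Christoffel terms are Gamma^theta_{phi phi} = -sin*cos and
    # Gamma^phi_{theta phi} = cot(theta)
    dV_theta = sin0 * cos0 * V[1]
    dV_phi = -(cos0 / sin0) * V[0]
    return np.array([dV_theta, dV_phi])

V = np.array([1.0, 0.0])   # start pointing along e_theta
n = 10000
h = 2 * np.pi / n
for _ in range(n):         # classic RK4 integration around the loop
    k1 = rhs(V)
    k2 = rhs(V + 0.5 * h * k1)
    k3 = rhs(V + 0.5 * h * k2)
    k4 = rhs(V + h * k3)
    V += (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

# Rotation angle in the orthonormal frame (V^theta, sin(theta0) V^phi)
rotation = np.arctan2(sin0 * V[1], V[0]) % (2 * np.pi)
print(rotation, 2 * np.pi * (1 - cos0))  # both ~1.8403 for theta0 = pi/4
```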

Why Tensors Matter in GR

The principle of general covariance states that the laws of physics should take the same form in all coordinate systems. Tensor equations automatically satisfy this requirement - if a tensor equation holds in one coordinate system, it holds in all of them!