Contraction

At a Glance

  • Contraction generalizes vector inner products and matrix multiplication to higher dimensions.
  • Sums over matching pairs of dimensions between two tensors.
  • The fundamental operation for composing tensors in networks and quantum algorithms.
  • Contracted dimensions must have matching sizes.
  • Result contains only the uncontracted dimensions from both input tensors.

Overview

Tensor contraction is a fundamental operation in multilinear algebra that generalizes familiar operations like vector inner products and matrix multiplication to higher-dimensional arrays. It is the primary way tensors are composed into more complex tensors.

Conceptually, contraction takes two tensors and "contracts" (sums over) one or more pairs of matching dimensions, similar to how matrix multiplication sums over the shared dimension between two matrices. This operation is essential in quantum mechanics, general relativity, and machine learning.

tip

For an introduction to tensors and their properties, see the Tensor concept page.

Mathematical Definition

Given two tensors $A$ and $B$, a contraction specifies:

  • Which dimensions of $A$ to contract (e.g., dimensions $[i_1, i_2, \ldots]$)
  • Which dimensions of $B$ to contract (e.g., dimensions $[j_1, j_2, \ldots]$)

The contracted dimensions must have matching sizes. The operation sums over all values in these dimensions, producing a result tensor whose dimensions are the uncontracted dimensions from both inputs.

Formally, for tensors $A$ and $B$:

$$C_{\alpha\gamma} = \sum_{\beta} A_{\alpha\beta} B_{\beta\gamma}$$

where $\alpha$, $\beta$, and $\gamma$ represent multi-indices for the uncontracted-left, contracted, and uncontracted-right dimensions respectively.
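To make the index groups concrete, here is how $\alpha$, $\beta$, and $\gamma$ map onto a call to the contract function used throughout this page (a sketch; the shapes are arbitrary):

// α = dims [0, 1] of A; β = dim 2 of A and dim 0 of B; γ = dims [1, 2] of B
var A = random_tensor([2, 3, 4], as_complex) // order 3 tensor
var B = random_tensor([4, 5, 6], as_complex) // order 3 tensor

var C = contract(A, B, [2], [0]) // sum over the shared size-4 dimension
print(C.shape()) // [2, 3, 5, 6]: the α dimensions followed by the γ dimensions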

Vector Inner Product as Contraction

The inner product of two vectors is the simplest example of tensor contraction. It takes two order 1 tensors and contracts their only dimension, producing a scalar.

Mathematical notation:

For vectors $\mathbf{u}, \mathbf{v} \in \mathbb{C}^n$:

$$\langle \mathbf{u} | \mathbf{v} \rangle = \sum_{i=0}^{n-1} u_i^* v_i$$

Code example:

// Vector inner product: contract the single dimension
var u = random_tensor([5], as_complex) // Vector of length 5
var v = random_tensor([5], as_complex) // Vector of length 5

var inner_product = contract_conjugate(u, v, [0], [0])

print(inner_product) // Complex order 0 tensor, a single scalar

Matrix-Matrix Product as Contraction

Matrix multiplication is a contraction over one shared dimension.

Mathematical notation:

For matrices $A \in \mathbb{C}^{m \times n}$ and $B \in \mathbb{C}^{n \times p}$:

$$C_{ik} = \sum_{j=0}^{n-1} A_{ij} B_{jk}$$

Code example:

// Matrix multiplication: contract one shared dimension
var A = random_tensor([3, 4], as_complex) // 3×4 matrix
var B = random_tensor([4, 5], as_complex) // 4×5 matrix

var C = contract(A, B, [1], [0]) // Contract dimension 1 of A with dimension 0 of B
print(C.shape()) // [3, 5]

Hermitian Inner Product with Conjugation

When working with complex tensors, we often need the conjugate contraction for Hermitian inner products. This specialized contraction kernel fuses the conjugation with the contraction, avoiding a separate conjugated copy of the tensor and thereby reducing both memory use and runtime.

Mathematical notation:

For tensors $A, B \in \mathbb{C}^{d_1 \times d_2}$:

$$\langle A, B \rangle = \sum_{i=0}^{d_1-1} \sum_{j=0}^{d_2-1} A_{ij}^* B_{ij}$$

Code example:

// Hermitian inner product: contract all dimensions with conjugation
var A = random_tensor([3, 4], as_complex)
var B = random_tensor([3, 4], as_complex)

var inner_product = contract_conjugate(A, B, [0, 1], [0, 1])
// Result is a 0-dimensional tensor (a scalar)
// Computes: Σᵢⱼ conj(A[i,j]) * B[i,j]

print(inner_product) // Complex order 0 tensor

Higher-Dimensional Tensor Contraction

Contractions generalize naturally to higher dimensions, which is essential for tensor networks.

Mathematical notation:

For a 3-tensor $T \in \mathbb{C}^{d_0 \times d_1 \times d_2}$ and a 2-tensor $M \in \mathbb{C}^{d_1 \times d_3}$:

$$R_{ikl} = \sum_{j=0}^{d_1-1} T_{ijk} M_{jl}$$

Code example:

// Contract a 3D tensor with a matrix
var T = random_tensor([2, 3, 4], as_complex) // 3D tensor
var M = random_tensor([3, 5], as_complex) // 2D matrix

var result = contract(T, M, [1], [0]) // Contract dimension 1 of T with dimension 0 of M
print(result.shape()) // [2, 4, 5]
// Computes: result[i,k,l] = Σⱼ T[i,j,k] * M[j,l]

Partial Conjugate Contraction

This operation computes a matrix product where the left-hand tensor is conjugated, but not transposed.

Mathematical notation:

For tensors $A \in \mathbb{C}^{m \times n}$ and $B \in \mathbb{C}^{n \times p}$:

$$C_{ik} = \sum_{j=0}^{n-1} A_{ij}^* B_{jk}$$

Code example:

// Partial conjugate contraction (adjoint-like operation)
var A = random_tensor([2, 3], as_complex) // 2×3 tensor
var B = random_tensor([3, 5], as_complex) // 3×5 tensor

var result = contract_conjugate(A, B, [1], [0])
print(result.shape()) // [2, 5]
// Computes: C[i,k] = Σⱼ conj(A[i,j]) * B[j,k]

Multiple Dimension Contraction

You can contract multiple dimensions simultaneously (useful in tensor network algorithms).

Mathematical notation:

For 4-tensors $A \in \mathbb{C}^{d_1 \times d_2 \times d_3 \times d_4}$ and $B \in \mathbb{C}^{d_2 \times d_3 \times d_5 \times d_6}$:

$$C_{imln} = \sum_{j=0}^{d_2-1} \sum_{k=0}^{d_3-1} A_{ijkm} B_{jkln}$$

Note the ordering of the result indices: the uncontracted dimensions of $A$ ($i$, $m$) come first, followed by those of $B$ ($l$, $n$).

Code example:

// Contract multiple dimensions at once
var A = random_tensor([2, 3, 4, 5], as_complex) // 4D tensor
var B = random_tensor([3, 4, 6, 7], as_complex) // 4D tensor

var result = contract(A, B, [1, 2], [0, 1]) // Contract dimensions [1,2] of A with [0,1] of B
print(result.shape()) // [2, 5, 6, 7]
// Computes: result[i,m,l,n] = Σⱼₖ A[i,j,k,m] * B[j,k,l,n]

Key Properties

  1. Associativity: Contractions can be performed in different orders (though computational efficiency and numerical accuracy may vary drastically)
  2. Dimension Matching: Contracted dimensions must have the same size
  3. Index Ordering: The order of uncontracted dimensions is preserved from left (first tensor) to right (second tensor)
  4. Generality: Many linear algebra operations (dot product, matrix multiply, trace, etc.) are special cases of contraction
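The associativity property can be checked directly with the contract function used above (a sketch; the two results agree only up to floating-point rounding):

// Two contraction orders for the matrix chain A · B · C
var A = random_tensor([2, 3], as_complex)
var B = random_tensor([3, 4], as_complex)
var C = random_tensor([4, 5], as_complex)

var left_first = contract(contract(A, B, [1], [0]), C, [1], [0]) // (A·B)·C
var right_first = contract(A, contract(B, C, [1], [0]), [1], [0]) // A·(B·C)
// Both results have shape [2, 5] and agree up to rounding error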

See Also

  • Tensor: Introduction to tensors and their properties
  • Factorization: Tensor decomposition methods

Documentation Contributors

Alexandre Foley