Talk:Tensor/Old version


Tensors are quantities that describe how components transform between coordinate systems. They were resurrected from obscurity by Einstein in order to formulate general relativity in such a way that the physical laws it describes are independent of the coordinate system chosen.

See Tensor/Alternate and Tensor for alternative treatments of tensors.

Throughout we will use the Einstein summation convention in which repeated indices imply a sum over components. Capitalized indices will likewise be summed. The reader should already be familiar with vector spaces and their properties.
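For example, with this convention the expansion of a vector in a basis can be written compactly; the repeated index i below is summed from 1 to 3 (a small illustrative formula, using the notation introduced in the example that follows):

<math>A^{i} E_{i} \equiv \sum_{i=1}^{3} A^{i} E_{i} = A^{1} E_{1} + A^{2} E_{2} + A^{3} E_{3}</math>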

An example in 3-space follows:

Let A be a vector and let {E_1, E_2, E_3} be a set of linearly independent vectors; by the properties of a vector space, any such set spans the whole 3-dimensional space.

A = A^1 E_1 + A^2 E_2 + A^3 E_3

Finding the components A^i directly is rather painful: we have to project A onto each of these vectors and solve n equations in n unknowns. We can do this in a computationally simpler (and ultimately much clearer) way by finding the reciprocal vectors of our set {E_i}, with i running from 1 to n.
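As a concrete illustration of the direct approach, here is a minimal numerical sketch (the basis vectors and the vector A below are made-up example values, and numpy is assumed):

 import numpy as np
 
 # Hypothetical, non-orthonormal basis {E_1, E_2, E_3}; row i of E is E_(i+1).
 E = np.array([[1.0, 0.0, 0.0],
               [1.0, 1.0, 0.0],
               [1.0, 1.0, 1.0]])
 A = np.array([2.0, 3.0, 5.0])   # the vector whose components we want
 
 # A = A^1 E_1 + A^2 E_2 + A^3 E_3 is 3 equations in the 3 unknowns A^i;
 # in matrix form this reads E^T x = A.
 components = np.linalg.solve(E.T, A)
 
 # Check the expansion: sum_i A^i E_i reproduces A.
 assert np.allclose(components @ E, A)
 print(components)   # -> [-1. -2.  5.]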

The reciprocal vectors can be found as follows. We take

E^1 = (E_2 × E_3) / V, where V is the volume of the parallelepiped defined by the three E vectors. This volume can be found as the triple product, which can be written as E_1 · (E_2 × E_3), or as the determinant of the matrix of components |E_IJ| with I, J = 1, 2, 3 (the determinant actually computes a volume!).

Exercise: What can you say about three vectors whose triple product equals 0 (or, equivalently, whose matrix determinant equals 0)?

Likewise, E^2 = (E_3 × E_1) / V and
E^3 = (E_1 × E_2) / V.
This construction can be generalized to an n-dimensional space, keeping in mind that the operations must be carried out in cyclic order (1 → 2 → 3 → 1).
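The same construction in code, continuing the hypothetical basis from the sketch above (np.cross, np.dot and np.linalg.det supply the cross product, dot product and determinant):

 import numpy as np
 
 E1 = np.array([1.0, 0.0, 0.0])
 E2 = np.array([1.0, 1.0, 0.0])
 E3 = np.array([1.0, 1.0, 1.0])
 
 # Volume of the parallelepiped: triple product E_1 . (E_2 x E_3),
 # equivalently the determinant of the matrix whose rows are E_1, E_2, E_3.
 V = np.dot(E1, np.cross(E2, E3))
 assert np.isclose(V, np.linalg.det(np.array([E1, E2, E3])))
 
 # Reciprocal vectors, taken in cyclic order.
 R1 = np.cross(E2, E3) / V
 R2 = np.cross(E3, E1) / V
 R3 = np.cross(E1, E2) / V
 
 # Defining property: R^i . E_j equals 1 when i == j and 0 otherwise.
 R = np.array([R1, R2, R3])
 assert np.allclose(R @ np.array([E1, E2, E3]).T, np.eye(3))
 
 # The components of A are now just dot products: A^i = R^i . A.
 A = np.array([2.0, 3.0, 5.0])
 print(R @ A)   # -> [-1. -2.  5.], matching the linear-solve result above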


In linear algebra and abstract algebra, the tensor product of two vector spaces V and W is a vector space T, together with a bilinear operator B: V x W -> T, such that for every bilinear operator C: V x W -> X there exists a unique linear operator L: T -> X with C = L o B.

The tensor product is, up to isomorphism, uniquely specified by this requirement. Using a rather involved construction, one can show that a tensor product exists for any two vector spaces. The space T is generated by the image of B.
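For finite-dimensional real vector spaces, one concrete model of the tensor product takes T to be a space of matrices and B to be the outer product; a minimal sketch under that assumption (the particular vectors are arbitrary examples):

 import numpy as np
 
 # Model: V = R^2, W = R^3, T = V (x) W realized as the 2x3 matrices,
 # with the bilinear operator B(v, w) = v w^T (the outer product).
 def B(v, w):
     return np.outer(v, w)
 
 v, v2 = np.array([1.0, 2.0]), np.array([0.0, 1.0])
 w = np.array([3.0, 0.0, 1.0])
 a = 5.0
 
 # Bilinearity: B is linear in each argument separately.
 assert np.allclose(B(a * v + v2, w), a * B(v, w) + B(v2, w))
 assert np.allclose(B(v, a * w), a * B(v, w))
 
 # T is generated by the image of B: B(e_i, e_j) gives every elementary
 # 2x3 matrix, and these span the whole space.
 print(B(v, w))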

A tensor product can be defined similarly between a right module M and a left module N over the same ring R; the only thing that changes is that the bilinear operator is replaced by an R-balanced map, one satisfying B(mr, n) = B(m, rn). If M also carries a right module structure over a ring S (commuting with its R-action), then the tensor product inherits a natural right S-module structure; similarly, if N also carries a left module structure over a ring S', then the tensor product inherits a natural left S'-module structure. The ring R must be specified, and it matters for the definition of the tensor product.

For free modules, the rank of the tensor product is the product of the ranks of the factors (for vector spaces, the dimension of the tensor product is the product of the dimensions). It is possible to generalize the definition to a tensor product of any number of spaces, but this is unnecessary: by universality, the tensor product is associative, so the tensor product of V, W and X is the tensor product of V with the tensor product of W and X.
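The dimension statement and the associativity can be checked concretely with the Kronecker product, which realizes the tensor product of coordinate vectors (again just an illustrative sketch):

 import numpy as np
 
 v = np.array([1.0, 2.0])        # dim V = 2
 w = np.array([3.0, 4.0, 5.0])   # dim W = 3
 x = np.array([6.0, 7.0])        # dim X = 2
 
 # dim(V (x) W) = dim V * dim W
 assert np.kron(v, w).size == v.size * w.size          # 6 == 2 * 3
 
 # Associativity: (v (x) w) (x) x == v (x) (w (x) x) as coordinate vectors.
 assert np.allclose(np.kron(np.kron(v, w), x),
                    np.kron(v, np.kron(w, x)))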

See also: Tensor