Tensor (intrinsic definition)
From Academic Kids

In mathematics, the modern component-free approach to the theory of tensors views tensors initially as abstract objects expressing some definite type of multilinear concept. Their well-known properties can be derived from their definitions, as linear maps or more generally; and the rules for manipulating tensors arise as an extension of linear algebra to multilinear algebra.
In differential geometry, an intrinsic geometric statement may be described by a tensor field on a manifold, and then need not make any reference to coordinates at all. The same is true in general relativity of tensor fields describing a physical property. The component-free approach is also used heavily in abstract algebra and homological algebra, where tensors arise naturally.
Note: This article gives a definition of the tensor product only for vector spaces with a chosen basis. The notion of tensor product can be generalised to vector spaces without a chosen basis, and even further, to modules. The article is nevertheless fairly abstract; if you find it baffling, try reading the main tensor article and the classical treatment first.
Definition
Let V and W be two vector spaces over a common field F, with bases {v_{i}} and {w_{j}} respectively. Their tensor product
 <math>V \otimes W</math>
is a vector space over the same field F together with a bilinear map
 <math>\otimes: V \times W \to V \otimes W</math>
with the basis
 <math> \{ \mathbf{v}_i \} \otimes \{ \mathbf{w}_j \} = \{ \mathbf{v}_i \otimes \mathbf{w}_j \} </math>
Note that here the same symbol <math>\otimes</math> has been used in two different, albeit related, senses: one between vector spaces, and one as the bilinear map.
If V and W are both finite-dimensional, then the dimension of <math>V \otimes W</math> is the product of the dimensions of V and W. By iteration, this tensor product can be applied to more than two vector spaces.
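As a coordinate-level sketch of this dimension count (NumPy is assumed here purely for illustration; none of these names appear in the article), once bases are chosen the components of <math>\mathbf{v} \otimes \mathbf{w}</math> in the basis <math>\{ \mathbf{v}_i \otimes \mathbf{w}_j \}</math> are given by the Kronecker product:

```python
import numpy as np

# Coordinate-level sketch: with bases fixed, the components of v ⊗ w in the
# basis {v_i ⊗ w_j} are the Kronecker product of the component arrays,
# so dim(V ⊗ W) = dim(V) * dim(W).
v = np.array([1.0, 2.0, 3.0])   # a vector in V, dim V = 3
w = np.array([4.0, 5.0])        # a vector in W, dim W = 2

vw = np.kron(v, w)              # components of v ⊗ w
print(vw.shape)                 # (6,) — the product of the dimensions

# Bilinearity of the map ⊗: (v + v') ⊗ w = v ⊗ w + v' ⊗ w
v2 = np.array([0.5, -1.0, 2.0])
assert np.allclose(np.kron(v + v2, w), np.kron(v, w) + np.kron(v2, w))
```

The Kronecker product is only one concrete realization; the abstract tensor product is defined without reference to any basis.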
A tensor on the vector space V is then defined to be an element of (i.e. a vector in) the following vector space:
 <math>V \otimes \cdots \otimes V \otimes V^* \otimes \cdots \otimes V^*</math>
where V* is the dual space of V.
If there are m copies of V and n copies of V* in our product, the tensor is said to be of type (m, n), of contravariant rank m and covariant rank n. The tensors of rank zero are just the scalars (elements of the field F), those of contravariant rank 1 are the vectors in V, and those of covariant rank 1 are the one-forms in V* (for this reason the elements of the last two spaces are often called the contravariant and covariant vectors).
Note that the (1,1) tensors
 <math>V \otimes V^*</math>
are isomorphic in a natural way to the space of linear transformations (i.e. matrices) from V to V. An inner product on a real vector space V, a bilinear map V × V → R, corresponds in a natural way to a (0,2) tensor in
 <math>V^* \otimes V^*</math>
called the associated metric and usually denoted g.
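A minimal sketch of a (0,2) tensor acting as a bilinear map (NumPy and all concrete values here are illustrative assumptions, not part of the article): in a basis, the metric is represented by its component matrix g_{ij}, and feeding in two vectors produces a scalar.

```python
import numpy as np

# Sketch: a (0,2) tensor g on V = R^3, i.e. a bilinear map V × V → R,
# is represented in a basis by the matrix of its components g_{ij}.
g = np.diag([1.0, 2.0, 3.0])    # components g_{ij} of a (diagonal) metric

u = np.array([1.0, 0.0, 1.0])
v = np.array([0.0, 1.0, 1.0])

print(u @ g @ v)                # g(u, v) = g_{ij} u^i v^j  →  3.0
```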
Alternate notation
Rather than writing out the full tensor product to denote the space of tensors of type (m,n), the literature often uses the abbreviation
 <math>T^m_n(V) = V \otimes \cdots \otimes V \otimes V^* \otimes \cdots \otimes V^*</math>
An alternate notation for this space is in terms of linear maps from a vector space V to a vector space W. Let
 <math>L(V,W)</math>
denote the space of all linear maps from V to W. Thus, for example, the dual space (the space of 1-forms) may be written as
 <math>V^* \approx L(V,\mathbb{R})</math>
The set of (m,n)-tensors can then be written as
 <math>T^m_n(V) \approx L(V^* \otimes \cdots \otimes V^* \otimes V \otimes \cdots \otimes V, \mathbb{R}) \approx L^{m+n}(V^*, \ldots, V^*, V, \ldots, V, \mathbb{R})</math>
Note that in the formula above, the roles of V and V^{*} are reversed. In particular, one has
 <math>T^1_0(V) \approx L(V^*,\mathbb{R}) \approx V</math>
and
 <math>T^0_1(V) \approx L(V,\mathbb{R}) \approx V^*</math>
and
 <math>T^1_1(V) \approx L(V,V)</math>
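The last isomorphism can be sketched concretely (NumPy and the specific matrix are assumptions for illustration): in a basis, a (1,1) tensor is just a matrix, acting on a vector by contracting its covariant slot.

```python
import numpy as np

# Sketch: a (1,1) tensor T^i_j on V = R^2 is, in a basis, a matrix acting
# on vectors by contraction of the covariant index: (T v)^i = T^i_j v^j.
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])     # a 90-degree rotation, as a (1,1) tensor

v = np.array([1.0, 0.0])
print(T @ v)                    # [0. 1.] — the image vector under the map
```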
The notation
 <math>GL(V,W)</math>
is often used to denote the space of invertible linear transformations from V to W; however there is no analogous notation for tensor spaces.
Tensor fields
See main article tensor field
Differential geometry, physics and engineering must often deal with tensor fields on smooth manifolds. The term tensor is in fact sometimes used as a shorthand for tensor field. A tensor field expresses the concept of a tensor that varies from point to point.
Basis
For any given coordinate system we have a basis {e_{i}} for the tangent space V (note that this may vary from point to point if the manifold is not linear), and a corresponding dual basis {e^{i}} for the cotangent space V* (see dual space). The difference between the raised and lowered indices is there to remind us of the way the components transform.
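The relationship between a basis and its dual basis can be sketched in coordinates (NumPy and the sample matrix are illustrative assumptions): writing the basis vectors as the columns of a matrix, the dual basis covectors are the rows of its inverse.

```python
import numpy as np

# Sketch: take the basis vectors e_i as the columns of E; the dual basis
# covectors e^i are then the rows of E^{-1}, so that e^i(e_j) = δ^i_j.
E = np.array([[1.0, 1.0],
              [0.0, 2.0]])      # columns are e_1, e_2

E_dual = np.linalg.inv(E)       # rows are e^1, e^2

print(E_dual @ E)               # identity matrix: e^i(e_j) = δ^i_j
```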
For example purposes, then, take a tensor A in the space
 <math>V \otimes V \otimes V^*</math>
The components relative to our coordinate system can be written
 <math>\mathbf{A} = A^{ij}_k (\mathbf{e}_i \otimes \mathbf{e}_j \otimes \mathbf{e}^k)</math>
Here we used the Einstein notation, a convention useful when dealing with coordinate equations: when an index variable appears both raised and lowered on the same side of an equation, we are summing over all its possible values. In physics we often use the expression
 <math>A^{ij}_k</math>
to represent the tensor, just as vectors are usually treated in terms of their components. This can be visualized as an n × n × n array of numbers. In a different coordinate system, say given to us as a basis {e_{i'}}, the components will be different. If (x^{i'}_{i}) is our transformation matrix (note that it is not a tensor, since it represents a change of basis rather than a geometrical entity) and if (y^{i}_{i'}) is its inverse, then the components transform according to
 <math>A^{i'j'}_{k'} = x^{i'}_i x^{j'}_j y^k_{k'} A^{ij}_k</math>
In older texts this transformation rule often serves as the definition of a tensor. Formally, this means that tensors were introduced as specific representations of the group of all changes of coordinate systems.
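The transformation rule above can be sketched numerically (NumPy, `np.einsum`, and the sample data are assumptions for illustration, not part of the article): Einstein summation contracts each repeated raised/lowered index pair.

```python
import numpy as np

# Sketch: transform the components A^{ij}_k of a (2,1) tensor on R^2 under
# a change of basis with matrix x^{i'}_i and inverse y^i_{i'}.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2, 2))   # A^{ij}_k in the old basis

x = np.array([[2.0, 1.0],
              [1.0, 1.0]])           # x^{i'}_i, an invertible change of basis
y = np.linalg.inv(x)                 # y^i_{i'}, its inverse

# A^{i'j'}_{k'} = x^{i'}_i x^{j'}_j y^k_{k'} A^{ij}_k  (Einstein summation)
A_new = np.einsum('ai,bj,kc,ijk->abc', x, x, y, A)

# Applying the inverse transformation recovers the original components,
# as expected of a geometric object described in two coordinate systems.
A_back = np.einsum('ai,bj,kc,ijk->abc', y, y, x, A_new)
assert np.allclose(A_back, A)
```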