Linearity and duality
---------------------

A finite dimensional vector space is isomorphic to its dual, but not
naturally: the isomorphism depends on a choice of basis (i.e., there are
many). The isomorphism between a space and its double dual, on the other
hand, is natural, but it still only holds for finite dimensional spaces.

However, for an inner product space (a vector space with an inner
product), the isomorphism to the dual space IS natural, and the
distinction between the space and its dual can be ignored. This is
because an inner product is nothing more than a chosen isomorphism
t : V -> V* to the dual space. Given t, define the inner product by
<v, w> = t(v)(w).

For more fun, you can use a tensor product & to build up things like
V & V & V* & V*, where the first two coordinates are covariant and the
last two are contravariant. If V has an inner product, you can do
raising and lowering of indices by passing back and forth between V and
V* (since they're naturally isomorphic).

The tensor product is the "universal bilinear function": given vector
spaces U, V there is a vector space U & V and a bilinear function
& : U x V -> U & V such that for any bilinear function B : U x V -> W
there is a unique induced linear map T : U & V -> W with B = T o &. A
basis for U & V is given by all products u & v of basis vectors u of U
and v of V. You can guess the rest.

For vector spaces over the complexes, instead of using the dual (i.e.,
the space of linear functionals), you use the ADJOINT, the space of
conjugate linear functionals, i.e., functionals f with
f(av + bw) = a*f(v) + b*f(w), where a* is the complex conjugate of a.
The reason for this is of course to make the inner product positive
definite. But once you make this change, all the theorems are the same.
For example, over the reals, if T : V -> W induces T* : W* -> V*, then
the matrix of T* (with respect to the dual bases) is the transpose of
the matrix of T. Over the complexes, it's the conjugate transpose.

You get this dictionary of terms:

    Real          Complex
    ----          -------
    dual          adjoint
    self-dual     self-adjoint     T = T*
    symmetric     hermitian        A = A*
    orthogonal    unitary          T*T = TT* = I

But there's one theorem which has no analog over the reals: T : U -> U
commutes with its adjoint if and only if it has a diagonal matrix with
respect to some orthonormal basis. That's because the eigenvalues are
allowed to be complex in the complex case, and, by the fundamental
theorem of algebra, the matrix will have a full set of eigenvalues.
However, if the transformation is self-adjoint, then all the eigenvalues
are real, and you can diagonalize over the reals. But self-adjoint is a
stronger condition than "commutes with its adjoint".
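
If you like to see this numerically, here's a rough sketch (not part of
the original argument) using numpy and scipy, with made-up random
matrices: it checks that the conjugate transpose really is the adjoint
with respect to the standard complex inner product, and that a normal
matrix (one commuting with its adjoint) has a diagonal Schur form,
i.e. is diagonal in some orthonormal basis.

    import numpy as np
    from scipy.linalg import schur

    rng = np.random.default_rng(0)

    # Adjoint = conjugate transpose: <T v, w> = <v, T* w> for the
    # standard inner product <v, w> = conj(v) . w (np.vdot conjugates
    # its first argument).
    T = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    T_star = T.conj().T
    v = rng.normal(size=3) + 1j * rng.normal(size=3)
    w = rng.normal(size=3) + 1j * rng.normal(size=3)
    print(np.allclose(np.vdot(T @ v, w), np.vdot(v, T_star @ w)))  # True

    # Build a normal matrix N = U D U* with U unitary and D diagonal
    # (complex entries), then check that N commutes with its adjoint
    # and that its complex Schur form, which is always upper
    # triangular, comes out diagonal.
    U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
    D = np.diag(rng.normal(size=3) + 1j * rng.normal(size=3))
    N = U @ D @ U.conj().T
    print(np.allclose(N @ N.conj().T, N.conj().T @ N))  # N is normal
    S, _ = schur(N, output='complex')
    print(np.allclose(S, np.diag(np.diag(S))))          # Schur form is diagonal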
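
And going back to the inner product as an isomorphism t : V -> V*,
here's a small numpy sketch of raising and lowering, again with made-up
example data: the matrix g below is just some symmetric positive
definite matrix standing in for an inner product on R^3, and the index
convention in the einsum line is one hypothetical choice.

    import numpy as np

    # An inner product on R^3 given by a symmetric positive definite
    # matrix g; the isomorphism t : V -> V* sends v to the functional
    # w |-> <v, w> = v . g . w, i.e. "lowering an index" with g.
    g = np.array([[2.0, 1.0, 0.0],
                  [1.0, 2.0, 0.0],
                  [0.0, 0.0, 1.0]])
    g_inv = np.linalg.inv(g)        # raises the index again

    v = np.array([1.0, -2.0, 3.0])
    v_lower = g @ v                 # components of t(v) in V*
    print(np.allclose(g_inv @ v_lower, v))      # raising undoes lowering

    # The covector t(v) acts on w exactly as the inner product <v, w>:
    w = np.array([0.5, 1.0, -1.0])
    print(np.isclose(v_lower @ w, v @ g @ w))   # True

    # A rank-2 tensor with one index in V and one in V* can have the
    # V-index lowered with g, landing in V* & V*.
    A = np.arange(9.0).reshape(3, 3)            # A[i, j], i "upper", j "lower"
    A_lowered = np.einsum('ki,ij->kj', g, A)    # contract g with the upper index
    print(A_lowered.shape)                       # (3, 3)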