View Full Version : Vectors and Tensors
09-27-2003, 02:05 PM
If I have two vectors u and v then (u dot v)u gives me the projection of v on u. Alternatively, uu(v) also gives me the projection of v on u.
Where does the equality uu(v) = (u dot v)u come from and what is it that makes uu a tensor and not just a matrix?
09-27-2003, 03:51 PM
Actually the projection of v on u is
(u dot v) / |u|^2 * u.
What does uu(v) mean? I don't get the notation you are using.
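A quick numeric check of the projection formula above (a sketch in NumPy; the vectors are just illustrative examples):

```python
import numpy as np

u = np.array([3.0, 0.0, 0.0])
v = np.array([1.0, 2.0, 3.0])

# Projection of v onto u: (u dot v) / |u|^2 * u
proj = (u @ v) / (u @ u) * u
print(proj)  # -> [1. 0. 0.], the component of v along u
```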
09-27-2003, 04:44 PM
If uu means u^T u, and u is a column vector, then it's pretty easy to see that u^T u = u . u.
u . u is a tensor because every scalar is a tensor of rank 0.
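The difference between u^T u (a scalar) and u u^T (a matrix, the dyad) can be checked directly (a NumPy sketch with an arbitrary example vector):

```python
import numpy as np

u = np.array([[1.0], [2.0]])  # column vector, shape (2, 1)

inner = u.T @ u  # u^T u: a 1x1 matrix holding the scalar u . u
outer = u @ u.T  # u u^T: a 2x2 matrix, the dyad uu

print(inner)  # [[5.]]
print(outer)  # [[1. 2.]
              #  [2. 4.]]
```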
09-27-2003, 04:47 PM
Also, uu has to be a scalar in that equation, or else the dimensions of the left-hand side just don't match those of the right-hand side.
09-27-2003, 05:11 PM
uu is a dyad. The text I'm presently looking at writes it this way but I think it's usually written with a multiplication sign with a circle around it.
I worked it out both ways and the answer comes out the same, so it must be right. I just can't figure out how to derive that uu(v) = u(u dot v).
09-27-2003, 09:01 PM
Do you know anything about indicial notation, Sacroiliac? That makes it pretty clear. Assuming you do, the dyad uu has components given by
(uu)_ij = u_i u_j,
so that uu(v) would be, if I understand your notation,
u_i u_j v_j = u_i (u dot v), i.e. u * (u dot v).
This is likely to be useless, so here's a more intuitive way of thinking about it. The dyad AB is the product of two vectors A and B, and you can just think about it as A acting to the left and B acting to the right.
When you multiply on the right by C, you take the dot product of the right-hand vector B with the vector C, leaving A * (B dot C).
Similarly, if you multiplied on the left by C, you take the dot product of the left-hand vector A with the vector C, leaving B * (A dot C).
I'm not sure if this helps any or not, but that's what a dyad is, pretty much by definition.
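The left-acting/right-acting behavior of the dyad AB described above can be verified numerically (a NumPy sketch; the vectors are arbitrary examples):

```python
import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
C = np.array([1.0, 0.0, 2.0])

AB = np.outer(A, B)  # the dyad AB, with components A_i B_j

right = AB @ C  # multiply on the right: (AB)C = A * (B dot C)
left = C @ AB   # multiply on the left:  C(AB) = B * (A dot C)

print(np.allclose(right, A * (B @ C)))  # True
print(np.allclose(left, B * (A @ C)))   # True
```

With A = B = u and C = v this is exactly uu(v) = u(u dot v).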
09-28-2003, 02:16 PM
Where does the equality uu(v) = (u dot v)u come from and what is it that makes uu a tensor and not just a matrix?
A matrix can be considered as a special case of a tensor. A matrix is a row of columns (or equivalently, a column of rows), but you can also have tensors which act like a row of rows, or a column of columns. In index notation (also referred to as Einstein notation), you denote this by whether the index is a superscript or subscript ("upstairs" or "downstairs"), and strictly speaking, for a dot product, u dot v = u^i v_i (or u_i v^i), not u_i v_i.
The distinction between rows and columns, or between upstairs and downstairs indices, is unimportant in Euclidean Cartesian coordinates, but it becomes very important in other coordinate systems (as, for example, if you're doing relativity).
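A minimal sketch of why the upstairs/downstairs distinction matters outside Cartesian coordinates: assuming polar coordinates (r, theta), where the metric is g = diag(1, r^2), the dot product is g_ij u^i v^j rather than a plain component-wise sum (the particular components below are just made-up examples):

```python
import numpy as np

r = 2.0
g = np.diag([1.0, r**2])     # metric tensor in polar coordinates (r, theta)

u_up = np.array([1.0, 3.0])  # contravariant components u^i
v_up = np.array([2.0, 1.0])  # contravariant components v^i

u_down = g @ u_up            # lower the index: u_i = g_ij u^j
dot = u_down @ v_up          # u_i v^i = g_ij u^i v^j

print(dot)        # 14.0 -- the coordinate-invariant dot product
print(u_up @ v_up)  # 5.0 -- the naive sum u^i v^i, which is NOT invariant
```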
09-28-2003, 04:17 PM
Thanks everyone. Chronos, covariant/contravariant drove me crazy till I found out they were just each other's dual spaces. For 3 dimensions the inner product is a lot easier with 3 terms than 9 terms.
A matrix is a row of columns (or equivalently, a column of rows), but you can also have tensors which act like a row of rows, or a column of columns
This made my head spin around. Could you explain?
09-28-2003, 07:04 PM
Sacroiliac, think three dimensions instead of two, if you must. At the most basic level, the mathematics of tensors deals with generalizing how you can transform pairs of vectors.
vBulletin® v3.7.3, Copyright ©2000-2013, Jelsoft Enterprises Ltd.