scalar, vector, tensor

In physics, there are certain measurements/particles/waves? which are either scalar, vector, or tensor.

To me, scalar is a point or line segment, a vector is a direction with force, and a tensor is something like a spring.

I probably got it wrong. :smack:

Can someone explain the three terms to me in the simplest definition, and give a familiar example for each?

A scalar is a tensor of rank 0.
A vector is a tensor of rank 1.
A matrix is a tensor of rank 2.
Ad infinitum…

Let me give it a shot…

A scalar is a number. Examples: the temperature at a given point, the distance from here to the top of the Eiffel Tower, etc.

A vector can be thought of as an “arrow”, something with both a direction and a magnitude. Examples: wind speed and direction, the distance and direction from here to the Eiffel Tower.

A tensor… Well, here’s where it gets complicated. A tensor is best thought of as something that takes a vector and gives another vector back (a linear operator, if you know what that is). If you’ve taken a few years of undergraduate physics, you learn that the moment of inertia of a rigid body (i.e. something like a top or a block of wood) is a tensor: you take the angular velocity vector of the object, plug it into the inertia tensor, and it gives you the angular momentum vector.
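To make that concrete, here’s a minimal numerical sketch in Python/NumPy (the inertia tensor entries are just made-up numbers, not any particular body):

import numpy as np

# Moment-of-inertia tensor of some rigid body (made-up numbers, kg*m^2).
# It is symmetric; the off-diagonal terms couple rotation about one axis
# to angular momentum about another.
I = np.array([[2.0, 0.1, 0.0],
              [0.1, 3.0, 0.0],
              [0.0, 0.0, 4.0]])

# Angular velocity vector (rad/s).
omega = np.array([1.0, 0.0, 2.0])

# The tensor takes a vector and gives back a vector: L = I . omega
L = I @ omega
print(L)   # the angular momentum vector -- note it is not parallel to omega

Notice that L and omega end up pointing in different directions, which is exactly the sort of thing a single scalar multiplier could never do.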

I’m glossing over the distinction between covariant and contravariant tensors here. Believe me, you don’t want to know (and thankfully, in any simple examples, you don’t have to.)

You’re pretty close. The easiest way to understand this (IMO) is through the general mathematical descriptions. The difference here is the dimensionality of each entity.

Scalar values are of a single dimension: any quantity which can be represented by a single value is a scalar. Examples include mass, temperature, etc. A line is not a scalar, since a line must lie in some direction in space.

Vectors are essentially described by one-dimensional arrays. Vectors are defined by a group of numbers (for example, a magnitude and the i, j, k multipliers that define a direction in 3D space). A line is a vector: look at the general 2D line equation (y = mx + b). m and b in this case are scalar values, but the line definition itself is a 1×2 vector. Similarly, a general polynomial (e.g. ax + bx^2 + cx^3 + … + k) is a vector represented by a 1×n array of coefficients.
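If it helps to see that coefficient-array idea written out, here’s a minimal Python sketch (the particular polynomial is just made up):

# The polynomial 2 + 3x + 0.5x^2 is fully specified by its array of coefficients.
coeffs = [2.0, 3.0, 0.5]   # [constant, x term, x^2 term]

def poly(x, c):
    # Evaluate sum_i c[i] * x**i -- the 1xn array of coefficients *is* the polynomial.
    return sum(ci * x**i for i, ci in enumerate(c))

print(poly(2.0, coeffs))   # 2 + 3*2 + 0.5*4 = 10.0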

Tensors are described by arrays of any dimension. Scalars and vectors are both tensors (of zero and one dimension, respectively), but are such common cases that we give them their own names. Consequently, when we talk about tensors we are generally referring to arrays with more than one dimension. Examples include defining boundaries of space, quadratics, and other functions which can only be described using an array of more than one dimension.

Clear as mud?

Right. First as to particles. As you may be aware, one of the odd complications we find in quantum mechanics is spin. Particles may have spin 0, 1/2, 1, 3/2, etc, etc.

Well, particles with spin 0 are called scalar particles; particles with spin 1 are vector particles, and particles with spin 2 are tensor particles. It’s no accident that, for example, a spin 0 particle is called a scalar, but we needn’t go into the reasons behind that.

Now, a scalar is indeed a number, but a special type of number which doesn’t depend on the coordinates in which you measure it. For example, the distance between two points is the same regardless of what axes you use; this distance is a scalar. In contrast, the x coordinate of a point is also a number, but it depends on what coordinate system you’re using; it is hence not a scalar.

Vectors are the extension which includes direction. So a vector is, loosely, an arrow which doesn’t depend on what coordinates you measure it in. As an example, take the arrow which points from me to you; it doesn’t matter how I set up my coordinates to measure that arrow.
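A quick numerical illustration of that (Python/NumPy, with arbitrary numbers): rotate the coordinate axes and the vector’s components change, but the vector itself, and any scalar built from it such as its length, does not.

import numpy as np

# A displacement vector expressed in the original coordinates.
v = np.array([3.0, 4.0, 0.0])

# Re-express it in axes rotated 30 degrees about z: new components are R @ v.
theta = np.radians(30.0)
R = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])
v_new = R @ v

print(v, v_new)                                   # different components...
print(np.linalg.norm(v), np.linalg.norm(v_new))   # ...same length (a scalar): 5.0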

You can keep on extending this conceptual idea to include multiple directions at once, and those multiple directions are where it becomes easier to think of MikeS’s term “linear operator.”

But you always have to bear in mind that if changing coordinates changes the object, it’s not a tensor. Tensors in general, as people have mentioned, can be of arbitrary rank (they can have arbitrary numbers of directions, in other words). So a scalar is a type of tensor, as is a vector, and so forth.

A matrix, I will add, is not the same thing as a rank 2 tensor, because the thing that ultimately defines tensors is their transformation properties, while the thing that defines a matrix is just that it’s a bunch of numbers arranged on a grid of some sort.

There’s a distinction here that’s easy to miss, so let me be absolutely explicit about it. It’s the distinction between “dimension” and “rank”.

Tensors exist in something called “space”, or more to the point “n-dimensional space”. A vector in n-dimensional space will have n components. For instance, normal space is 3-dimensional, so a typical vector like the velocity vector will have 3 components. In Relativity, we often deal in 4-dimensional spacetime, so the relativistic momentum vector (called the 4-momentum) has 4 components. In the linear algebra of quantum physics, you’ll often encounter vectors of any dimension, 2, 3, 6, or even infinity.

Rank is something else entirely. As has been said, vectors are tensors of rank 1, which means that you can easily represent them by a 1-dimensional array of n components. For n = 3:


a[sub]1[/sub]  a[sub]2[/sub]  a[sub]3[/sub]

The subscript numbers are called indices, and they specify which component of a tensor you want. An index can take any value from 1 to n.

A tensor of rank 2 can be represented by a 2-dimensional array, also known as a grid or a matrix, of n[sup]2[/sup] components:



a[sub]1,1[/sub]  a[sub]1,2[/sub]  a[sub]1,3[/sub]
a[sub]2,1[/sub]  a[sub]2,2[/sub]  a[sub]2,3[/sub]
a[sub]3,1[/sub]  a[sub]3,2[/sub]  a[sub]3,3[/sub]


If you have a good imagination, you can probably picture how you might represent a tensor of rank 3. It would have n[sup]3[/sup] components, and you would need 3 indices to specify a single component. (When you get comfortable with tensors, you don’t actually think of them as grids of components, but this visualization is useful for learning.) In general, a tensor of rank p in n-dimensional space will have n[sup]p[/sup] components. I’ve run into tensors up to rank 4 pretty frequently, like the Riemann tensor. I’m sure that higher-order tensors are used, but it’s probably advanced stuff.
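Here’s how that rank/dimension bookkeeping looks if you just store the components as arrays (a Python/NumPy sketch; the entries are placeholders):

import numpy as np

n = 3                        # dimension of the space

scalar = 7.0                 # rank 0: one component, no indices
vector = np.zeros(n)         # rank 1: n components, one index
rank2  = np.zeros((n, n))    # rank 2: n^2 components, two indices
rank3  = np.zeros((n, n, n)) # rank 3: n^3 components, three indices

for t in (vector, rank2, rank3):
    print(t.ndim, t.size)    # number of indices (the rank) and n**rank components

That prints 1 3, then 2 9, then 3 27, matching the n[sup]p[/sup] counting above.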

So here’s an instructive question. What’s the difference between a tensor of rank 2 in 3-dimensional space, and a tensor of rank 1 in 9-dimensional space? They both have 9 components. Why couldn’t we just write down the 9 components of our tensor above in a straight line?



a[sub]1,1[/sub]  a[sub]1,2[/sub]  a[sub]1,3[/sub]  a[sub]2,1[/sub]  a[sub]2,2[/sub]  a[sub]2,3[/sub]  a[sub]3,1[/sub]  a[sub]3,2[/sub]  a[sub]3,3[/sub]

I don’t know how to explain it succinctly, but you lose structure and meaning when you do this. Just as when working with matrices you want to be able to do operations like swapping rows and swapping columns, there are certain things (like the transformations that g8rguy goes into) that you want to be able to do with tensors, and these things require you to treat them as having structure beyond “a set of numbers”.
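One way to see that extra structure numerically (a Python/NumPy sketch with arbitrary entries): under a rotation R, a rank 2 tensor transforms with one copy of R per index, T' = R T R^T. If you flatten T into a 9-component list, that list doesn’t get hit by some arbitrary 9×9 matrix; it gets hit by the very particular 9×9 matrix built from two copies of the 3×3 rotation (the Kronecker product R⊗R). That two-index structure is exactly what the straight-line listing hides.

import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))          # some rank 2 tensor (arbitrary entries)

theta = np.radians(40.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# Rank 2 transformation: one rotation per index.
T_new = R @ T @ R.T

# The flattened 9 components transform with the highly structured 9x9 matrix
# kron(R, R), built from two copies of R -- not with an arbitrary 9x9 matrix.
flat_new = np.kron(R, R) @ T.flatten()
print(np.allclose(flat_new, T_new.flatten()))   # True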

Goddamn, don’t I love these boards…

I guess I really do have to buy Mary Boas’ book…

IIRC the point is that a matrix can represent a rank 2 tensor. For instance a rank 2 tensor might be “rotate 90° that-a-way”, which might be represented by


( 0, 1)      ( 0,-1)      ( 0, 1)
(-1, 0)  or  ( 1, 0)  or  (-2, 0)

depending on your co-ordinate system(s).

If you always use a standard co-ordinate system you forget the difference and get confused (as I used to).
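Here’s a small numerical version of that point (Python/NumPy; the stretched basis is just made up): the same “rotate 90°” operator picks up different matrix entries when you write it in a non-standard basis, via the usual change-of-basis rule M' = P[sup]-1[/sup]MP.

import numpy as np

# "Rotate 90 degrees" as a matrix in the standard basis.
M = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# A made-up, non-orthonormal basis: the second basis vector is stretched by 2.
P = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# Same tensor, different coordinates: M' = P^-1 M P
M_prime = np.linalg.inv(P) @ M @ P
print(M_prime)   # [[0, -2], [0.5, 0]] -- same operator, different numbers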

On the Nerd Purity Test, you can score a point for knowing the difference between a scalar and a vector and one for knowing the difference between a vector and a tensor.

Sure, Shade, and whenever I work with rank 2 tensors, I have this tendency to immediately write a matrix representation because I find it helpful.

But of course while the distinction between a 2nd rank tensor and its matrix representation may not be the easiest thing to explain, it’s important to note that the distinction exists, and that you can’t just write down some arbitrary entries in a matrix and call it a tensor.

One thing: a property with both magnitude and direction isn’t necessarily a vector; it also has to obey the vector transformation properties. For example, angular displacement is not a vector quantity despite having magnitude and direction, as it doesn’t obey the rules for transformation. Similarly, any tensor must obey certain rules for transformation.
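To see why angular displacement fails that test, here’s a quick Python/NumPy check: finite rotations applied in different orders give different results, so they can’t simply add component-by-component the way true vectors must.

import numpy as np

def rot_x(deg):
    t = np.radians(deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t),  np.cos(t)]])

def rot_y(deg):
    t = np.radians(deg)
    return np.array([[ np.cos(t), 0, np.sin(t)],
                     [ 0,         1, 0        ],
                     [-np.sin(t), 0, np.cos(t)]])

v = np.array([0.0, 0.0, 1.0])

# Rotate 90 degrees about x then y, and then in the opposite order.
a = rot_y(90) @ rot_x(90) @ v
b = rot_x(90) @ rot_y(90) @ v
print(a)   # approximately [ 0, -1, 0]
print(b)   # approximately [ 1,  0, 0] -- order matters, unlike adding vectors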