Tensors for dummies...

In this thread many of the posters engaged in a highly technical discussion of the definition and applications of tensors. Some posters, who could not follow the discussion due to its advanced nature, expressed frustration at its opacity to a general audience. While I do not share the view that technical discussions have no place here, I do share the curiosity of a person interested in physics, but unfamiliar with (and perhaps quite incapable of grasping, in my case) its advanced mathematical tools.

One of my favorite popularizations of science is Richard Feynman’s QED: The Strange Theory of Light and Matter. It’s not an easy read, but it’s accessible, with effort, to those having only basic geometric and algebraic skills, plus a willingness to learn. What Feynman set out to do in this book is describe the path integral method of making calculations in quantum field theory in a way that conveyed the essence of the approach without requiring a graduate degree in physics. As an analogy, he explained that one could carry out subtraction by placing a known number of beans in a pot, removing a certain number of beans, and then counting the beans still in the pot to determine the remainder. One could also use arithmetic (imagine how you would subtract 177 from 243 to see what I mean), but it would be possible, if slow and inelegant, to get the same result the bean-counting way. So, Feynman claimed, he was not “dumbing down” QED; rather, he was taking its basic components and presenting them in the most intuitive way he could think of, thus allowing curious students to do simplified calculations of path integrals describing electromagnetic phenomena. The goal was not only to familiarize students with some of the principles of quantum field theory, but to show what such principles implied about the way nature behaves.

So, I have an exorbitant request: Can the brilliant and learned (I do not use those adjectives at all facetiously) physicists and mathematicians who contribute to this forum help some of us understand tensors and their application to physics (i.e. the world as we understand it), perhaps in a manner somewhat like Feynman did in QED? It’s a tall order, and I’m mindful of the effort that could be involved in granting the request. All I can offer in return is my desire and effort to follow along as best I can (others likely share my interest); and perhaps, like Feynman, I can also offer up the possibility that you might find it a rewarding heuristic and pedagogical exercise, if heuristics and pedagogy are things you find satisfying or important.

Well, it’s worth a shot ;).

Here’s one way of visualizing a tensor.

Imagine a matrix: a mathematical entity that displays symbols in rows and columns. A matrix can be square (equal number of rows and columns), but it doesn’t have to be. Now imagine the sides of that matrix are elastic: you can stretch the length or width of the matrix as far as you like, filling in the new rows and columns with more symbols.

After you have stretched the sides out to your liking, think of your matrix as a large sheet of paper. Now duplicate that matrix as many times as you like, filling each new copy with more symbols, in effect making a book of matrix pages. So now what you have is a book full of matrices that can have any number of pages, any number of rows, and any number of columns. What you have created is known as a tensor, albeit a very elementary one.
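The “book of matrices” picture maps directly onto a three-dimensional array. Here is a minimal sketch, assuming NumPy is available (the particular numbers are just placeholders for the “symbols”):

```python
import numpy as np

# A "book" with 2 pages, each page a 3x4 matrix of symbols.
book = np.arange(24).reshape(2, 3, 4)

page = book[1]          # one page: a single 3x4 matrix
symbol = book[1, 2, 3]  # one symbol: page 1, row 2, column 3

print(page.shape)   # (3, 4)
print(symbol)       # 23
```

Stretching the elastic sides just means picking different numbers in `reshape`.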

The next logical step would be to think of each of the symbols in each of the matrices themselves being books of elastic matrix pages.

Roughly speaking, a tensor of rank n is an object that requires n indices to specify an element. So a scalar, like 2, is a tensor of rank 0, because there’s no need for indices to specify any element. A vector is a tensor of rank 1 because you only need one index to specify an element. A matrix (well, not all matrices) is a tensor of rank 2 because you need two indices to specify an element.
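One quick way to make the “rank = number of indices” idea concrete is with NumPy arrays standing in for the components (the particular values here are arbitrary, chosen only for illustration):

```python
import numpy as np

scalar = np.array(2.0)               # rank 0: no index needed
vector = np.array([1.0, 2.0, 3.0])   # rank 1: one index per element
matrix = np.eye(3)                   # rank 2: two indices per element
cube = np.zeros((3, 3, 3))           # rank 3: three indices per element

# ndim counts how many indices you need -- the "rank" in this picture.
print(scalar.ndim, vector.ndim, matrix.ndim, cube.ndim)  # 0 1 2 3
print(vector[2])      # one index specifies an element: 3.0
print(matrix[0, 0])   # two indices: 1.0
```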

Tensors of rank 3 and above don’t really correspond to anything that a non-mathematician would be familiar with, but they’re out there.

It’s perhaps a good idea to understand some of the motivation behind formulating physics in terms of tensors:

We should not expect the laws of physics to be frame-dependent; that is to say, we should not expect the laws of physics, as viewed by some observer, to depend on that observer’s velocity, acceleration, etc. We can think of tensors as ‘objects’ which do not change under a transformation from one frame to another (even if their components change). At this point it is probably best to think about how a vector is essentially the same ‘object’ even in different frames, as a vector is a tensor of type (1,0).

A zeroth-rank tensor is a scalar; it is a single number, like the distance from one point to another. A first-rank tensor is a vector (or a one-form), for example a radius vector. Tensors in n dimensions have n[sup]r[/sup] components (where r is their rank).

You can think of tensors as matrices: for example, a vector is a column array and a tensor of type (1,1) is a square array. Tensors of rank greater than two cannot be expressed as normal arrays; however, if you think of an ‘n-dimensional’ (I use the term dimension very, very loosely) array, then that corresponds to an nth-rank tensor. But a word of warning: thinking of tensors as arrays can get you into trouble if you don’t know the underlying definition of a tensor.
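The n[sup]r[/sup] component count is easy to check with arrays. A small sketch, assuming NumPy, with n = 4 (the dimension of spacetime):

```python
import numpy as np

n = 4  # number of dimensions
for r in range(4):
    t = np.zeros((n,) * r)   # an "array" with r indices, each running over n values
    print(r, t.size, n**r)   # the component count matches n**r
```

For r = 0 this is a single number, for r = 1 a 4-component vector, for r = 2 a 4×4 array of 16 components, and so on.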

I’ll give a proper definition tomorrow.

I can try to explain how they’re used in general relativity.

Imagine a large map with a bunch of cities on it. Now write down the temperature at each city on a given day at a given time. You now have a map with a number at each city. No matter how you redraw the map, the numbers at each city will always be the same. If you decide you don’t like north being up but would rather have east be up, it makes no difference. This is a scalar - something that does not change when you change how you look at things, by rotating, say, or changing where you start your measurements. I will call such changes a transformation.

Now imagine you write down the longitude and latitude of each city. So now each city has a pair of numbers assigned to it. But what happens if you redefine how you want to measure longitude and latitude? What if you want to draw a line from Cape Town, South Africa to New York City and use that as your zero for longitude, and measure latitude perpendicularly? Surely you can do that and draw another map, right? If you think about it, though, you can’t do this without knowing both the original latitude and the original longitude. One number simply doesn’t suffice, as it did with temperature. That is because longitude and latitude are linked to each other - physically, they are two parts of the same thing. This is called a vector. A vector is a collection of numbers which transform together. That is, to transform a vector you need all of the elements of the vector.
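A toy numeric version of the map example, sketched with NumPy (the angle and the city’s coordinates are made up): rotating the axes mixes a point’s two coordinates, so each new coordinate needs both of the old ones.

```python
import numpy as np

theta = np.deg2rad(30)   # rotate the map's axes by 30 degrees
R = np.array([[ np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

city = np.array([3.0, 4.0])   # (x, y) of a city in the original frame
new = R @ city                # each new coordinate uses BOTH old ones

print(new)                              # the pair transforms together
print(np.linalg.norm(new))              # distance from origin: unchanged, 5.0
```

The distance from the origin is the scalar in this picture: it comes out the same no matter how the axes are drawn.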

Now consider electric and magnetic fields. Often those are depicted as vectors. If you rotate your frame of reference, they rotate just like vectors should. However, a while ago, people discovered that that’s not the whole story. If you go really fast, magnetic and electric fields start to mix. Just like longitude and latitude are two parts of the same thing, so too are electric and magnetic fields - the electromagnetic field. And this is where tensors come in. Just as vectors are a collection of what seem like scalars, tensors are a way of combining what seem like vectors. The electromagnetic field is best described as a tensor because that lets you couple together both parts of it. Thus, at least in GR, a tensor is just a way to transform vectors together.
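Here is a rough numerical illustration of that mixing, in units where c = 1 and using one common sign convention for packing E and B into the rank-2 field tensor F (the field values are invented for the example). Boosting transforms both indices of F, and a pure electric field acquires a magnetic part:

```python
import numpy as np

Ex, Ey, Ez = 0.0, 1.0, 0.0   # a purely electric field along y
Bx, By, Bz = 0.0, 0.0, 0.0   # no magnetic field in this frame
F = np.array([[ 0., -Ex, -Ey, -Ez],
              [ Ex,  0., -Bz,  By],
              [ Ey,  Bz,  0., -Bx],
              [ Ez, -By,  Bx,  0.]])

v = 0.6                       # boost speed along x (fraction of c)
g = 1 / np.sqrt(1 - v**2)     # Lorentz gamma factor
L = np.array([[   g, -g*v, 0., 0.],
              [-g*v,    g, 0., 0.],
              [  0.,   0., 1., 0.],
              [  0.,   0., 0., 1.]])

Fp = L @ F @ L.T              # transform BOTH indices of the tensor
new_Ey = -Fp[0, 2]            # gamma * Ey = 1.25
new_Bz = -Fp[1, 2]            # -gamma * v * Ey = -0.75: a B field appeared!
print(new_Ey, new_Bz)
```

Note that transforming F takes two copies of the boost matrix, one per index - the “transforming vectors together” idea in action.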

I hope this helps!

Man this place makes me happy sometimes! Thank you all so far, and keep it coming, please. For my part, I’m reviewing matrices, as they are, of course, relevant to this discussion (though I was never taught in school that a tensor and a matrix can be the same thing in some circumstances). I think I remember how to multiply them, and also recall a curious fact (to me, anyway): multiplying matrices can yield a product with fewer dimensions than its factors, which seemed to mean something deep at the time, though I don’t know what! Anyhow, I’m grateful to all of the above for even trying, and I await further instruction eagerly!
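That “fewer dimensions” fact is easy to see with a quick NumPy sketch (the numbers are arbitrary): a 1×3 row matrix times a 3×1 column matrix gives a 1×1 product, essentially a single number.

```python
import numpy as np

row = np.array([[1.0, 2.0, 3.0]])        # shape (1, 3)
col = np.array([[4.0], [5.0], [6.0]])    # shape (3, 1)

product = row @ col   # shape (1, 1): a 3 has been "summed away"
print(product.shape)  # (1, 1)
print(product)        # [[32.]]
```

In tensor language this is a contraction: pairing a covector with a vector to get a scalar. That is arguably the “something deep”.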

Learn something new every day.

hmm, be careful, as you really have to look at matrices in the context of linear algebra. Unless you restrict yourself to talking about vectors as column vectors, covectors as row vectors (or I suppose vice versa), and tensors of type (1,1) as square matrices, you’ll find at the very least some arbitrariness in defining tensors in terms of arrays. For example, you can write a tensor of type (0,2) as a square array, but multiply it (take the inner product) by a vector (a column vector) and, though in terms of the matrices you get another (column) vector out, it is in fact a covector (which had earlier been defined as a row vector).
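A small sketch of that warning, assuming NumPy and using the Minkowski metric (signature +---) as the type (0,2) tensor: the array you get back has the same shape as the vector you put in, so the array notation completely hides the fact that it now transforms as a covector.

```python
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])  # a type (0,2) tensor as a square array
v = np.array([2.0, 1.0, 0.0, 0.0])      # a (contravariant) vector

v_lower = eta @ v   # "lowering the index": the result is a covector
print(v_lower)      # same shape as v -- but a different kind of object
```

Nothing in the array itself records which indices are “up” and which are “down”; you have to track that by hand, which is exactly where the array picture can mislead.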

Thanks for the thread, Loopydude!

Feynman has the reputation of being a brilliant physicist, but was an even better explainer, and a very funny guy as well! I remember years ago seeing his undergrad lectures on physics and being amazed at how clear (and exciting!) he made it all seem. (They were for a course that had the room just before the class I was in. I kept coming earlier and earlier just to watch the movies.) I doubt we have anybody here quite as good as Feynman, but I know we have some who come close…

… and on preview I see that a couple of those explainers have arrived! Thanks for the replies!

I suppose, since we’re in GR, that the elements of those vectors we’re transforming include e, m, and c, or at least two of the three. Is that right?

Hand-waving answers are expected here, of course. They’re what I asked for in the other thread. :smiley:

Well, no, not really. m is the length of a tensor (the four-momentum vector) in GR and is therefore a zeroth-rank tensor. E is a component of the same tensor, and therefore frame-dependent and not a tensor. c would pop up a lot, but it pops up so much that we just set it equal to 1 and ignore it.
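To illustrate that with numbers (a sketch in units where c = 1; the momentum values are invented): the four-momentum is p = (E, px, py, pz), and its Minkowski length m stays the same under a boost even though the component E changes.

```python
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature +---
p = np.array([5.0, 3.0, 0.0, 0.0])      # four-momentum: E = 5, px = 3

m = np.sqrt(p @ eta @ p)                # invariant length: sqrt(25 - 9) = 4
print(m)

# Boost along x: E changes, but the length m does not.
v = 0.6
g = 1 / np.sqrt(1 - v**2)
L = np.array([[   g, -g*v, 0., 0.],
              [-g*v,    g, 0., 0.],
              [  0.,   0., 1., 0.],
              [  0.,   0., 0., 1.]])
pp = L @ p
print(pp[0], np.sqrt(pp @ eta @ pp))    # new E, same m
```

That is the sense in which m is a scalar (rank-0 tensor) while E, a single component, is frame-dependent.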