Was Newton wrong or just incomplete?

I’ve heard it said many times on this board (most recently today) that Einstein didn’t prove Newton wrong, just incomplete. I have in the past argued against this view, and I’d like to settle it once and for all. This is possibly more suited to Great Debates, but it feels like this question should have a factual answer.

The way I understand it, and I’m certainly no expert, Newton was wrong, but so close to being right that you might as well use Newton rather than Einstein in the majority of situations, since Newton is so much easier to calculate than Einstein. The closer you get to the speed of light, the larger Newton’s margin of error becomes, but the margin of error is always there. It’s not the case that in speeds of up to 50 mph or whatever, Newton is correct, but suddenly at that point he becomes incorrect.
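The claim that the margin of error is always there is easy to check numerically. Here's a minimal Python sketch (the speeds are arbitrary examples, not from any particular source) comparing Newtonian kinetic energy with the exact relativistic value:

```python
# Compare Newtonian kinetic energy (1/2 m v^2) with the exact
# relativistic value m c^2 (gamma - 1) at several speeds.
import math

C = 299_792_458.0  # speed of light, m/s

def ke_newton(m, v):
    return 0.5 * m * v**2

def ke_relativistic(m, v):
    # gamma - 1 rewritten in a numerically stable form for small v/c
    b2 = (v / C) ** 2
    s = math.sqrt(1.0 - b2)
    return m * C**2 * b2 / (s * (1.0 + s))

for v, label in [(22.0, "~50 mph"), (3.0e4, "Earth orbital speed"),
                 (0.1 * C, "0.1 c"), (0.9 * C, "0.9 c")]:
    err = 1.0 - ke_newton(1.0, v) / ke_relativistic(1.0, v)
    print(f"{label:>20}: Newton low by a fraction {err:.2e}")
```

Even at 50 mph the discrepancy is nonzero (a few parts in 10^15); it just grows smoothly as v approaches c, with no sharp threshold where Newton suddenly "becomes" wrong.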

Right?

As far as life here on Earth for things macroscopic, Newton was absolutely correct.

Haj

Absolutely, 100% correct, not even a tiny immeasurable shadow of a smidgeon wrong?

If you want to think of Newton’s Laws as “wrong,” go ahead. But relativity and quantum mechanics are just as “wrong,” since they don’t agree with each other.

From a scientist’s point of view, Newton’s Laws are merely incomplete. Under certain conditions (namely v << c) it’s a perfectly correct and useful theory to use. But with any law or theory in physics, you’ve got to know when it’s applicable and when it’s not. Currently there are no laws and theories that work under every conceivable condition in the universe.

I have a feeling my head’s going to ache, but if you have the time and inclination, please explain how they contradict each other.

So it isn’t true that Newton produces a tiny, tiny margin of error even at low speeds? If it isn’t, at what speed does the first margin of error emerge? If it is, isn’t it true that Newton was in fact wrong?

Newton derived his theories based on his observations of the Universe. The logical and mathematical steps he took after making his observations to develop his theories were absolutely, 100% correct. The effect of relativity upon earthly physics is so minute that Newton could not possibly have measured it. Indeed, I believe the first time those effects were ever successfully measured was late in the 20th century.

So the question is an ambiguous one. Yes, relativistic effects apply even at low speed, in tiny amounts. But because they could not possibly affect any measurement you're going to make while traveling at human speeds to a significant degree, applying Newton's formulae will yield correct results. That is, results which will not differ significantly from results found through the use of Relativity.

In other words, I’ll agree with scr4 and say: it depends.

Hmmm. I gave the answer as an engineer; maybe a physicist would answer differently. Terms that are negligible drop out and count for nothing. That is, they are indistinguishable from noise.

I’ll try to give an example. Let’s say that I am making five pound weights. They won’t be exactly five pounds, though, because there will be error in my manufacturing process. If you measured them they might really be between 4.98 and 5.02 pounds. Let’s also say that my scale wasn’t perfectly accurate, so my distribution might really be between 4.975 and 5.025 pounds. If there was an effect that would further change things by 0.000001 pounds it would be negligible, in the noise, and it would be exactly the same as if it wasn’t there.
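That example can be sketched as a quick simulation (the weight range is from the post; the sample size and random seed are arbitrary choices of mine):

```python
# Monte Carlo sketch of the five-pound-weight example: an extra
# 0.000001 lb shift is buried in the manufacturing/measurement noise.
import random

random.seed(42)
N = 10_000
weights = [random.uniform(4.975, 5.025) for _ in range(N)]  # noisy process
shifted = [w + 0.000001 for w in weights]                   # add the tiny effect

mean_w = sum(weights) / N
spread = (sum((w - mean_w) ** 2 for w in weights) / N) ** 0.5  # std dev of noise
shift = sum(shifted) / N - mean_w                              # the tiny effect

print(f"process noise (std dev): {spread:.4f} lb")  # ~0.0144 lb
print(f"systematic shift:        {shift:.2e} lb")   # four orders smaller
```

The shift is real and always there, but it is thousands of times smaller than the noise floor, so no measurement of these weights could ever reveal it.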

Haj

It is true. You can say that Newton’s Laws are always wrong, but often they’re close enough that it’s useful to have them around. Another way to think of it is that for situations where non-relativistic effects are negligible (far smaller than experimental error), Newton’s Laws are correct. I think the latter is how most scientists and engineers think.

Um, I obviously meant “for situations where relativistic effects are negligible”.

In a recent thread involving Maxwell’s equations, which are a cornerstone of special relativity, an effect was discussed which is not accounted for by Maxwell’s equations.

Centripetal acceleration in a curved electrical current

So special relativity could be said to be incomplete. No-one worries about it too much, because the effect is small.

Note: I’m not dissing Newton. God Almighty, no. He did everything he could with the tools he had, and that was infinitely more than I could do with the tools I have today. The guy rules.

Okay then, that’s what I thought. So, strictly speaking, Newton is wrong by a tiny tiny tiny amount, but to all useful purposes he appears to be totally correct, and therefore there’s no point in not using Newton.

Not to all useful purposes. I remember reading that if relativistic time dilation effects were not taken into account, GPS readings could be off by tens of meters. There are also atomic clocks which can detect the relativistic effect due to the difference in gravitational field between two floors of the same building.

I’m not dissing Newton either - in fact, my cat is named after him.

DarrenS is right.

Because of the satellites’ high altitude, the gravitational field is weaker than on the surface of the earth. So we know from General Relativity that this makes the satellites’ clocks tick faster to an observer on the earth (by about 45 microseconds per day, if my memory serves me).

Also, the GPS satellites are moving relative to the surface of the earth, and so, amusingly, according to Special Relativity, an observer on earth would see their internal clocks tick slightly more slowly. Without correcting for SR, the GPS satellites would lose about 7 microseconds per day.

GPS measurements need to be extremely precise, of course, down to the nanosecond, since position is calculated from the travel time of radio waves, which move at the speed of light.
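For what it’s worth, both effects can be estimated from the standard weak-field approximations. This is a rough sketch with assumed textbook values (Earth’s gravitational parameter, a GPS orbital radius of about 26,571 km); it gives roughly +45 microseconds/day from gravity and −7 microseconds/day from orbital speed:

```python
# Back-of-envelope GPS clock rate offsets (weak-field approximations).
# All physical constants and orbital values are assumed textbook figures.
import math

C  = 299_792_458.0       # speed of light, m/s
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6        # mean Earth radius, m
R_GPS   = 2.6571e7       # approximate GPS orbital radius, m
DAY = 86_400.0           # seconds per day

# General relativity: clocks higher in the gravity well tick faster.
gr_per_day = GM / C**2 * (1.0 / R_EARTH - 1.0 / R_GPS) * DAY * 1e6  # us gained

# Special relativity: the satellite's orbital speed slows its clock.
v = math.sqrt(GM / R_GPS)                      # circular orbital speed, ~3.9 km/s
sr_per_day = v**2 / (2.0 * C**2) * DAY * 1e6   # us lost

print(f"GR: +{gr_per_day:.1f} us/day, SR: -{sr_per_day:.1f} us/day, "
      f"net: +{gr_per_day - sr_per_day:.1f} us/day")
```

The net drift of a few tens of microseconds per day, multiplied by the speed of light, would translate into kilometers of position error per day if uncorrected.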

Am I right in recalling that there was a slight anomaly between Newton’s laws and the observed orbit of Mercury, and that prior to Einstein, this was widely believed to be due to another, undetected, planet? I am not sure when the anomaly was noticed, but it was a long time ago. In fact, the anomaly was explained by applying the principles of relativity.

All scientific theories are approximations to reality. Theory here being in the sense of a “model”. Anomalies may or may not be detectable at any given point in history, but that does not mean the theories are not correct — just that they are known to be, or will probably one day be shown to be, incomplete.

AFAIK there are incompatibilities between relativity and quantum mechanics although I don’t even begin to understand the details. This would certainly suggest that at least one of these theories will turn out to be incomplete.

This puts me in mind of an old story told by an old professor of mine:

"So you’re in one corner of the room, and Raquel Welch is in the other corner. You walk halfway to her. Then you walk halfway the remaining distance, and so on.

The mathematician says you will never quite get to her. The engineer says you will get her for all intents and purposes."

I read somewhere (no, I don’t have a specific cite: it was long ago in a university far, far away - though I believe it might have been something written by Petr Beckmann) that Newton wasn’t really wrong even when relativistic effects are taken into account. Apparently in the Principia, Newton did not write: F = m·a (force equals mass times acceleration), but rather expressed it as F = d(m·v)/dt (force equals the derivative, with respect to time, of the product of mass times velocity). This equation holds true even at relativistic speeds, when mass is not constant. Those who followed Newton were the ones who made the “obvious” simplification that, since mass is constant (they assumed), the second equation could be written as F = m·(dv/dt) (force equals mass times the derivative of velocity with respect to time), and then simplified as F = m·a - an equation which is correct as long as mass is constant.
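Whatever the historical reading of the Principia, the modern form of that observation is that F = dp/dt still holds in special relativity once momentum is taken to be p = γmv rather than mv. A small sketch of how far the two momenta diverge (speeds chosen arbitrarily for illustration):

```python
# Relativistic momentum p = gamma * m * v versus the Newtonian m * v.
# F = dp/dt remains valid in special relativity with this momentum.
import math

C = 299_792_458.0  # speed of light, m/s

def momentum_newton(m, v):
    return m * v

def momentum_relativistic(m, v):
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * m * v

for frac in (0.001, 0.1, 0.5, 0.99):
    v = frac * C
    ratio = momentum_relativistic(1.0, v) / momentum_newton(1.0, v)
    print(f"v = {frac:5.3f} c   p_rel / p_newton = {ratio:.6f}")
```

The ratio is just the Lorentz factor γ: indistinguishable from 1 at everyday speeds, about 15% at half the speed of light, and about 7 at 0.99c.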

In any case, Newton wasn’t “wrong” for not anticipating the Fitzgerald Contraction or any other relativistic effects, any more than Euclid was “wrong” for not developing non-Euclidean geometry: in both cases they developed mathematical models and tools which accurately described the observed physical universe they lived in. Nothing wrong about that.

The perihelion of Mercury precesses (at, I should add, the tiny rate of 5600 arcseconds/century). The vast majority of this precession was explained by interaction with the other planets, but remaining was an error of some 43 arcseconds/century that could not be accounted for. Relativity correctly predicts this value.
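For the curious, GR’s leading-order formula for the advance per orbit, Δφ = 6πGM/(a(1−e²)c²), reproduces that figure. A sketch with standard assumed values for the Sun and Mercury’s orbit:

```python
# Leading-order GR perihelion advance of Mercury per orbit:
#   dphi = 6 * pi * G * M_sun / (a * (1 - e^2) * c^2)
# The constants below are standard assumed values.
import math

C = 299_792_458.0          # speed of light, m/s
GM_SUN = 1.32712440018e20  # Sun's gravitational parameter, m^3/s^2
A = 5.7909e10              # Mercury's semi-major axis, m
E = 0.2056                 # Mercury's orbital eccentricity
PERIOD_DAYS = 87.969       # Mercury's orbital period

dphi = 6 * math.pi * GM_SUN / (A * (1 - E**2) * C**2)  # radians per orbit
orbits_per_century = 36_525 / PERIOD_DAYS
arcsec = dphi * orbits_per_century * (180 / math.pi) * 3600

print(f"GR perihelion advance: {arcsec:.1f} arcsec/century")  # ~43
```

That matches the unexplained residual to within observational error, which was one of the first great triumphs of General Relativity.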

More pedantically, neither is complete as it stands. The other forces in nature are all explained by quantum mechanics, but gravity is not. Thus, GR must be modified to include QM, and QM must be modified to include gravity. This is, however, far more easily said than done.

For all intents and purposes, you can neglect relativity under 0.1c. In other words, that’s speeds of about 67 million miles per hour, or 18,600 miles per second, or 98,000,000 feet per second. Metric, that’s 30,000,000 meters per second, or 108,000,000 kilometers per hour, or 30,000 kilometers per second.
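At that cutoff the Lorentz factor is only about 1.005, i.e. a correction of roughly half a percent — a quick check:

```python
# Size of the relativistic correction at the 0.1c rule-of-thumb cutoff.
import math

C = 299_792_458.0  # speed of light, m/s

def gamma(v):
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

g = gamma(0.1 * C)
print(f"gamma at 0.1c = {g:.5f}  (~{(g - 1) * 100:.1f}% correction)")
```

Below that speed the correction shrinks quadratically, which is why 0.1c makes a convenient (if arbitrary) dividing line.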

As noted, there are cases where the precision needed is such that the relativistic effect, while small, is essential in getting a correct calculation. The mathematicians are the ones interested in exactly what is true; the rest of the scientists are interested in what’s a good enough approximation for the calculations.

This is a debate thread, because the answer is philosophical. You can say incomplete = wrong if you want. You can say the answer from both forms is indistinguishable for the data used at the time if you want.

To say it is wrong, you must allow that it is mostly right, or nearly right, or a close approximation, or however else you want to word it to allow that the equations work. To say it is incomplete is another way of saying it is mostly right.

Note that Newton did describe force as the time derivative of momentum: F = d(mv)/dt. However, I’m pretty sure Newton treated mass as a constant. The reason the above is important pre-relativity is the idea of burning fuel: if you’re burning fuel, your mass has a rate of change with respect to time. Note that Newton was using the Galilean transform, and didn’t anticipate the Lorentz transform.

If I exaggerate the point made by others, isn’t it true that everything is a tiny, tiny, tiny bit wrong? What statement is guaranteed to be absolutely right in all circumstances?

There is always another interpretation, a counter-example, a refined measurement that deviates from the statement.

Hence, Newton was wrong, and Einstein was probably wrong too. However, I don’t know what benefit is to be had by classifying their work that way, unless you have new evidence or a new theory to propose.