Darn! Now that I’ve written a reply, I can’t find the original thread!

The question (of + and - zero) reminds me of the early development of Calculus. The theory of limits had not been developed. Differential Calculus was based on infinitesimals. You started out with the idea that rate-of-change could be approximated by comparing delta y with delta x; that is, a very small change in y (“rise”) divided by a very small change in x (“run”) gave you the rate of change of a function, at least approximately. (Where the function was a constant, such as y = 3, or was a constant times x, such as y = x/2, we need go no further, since ANY delta, no matter how large, would give the same result.)
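That parenthetical claim is easy to check. A quick Python sketch (the helper name `diff_quotient` is mine, just for illustration) shows that for a linear function the rise-over-run quotient comes out the same for ANY delta x:

```python
# For a linear function, "rise over run" is exact for any step size.
# (diff_quotient is an illustrative name of my own, not from the post.)
def diff_quotient(f, x, h):
    """Compare delta y with delta x: (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

f = lambda x: x / 2  # y = x/2, whose true slope is 1/2 everywhere

# Any delta, no matter how large (or small), gives the same result.
for h in (100.0, 1.0, 0.25):
    print(diff_quotient(f, 3.0, h))  # 0.5 every time
```

The trouble described next only begins with curved functions, where different deltas give different quotients.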

To get the desired practical accuracy, and more importantly the desired theoretical accuracy for more involved functions, the idea of infinitesimals was introduced. That is, the next step was to substitute a boost in x that was INFINITELY SMALL, and compare the change in y, which would also be infinitely small, at least presumably.

The idea was that you could not use a “zero” flux of x and then compare the flux of y, which would have to be zero as well. I should say at this point that otherwise the function would not be “continuous”. (This is a concept which was not fully understood until “limits” were developed.) While there ARE functions that have discontinuities (it does not exclude them from consideration as functions), you cannot have a differential at a point unless you have “continuity” at that point. I won’t try to define continuity. My very excellent freshman text approached a workable definition using FIVE steps.

Why can’t you use these zero fluxes? Well, a delta y of 0 divided by a delta x of 0 is indeterminate, for the simple reason that 0/0 is indeterminate. This is based on the fact that ANY number times zero is zero, so 0/0 could just as well “equal” any number at all.
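You can watch both halves of that argument play out in Python (again, `diff_quotient` is just my own illustrative helper):

```python
# Why delta x = 0 is forbidden: 0/0 is indeterminate, since b = 0 * a
# holds for EVERY a. (diff_quotient is an illustrative name of my own.)
def diff_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

# ANY number times zero is zero...
for a in (1, 7, -2.5):
    assert 0 * a == 0

# ...so a literal zero "flux" of x yields no usable answer:
f = lambda x: x * x
try:
    diff_quotient(f, 3.0, 0.0)  # delta x = 0 forces delta y = 0 too
except ZeroDivisionError:
    print("0/0: no determinate value")
```

Python simply refuses the division, which is the arithmetic fact the old infinitesimal theory was trying to sneak around.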

So what exactly is an “infinitesimal”? The idea was that a value could be INFINITELY small in magnitude (absolute value for real numbers) and yet somehow NOT be equal to zero. In effect, you could have your cake and eat it as well.

Along came someone schooled in philosophy and logic and yelled, “Wait a minute!”

You simply can’t have that! Zero is unique and provably the ONLY value that is “infinitely small” in magnitude/absolute value.

But… Calculus still worked just fine, even with a faulty theoretical basis. Hence mathematicians set out to find a basis that DID work as a substitute… and eventually found “limits”!
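The limit idea can be sketched numerically (same illustrative `diff_quotient` helper as above, a name of my own): keep delta x NONZERO but let it shrink, and watch the quotients settle on one value, which is then DEFINED to be the derivative.

```python
# Limits: never set h to zero; instead let it shrink and observe where
# the quotients are heading. (diff_quotient is an illustrative name.)
def diff_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda x: x * x  # at x = 3 the limit should be 6
for h in (1.0, 0.1, 0.001, 1e-6):
    print(h, diff_quotient(f, 3.0, h))
# the quotients close in on 6 as h shrinks, with h never equal to zero
```

No infinitely small yet nonzero quantity is needed: every h in sight is an ordinary number, and the derivative is the value the quotients approach.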

(It was George Berkeley, the Bishop of Cloyne, not Canterbury, who raised this objection, in his 1734 tract “The Analyst”.)

So the uniqueness of ZERO was saved, and years later the “Seinfeld” show emerged as one of its fruits.

Bada-dadada… dada-dada!

True Blue Jack