Why 0 ÷ 0 = 1

the division operation is simply not defined when the denominator is 0. it’s just that simple. no proposed justification for defining it otherwise is valid. limits in particular are not valid, because taking a limit is not the same operation as division.

most operators we use these days, including multiplication, division, addition, and subtraction, have their definitions in group and ring theory (abstract algebra).

an operator is just a function. like all functions, it needs a definition. the division operator we most often talk about is defined from ordered pairs of reals to the reals. if the second element of the pair is 0, there is no number the function yields; the pair is simply outside its domain. it’s just defined that way.

asking what a number divided by 0 yields is like asking what Radiohead marrying a prime number yields. it’s just nonsensical. the words and operators aren’t defined in a way that makes the combination meaningful.
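to make the “partial function” point concrete, here is a minimal sketch in Python (my own illustration, not any official definition; the name safe_divide is made up):

[code]
def safe_divide(a: float, b: float) -> float:
    """Division as a partial function on ordered pairs of reals.

    The pair (a, b) is in the domain only when b != 0; for b == 0
    the function has no value, so we signal that rather than
    invent one.
    """
    if b == 0:
        raise ValueError("(a, 0) is outside the domain of division")
    return a / b

print(safe_divide(6, 3))    # 2.0 -- the unique c with 3 * c == 6
try:
    safe_divide(0, 0)
except ValueError as e:
    print(e)                # no answer is defined, not even 1
[/code]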

Ah yes - I can see I was quite wrong - I blame it on its having been late in the evening! :wink:

Yes - undefined - quite right.

On one of my first calculators (I think it was a TI-55), if you entered 0÷0 the answer returned was 1.
(Of course, the display was blinking, which indicated an error, but I still thought it was kinda cool to have a calculator that produced a wrong answer.)


Originally posted by nogginhead
Take the limit of x/x as x -> 0.

x/x = 1, so x/x = 1 as x -> 0.

Why are 1/x and x[sup]2[/sup]/x relevant? We’re not talking about anything divided by 0, only 0.

So under this argument, any number is right. Including 1.

Excluding toy math systems (i.e. those without applications outside mathematics), I believe this definition only applies when b = 0. So it’s a specific rule about what happens when you divide by 0, and is thus an arbitrary rule. I can just as easily make an arbitrary rule that x/0 = +infty if x > 0, = -infty if x < 0, and = 1 if x = 0.

We also define 0! = 1, for example, because it’s convenient.

However, defining 0! = 1 really is convenient. The usual formulas work with this definition.
E.g., (n+1)!/n! = n+1, for all positive integer values of n, including n=0.
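A quick numerical check of that identity (Python’s standard math.factorial does return 1 for 0!):

[code]
from math import factorial

# (n+1)!/n! = n+1 holds at n = 0 only because 0! is defined as 1.
for n in range(5):
    assert factorial(n + 1) // factorial(n) == n + 1

# Likewise C(n, 0) = n!/(0! * n!) comes out to 1, as it should.
n = 7
assert factorial(n) // (factorial(0) * factorial(n)) == 1
[/code]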

Defining x/0 = +infty doesn’t actually mean anything, because +infty isn’t a real number. If you want to extend the real number system to include +infty and -infty, you will lose a good many relationships. Or, you would have to have two classes of numbers, the finite ones and the infinite ones. Then you could try to derive different theorems for various subsets.

The bottom line is that defining x/0 = +infty is not convenient.

A little learning is a dangerous thing;
drink deep, or taste not the Pierian spring:
there shallow draughts intoxicate the brain,
and drinking largely sobers us again.

Alexander Pope

I was just wondering about the marbles everyone is talking about, because I lost mine and I think that maybe they might be mine.

OK, december, as long as we’re sniping with quotations, how about:

and, more to the point

My point was that if one were to find it convenient to define 0/0 as 1, it would be as acceptable an arbitrary rule as to define it as undefined. Same for x/0=infty, though that wasn’t what I was driving at especially.

Fine. So what’s the limit of 0/x as x approaches 0?


Yes, any number. That’s the point. 0/0 is undefined because there’s no unique solution to the equation 0*x=0. An operation which doesn’t yield a unique result is not well defined.

It’s the mathematical definition of division, I’m afraid. You believe wrong.

Well, unless you’re referring to the more common-sense, “how many A’s in a B” definition of division…which, as Mangetout and astorian have pointed out, still doesn’t define 0/0.

The definition using limits doesn’t let you define it either, 'cause x/x and 0/x don’t have the same limit as x approaches 0. And that’s it; we’re out of definitions of division. Except, of course, the arbitrary one:

Except that as silverfish demonstrated, if you assume 0/0 = 1 then you can “prove” that x = 1 for any x, which makes your arbitrary definition about as useful as a hammer made of sponge cake.
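(For anyone who missed it, the argument presumably runs along these lines - my reconstruction, not necessarily silverfish’s exact wording: for any x, x*0 = 0 = 1*0. If 0/0 = 1 and the usual cancellation rules still hold, dividing both sides by 0 gives x*(0/0) = 1*(0/0), i.e. x = 1.)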

I feel kind of silly just coming in here and pointing to everyone else’s posts, but it seems the points need to be repeated.

I haven’t read many of the replies in this thread, but here are my thoughts. Not necessarily going by what I learned in school, but anyways. 0 = a number, but it holds no value (i.e. it is neutral).
0/0 = 0. Yet it is still a number (and without zero, you cannot get to one, because it is at the beginning of the number scale (excluding negative integers in this discussion)). 0/0 = 1. Does this make sense to anyone else? :confused:

… officially abandoning line of argument …

Just to say to orbifold that the aspect of the definition I was referring to is the part about uniqueness/definedness, which only exists to deal with the 0 problem.

my point is that just defining something away doesn’t make it not exist.

That’s a reasonable point, nogginhead, except that we’re not “defining something away”. There’s no reasonable definition of 0/0 in the first place. Saying that 0/0 is undefined isn’t an arbitrary decision; it’s a recognition of the fact that the definition of division we already have for other numbers just doesn’t work for 0/0, and there are good reasons why we can’t make it work.

I give in already. It’s best to define it as undefined.

Excuse me for asking, but how the hell do you have zero groups of anything, even if it’s zero groups of zero? I would definitely go with the undefined that I was taught, because it makes sense.

x÷x = 1, yes, but not for 0÷0, which is 0 and therefore not 1;

0÷0 = 0 × (1/0) = zeroes cancel = 1! ===> No way. You can’t treat 0 as an egg that is physical. 0 means nothing, therefore the operation “0 × (1/0)” is basically “0 × something”, which is 0.

When x = 0.001, sin x = 0.00099999983, therefore sin x/x = 0.99999983 ≈ 1.

The amazing thing to me is using a limit to attempt to show what happens when you divide by zero. Especially since limits were introduced precisely to describe what happens when you are near, not at, a specific point.

lim x->0 specifically refers to x =/= 0. Sheesh.

emarkp, you bring up a good point. The only reason to bring up limits would be to suggest a reasonable definition of 0/0. Since x/x and 2x/x reach different limits as x approaches 0, this does not get us anywhere.

The bottom line is: a/b is the unique number c that satisfies a = bc. For b = 0, there is either no value for c (for a non-zero) or no unique number (for a = 0) that satisfies a = bc.
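A brute-force illustration of that definition (my own sketch; the helper quotient_candidates is made up for this post):

[code]
# a/b "should be" the unique c with a == b * c. Searching a small
# set of integer candidates shows how b = 0 breaks this:
def quotient_candidates(a, b, candidates=range(-5, 6)):
    return [c for c in candidates if a == b * c]

print(quotient_candidates(6, 3))   # [2]: exactly one c, so 6/3 = 2
print(quotient_candidates(1, 0))   # []: no c at all, so 1/0 is undefined
print(quotient_candidates(0, 0))   # [-5, ..., 5]: every c works, no unique answer
[/code]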

There is no useful definition of a/0.

Well, this is a truly absurd way of looking at things! I would say limits were introduced precisely as a way of finding out what is happening at a specific point when it would be difficult to do so directly; we do it by getting arbitrarily close to the point and seeing what happens!

Thus limits are extremely useful in this case, because while we can’t understand 0/0 directly, we can understand it in the sense where the numerator and denominator are continuous functions of x [call them f(x) and g(x)] that approach 0 as x->0. That limit leads us to the conclusion that we should consider 0/0 to be indeterminate, because we can find quotients f(x)/g(x) that give any value in the limit as x->0.
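To see numerically how different choices of f and g, each tending to 0, give quotients with completely different limits (a rough sketch, nothing rigorous):

[code]
import math

# Several pairs (f, g) with f(x) -> 0 and g(x) -> 0 as x -> 0,
# whose quotients f(x)/g(x) approach different values:
pairs = {
    "x/x":      (lambda x: x,           lambda x: x),   # -> 1
    "2x/x":     (lambda x: 2 * x,       lambda x: x),   # -> 2
    "x^2/x":    (lambda x: x * x,       lambda x: x),   # -> 0
    "sin(x)/x": (lambda x: math.sin(x), lambda x: x),   # -> 1
}

x = 1e-6
for name, (f, g) in pairs.items():
    print(name, f(x) / g(x))
[/code]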


No. You are not looking at what happens at a specific point. You are looking at what happens “near” a point. The fact that lim (x->0) sin x/x = 1 does not mean that sin 0/0 = 1. Look at the definition of limit. If the limit is taken as x approaches a, the value at a is specifically excluded.
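For reference, the standard definition being appealed to here: lim (x->a) f(x) = L means that for every epsilon > 0 there is a delta > 0 such that |f(x) - L| < epsilon whenever 0 < |x - a| < delta. The “0 <” in “0 < |x - a|” is exactly the clause that excludes x = a itself.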

It is interesting that limits are brought up. The primary motivation for using limits in calculus is to evaluate lim (delta-x->0) delta-y/delta-x. Delta-y and delta-x both approach zero as delta-x approaches 0. The fact that this limit is (with qualifications) defined, while 0/0 is not, tells me that limits are no help here.
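A concrete instance: for y = x[sup]2[/sup], delta-y/delta-x = ((x + delta-x)[sup]2[/sup] - x[sup]2[/sup])/delta-x = 2x + delta-x, which approaches 2x as delta-x -> 0, even though numerator and denominator each go to 0. The limit exists precisely because we never evaluate the quotient at delta-x = 0.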

Let’s consider

lim (x->2) (x[sup]3[/sup] - 8)/(x - 2)
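Presumably the point of the example: plugging in x = 2 gives 0/0, yet for x ≠ 2 we can factor, (x[sup]3[/sup] - 8)/(x - 2) = (x - 2)(x[sup]2[/sup] + 2x + 4)/(x - 2) = x[sup]2[/sup] + 2x + 4, which approaches 12 as x -> 2. So the limit is 12, even though the quotient itself is undefined at x = 2.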