Math Question

In math, why should the denominator be rationalized? That is, why can't there be square roots (or whatever roots) in a denominator? Well, there can be, but the standard is not to leave them there. Why is this done? Can someone help me?
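(Just so we're all talking about the same thing, here's the usual textbook example; 1/√2 is only an illustration:)

\[
\frac{1}{\sqrt{2}} \;=\; \frac{1}{\sqrt{2}}\cdot\frac{\sqrt{2}}{\sqrt{2}} \;=\; \frac{\sqrt{2}}{2} \;\approx\; 0.7071
\]

Both forms are the same number; the second one just has no root left in the denominator.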


i am special. i am cool. i am doper 3000!

Because it’s sloppy looking, and an equivalent fraction can easily be written whose denominator is a whole number.

Because mathematicians are obsessive/compulsive.

Tris

Have a place for everything and keep the thing somewhere else. This is not advice, it is custom.
Mark Twain

OK. I’m intrigued. How would a whole number version of 1/pi be done?

That seems to be pretty much limited to middle and high school. I’m a second-year math major, and I’ve never seen any of my instructors bother with it, especially in linear algebra, where roots in denominators crop up all the time in unit vectors and things like that.


“That’s entertainment!” —Vlad the Impaler

The poog specifically asked about square roots (or whatever roots).

Pi is a nice, neat number in normal notation, and does not offend any mathematician’s sense of order.

Pi is, however, irrational.
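To spell out the difference (1/√2 versus 1/pi is just an illustrative pair):

\[
\frac{1}{\sqrt{2}}\cdot\frac{\sqrt{2}}{\sqrt{2}} \;=\; \frac{\sqrt{2}}{2},
\qquad
\frac{1}{\pi}\cdot\frac{\pi}{\pi} \;=\; \frac{\pi}{\pi^{2}}
\]

Multiplying by √2/√2 clears the root because √2·√2 = 2 is rational, but the same trick goes nowhere with pi: π² is still irrational, so the denominator never becomes rational and 1/pi just gets left as 1/pi.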

[quote]
In math why should the denominator be rationalized? *

Yeah, he did ask specifically about square roots, but my interpretation was that he was wondering about irrational numbers in general.

(Quote. Italics. Damn, I can’t make up my mind!)

It doesn’t have to be. But I guess it looks neater, and math teachers are anal about it.

One other thing: it might be easier to get a physical feel for the value when you see it written that way…


Chief’s Domain - http://www.seas.ucla.edu/~ravi

I’m a grad student in math, and I’ve never cared one way or another about it.

Historically, though, I think the reason is probably this. When you’re “dividing by hand” in the standard way, it’s simpler if the denominator has only finitely many digits, so that at the step where you multiply by the denominator and then subtract, you only deal with finitely many digits. The numerator, on the other hand, isn’t something to worry about: just expand it out to however many digits you need, then divide the way you normally do. (I suppose the same could be said of the denominator, actually, but it seems easier to keep track of things like accuracy this way.)

Now that calculators and computers can do it so fast either way, it’s not really important anymore.
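To make the hand-computation point concrete (the digits are just an illustration):

\[
\frac{1}{\sqrt{2}} \;=\; \frac{1}{1.41421\ldots}
\qquad\text{versus}\qquad
\frac{\sqrt{2}}{2} \;=\; \frac{1.41421\ldots}{2} \;\approx\; 0.70711
\]

In the first form you’d have to long-divide by a divisor that never terminates, so you’d have to round it off before you could even start; in the second, you look up √2 in a table to however many digits you need and simply divide by 2.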

I don’t think there is any hard and fast rule about what form is correct for the denominator. It’s just that having the denominator as simple as possible makes the value easier to combine with other expressions.
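For instance (√2 and √8 are just an illustration):

\[
\frac{1}{\sqrt{2}} + \frac{1}{\sqrt{8}} \;=\; \frac{\sqrt{2}}{2} + \frac{\sqrt{2}}{4} \;=\; \frac{3\sqrt{2}}{4}
\]

Once the roots are cleared out of the denominators, the two terms visibly share a factor of √2 and combine over a common denominator right away.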


Virtually yours,

DrMatrix

It’s like reducing rational numbers to lowest terms – simply viewed as more elegant, easier to follow, and easier to compare.

There are lots of ways of writing down a given number, and if I get a different answer from yours, we need to see if the answers are equal. By having a “standard” agreement about how to write numbers, it becomes easier for us to compare our answers (because a lot of possible forms of the same number are eliminated from consideration).
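For example (the particular forms are just an illustration): suppose your answer comes out as 1/√2, mine comes out as √8/4, and someone else gets √(1/2). Rationalize and reduce each one and

\[
\frac{1}{\sqrt{2}} \;=\; \frac{\sqrt{8}}{4} \;=\; \sqrt{\tfrac{1}{2}} \;=\; \frac{\sqrt{2}}{2},
\]

so it’s immediately obvious all three are the same number, with no pairwise checking needed.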

poogas21, be like a mathematician with constipation & work it out with a pencil.

Cabbage was right; it stems from pre-calculator days, when people had to use a slide rule, large charts of numbers, or some other now virtually extinct method. Why it’s continued, I’m not sure.