I’m a grad student in math, and I’ve never cared one way or another about it.

Historically, though, I think the reason is probably this: when you're dividing by hand in the standard way, it's simpler if the denominator has only finitely many digits, so that at each step of multiplying by the denominator and then subtracting, you only deal with finitely many digits. The numerator, on the other hand, isn't something to worry about: just expand it out to however many digits you need, then divide the way you normally would. (I suppose the same could be said of the denominator, actually, but it seems easier to keep track of accuracy this way.)
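To make this concrete, here's a small sketch (my own illustration, using Python's `decimal` module; none of these names come from anything standard) comparing the two forms of 1/√2. In the rationalized form √2/2 the divisor is exactly 2, so each long-division step is finite arithmetic and you only ever truncate the numerator; in the form 1/√2 the divisor itself is an approximation you have to truncate.

```python
from decimal import Decimal, getcontext

getcontext().prec = 30
sqrt2 = Decimal(2).sqrt()  # high-precision approximation of sqrt(2)

# Truncate sqrt(2) to more and more digits and compute 1/sqrt(2) both ways.
for digits in (3, 5, 8):
    approx = Decimal(str(sqrt2)[: digits + 1])  # e.g. "1.41", "1.4142"
    rationalized = approx / 2          # divisor is exactly 2: easy by hand
    unrationalized = Decimal(1) / approx  # divisor is the messy approximation
    print(digits, rationalized, unrationalized)
```

Both columns converge to 0.70710678…, but in the first the only error comes from truncating the numerator, which is easy to account for; in the second, every long-division step involves the truncated irrational divisor.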

Now that calculators and computers can do the division equally fast either way, it's not really important anymore.