What order do people process numbers?

People go to war over endianness: do we put the most significant bit first or the least? Different computers do it differently. Many number formats we see put the most significant part first, such as the time (hours, minutes, seconds), but Americans have this weird thing with the order of the date (month, day, year).

More fundamental, though, is the order of the digits within the number itself. 193 is 100 plus 90 plus 3, not 300 plus 90 plus 1. My friend assures me that Arabic numerals, when imported from Arabic (read: right-to-left) writing, were not reversed. In other words, an Arabic reader encounters the least significant digit first when reading text, whereas we encounter the most significant first.

So the deep question: are humans big endian or little endian?

Well, some could be Middle-endian :smiley:

Ahem. I’m not really GQ here, but: you could consider math to be an extension of natural humanity, in which case I say big-endian, as only the order of magnitude really mattered (until recently, relatively speaking).

I have 3 spears. Better save them.
The tribe has 4x members. That’s quite a few…
The stampeding herd has 2xx animals in it. Run!
And so on. :slight_smile:

I’m not quite sure what you’re getting at, but the “weird” thing we do with the date is put the parts in order from the smallest range of possibilities (1-12) to the largest (1-2003+). Whether that shows any indication of the importance of the largest number, I have no idea.

Personally, I see a number line in my head. It starts with 0-10, makes a left to 20, then arcs to the right to 100, makes a slight right to 1000 (still a slight arc), then a sharp left, and so on. I can visualize all of it from the opposite side, but normally I only see either side up to 100. After that, I’m sure I could see it either way; haven’t tried. Not important. What’s important is that it’s all equally significant as far as the digits are concerned. It’s merely how closely I focus in on the particular number (or division, multiplication, etc.) I happen to need to see for the mental problem solving.
Does this make any sense to anyone else? It has always made perfect sense to me, even as a child.

Wow! Better not mess with MajorTom.

Which is to say, “I don’t get it.” Can you draw us a map?

elfkin477, that’s a good way of putting it. It’s all about the mental model, and that one works to explain why it’s in that order. Not that I’m endorsing it – I think the size of the unit is more important than its range.

Nanoda, you could argue that the most important bit (the order of magnitude) should go closest to the object being enumerated. “My tribe has x4 members.” That’s quite a few, if the number is little-endian. Put the important information close together, right?

Even computers that store numbers in reversed order do it byte-wise; the bit order inside a byte is never changed. For example, a big-endian machine stores a 16-bit integer MSB (most significant byte) first, then LSB, while a little-endian machine stores the LSB first. But calculations always operate on the value as a whole, regardless of storage order.
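To see that byte-wise (not bit-wise) swap concretely, here is a minimal Python sketch; the value 0x1234 is just an arbitrary example.

```python
import struct

# The 16-bit integer 0x1234 (4660 decimal), packed both ways.
big = struct.pack(">H", 0x1234)     # big-endian: most significant byte first
little = struct.pack("<H", 0x1234)  # little-endian: least significant byte first

print(big.hex())     # '1234' -> bytes 0x12, 0x34
print(little.hex())  # '3412' -> bytes swapped, bits within each byte untouched
```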

That doesn’t mean an Arab will interpret the “3” in “193” as most significant.

Too many dimensions to draw a map. I could take you there, though. By the way, my days of the week have colors attached.

Hmm. I guess I’m not cut out for this. I’ll post this hard-sought and quite relevant link, as I want to ask MajorTom:
Does your numbering system have anything to do with synesthesia? (see other thread here.)

I’m not sure what that link said, because it was very long and very complex. But I did get this out of it: having the least significant digit first, like in Arabic, makes perfect sense from a reading point of view. Which could mean that the history goes thus:

We have the number 123. The Arab originators of the number system read this right to left as 3+20+100 and do impressive mathematical works (we ignore all previous number systems because we don’t want to explain their influence).

Europeans adopt this number system, since all the best maths is being imported from the Arabs, and keep the order to avoid getting confused by Arab maths.

Europeans read the number left to right… No they don’t! They read the whole number as one, if it has few digits, and parse it right to left if it has many. Otherwise you have to go “1, no, 10+2, no, 100+20+3”. Reading left to right, you don’t know what the individual digits indicate, since that depends on how many digits follow.
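To make that parsing point concrete, here is a minimal Python sketch (the function names are mine) of the two reading orders: reading right to left, each digit’s place value is fixed the moment you read it; reading left to right, everything read so far has to be re-scaled when the next digit arrives.

```python
def parse_right_to_left(s: str) -> int:
    """Little-endian reading: each digit's place value is known as it is read."""
    total, place = 0, 1
    for ch in reversed(s):              # least significant digit first
        total += int(ch) * place
        place *= 10
    return total

def parse_left_to_right(s: str) -> int:
    """Big-endian reading: every digit already read is re-scaled by the next one."""
    total = 0
    for ch in s:                        # most significant digit first
        total = total * 10 + int(ch)    # the "1, no, 10+2, no, 100+20+3" shuffle
    return total

assert parse_right_to_left("123") == parse_left_to_right("123") == 123
```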

And this ends this completely unscientific, but hopefully thought-provoking, lecture. :wink:

Nanoda:

Some seem to have been. In German, for example, numbers are spelled out three-digit-wise middle-endian, i.e. 123456 = ein[1]hundertdrei[3]undzwan[2]zigtausendvier[4]hundertsechs[6]undfünf[5]zig [digits inserted for translation purposes].

A manifestly perverse order that means you need to mentally swap digits when writing down a number from dictation. I have found that when I copy down a list of large numbers it helps if I do not try to understand them as numbers but mentally treat them as strings of characters.
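Here is a tiny sketch of that per-group digit swap (the function name is mine, and it models only the digit order, not real German number words, which have plenty of special cases):

```python
def german_digit_order(n: int) -> list[int]:
    # Per three-digit group, German says hundreds, then units, then tens
    # (the middle-endian pattern described above). Illustrative sketch only.
    spoken = []
    for group in f"{n:,}".split(","):   # e.g. 123456 -> ["123", "456"]
        h, t, u = (int(d) for d in group.zfill(3))
        spoken.extend([h, u, t])        # hundreds, units, tens
    return spoken

print(german_digit_order(123456))       # [1, 3, 2, 4, 6, 5]
```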

I’m pretty sure I’m big-endian, but I haven’t really paid much attention except when I multiply two- or three-digit numbers in my head. I do the big digits first, THEN do the smaller digits and add them to the result (probably because the larger ones are easier to remember?).

Actually, I was kidding about the days of the week. They used to have some color when I was a kid. Probably just a memory tool to learn them when I was about 2 or 3. I’ve forgotten what most of them were. My numbers have no color. Just odd or even, but each has its place on that line heading to infinity.

naita, you make a strong argument for little-endian numbers. Now that you mention it, I do have to parse the full number (especially the number of digits in it) before I can name the first digit as n hundreds or n thousands, say. Based on that argument I’d say people do process numbers from small to big; they have to, in order to figure out what order of magnitude the leading digit represents.

Like Ludovic, I add or subtract in my head using most-significant first. Odd; I certainly wasn’t taught that, but it seems to work better. Anyone else?

(Multiplying, I approximate, then use MSB first when it’s gotten simple.)
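For what it’s worth, that most-significant-first habit can be written down as a small algorithm. This is only a sketch, under the assumption that both numbers have the same number of digits; the function name is mine.

```python
def add_msd_first(a: str, b: str) -> int:
    """Add two equal-length digit strings working from the most significant
    digit down. The running total is shifted one place left before each new
    digit pair is added, so carries resolve themselves along the way."""
    total = 0
    for da, db in zip(a, b):          # leftmost (biggest) digits first
        total = total * 10 + int(da) + int(db)
    return total

print(add_msd_first("47", "85"))      # 132
print(add_msd_first("193", "309"))    # 502
```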

Not sure if this is relevant but:

If I wrote a 17-digit number on a piece of paper and asked you which digit was in the ten-millions place, wouldn’t you then process that number from right to left to count the place values? Wouldn’t that then put the emphasis on the “least significant” digit first? Things work differently with smaller numbers, because we can recognize their place values by simply looking at them. So in the case of larger numbers, I would say that we actually tend to process them from right to left.
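A quick illustration of that counting-from-the-right step (the 17-digit number below is just one I made up):

```python
n = "40857293016274859"   # an arbitrary 17-digit example, not from the thread

# The ten-millions place is 10**7, i.e. the 8th digit counting from the right,
# so you index from the right end, not the left.
print(n[-8])              # '1'
```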

By the way, Major Tom, how often do you refer to this mental number line when working with numbers?

Usually when doing the math in my head, I see the line. Sometimes I don’t really need it, but it’s only a brain click away. I multiply 3- and 4-digit numbers mentally, but it takes a while. I ain’t no savant! I round things off, do the mental math, then take the subtrahend if rounding up and do some more math, etc. The only trouble is remembering all the products to add up when done. The same works for division, but it’s mentally a little more complicated. Decimals don’t bother me a bit.
The number line has decimals too. It’s just a matter of how closely I focus in on it. Each division of integers can easily become a whole new line from 0 to whatever, depending on the accuracy needed.
I usually do this while driving. Seems to be the time when one part of the brain can concentrate on immediate needs (traffic, etc.) and the other part can figure out how badly, in percentages, someone screwed me out of a few cents, or vice versa.
I’m a little amazed that nobody else in this thread sees one too. How can one understand math without it? I don’t mean differential equations and the like, but simply sitting at a conference table and immediately realizing that the math ain’t right!