Is Base 12 mathematics easier?

Hard to explain, but octal is easier to visualize, to count on your fingers, and to memorize. I’m in the Octal Rules faction of society.

Aside from what FranticMad just posted, octal seems to me to be a good balance between the size of the multiplication table (which in my mind rules out hexadecimal and duodecimal) and the length of the resulting numbers (which rules out binary).
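
To put rough numbers on that balance, here’s a quick Python sketch of my own (the sample value N and the helper digits_needed are just illustrations): it compares the size of a base-b multiplication table (b × b entries) with how many digits it takes to write the same number in each base.

```python
def digits_needed(n, base):
    """How many digits it takes to write n in the given base."""
    count = 0
    while n:
        n //= base
        count += 1
    return max(count, 1)

N = 1_000_000  # arbitrary sample value

for base in (2, 8, 10, 12, 16):
    table = base * base  # entries in the full multiplication table
    print(f"base {base:2}: {table:3} table entries, "
          f"{digits_needed(N, base):2} digits to write {N}")
```

Binary gets a 4-entry table but needs 20 digits for a million; hex needs only 5 digits but a 256-entry table. Octal lands in between at 64 entries and 7 digits.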

(Aside to X~Slayer(ALE): I have actually seen serious arguments for converting to binary. Frederik Pohl wrote an essay on it once.)

It would, but if we lived in a base-12 society, dividing things by 5 would be fairly unnatural.

Another way to think about it is this:

0.5 (base 10) == 0.6 (base 12)
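
To see which divisors come out even, here’s a minimal long-division sketch of my own (A and B stand in for ten and eleven as base-12 digits):

```python
DIGITS = "0123456789AB"  # A and B are the two extra base-12 digits

def expand(num, den, base, places=10):
    """Fractional expansion of num/den in the given base (truncated)."""
    out = []
    for _ in range(places):
        num *= base
        out.append(DIGITS[num // den])
        num %= den
        if num == 0:
            break
    return "0." + "".join(out)

for den in (2, 3, 4, 5, 6):
    print(f"1/{den}:  base 10 -> {expand(1, den, 10):<12}  "
          f"base 12 -> {expand(1, den, 12)}")
```

Halves, thirds, quarters, and sixths all terminate in base 12, but 1/5 comes out as the repeating 0.24972497…, while in base 10 it’s just 0.2.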

The ancient Sumerians had a system that was base-12/base-60, and they all died. Coincidence? I think not.

Of course, if we were having this conversation in 1950, we probably wouldn’t be bringing up octal and hexadecimal. So many of us have had to think about bit patterns for binary computer representations that we’ve become acculturated to them. I’m not sure they are all that ideal for general use - factorability by only powers of two is a pain when you’re buying something at the grocery.

BTW, the fact that there are 8 bits in a byte makes me choose hex over octal any day. Every two hex digits is a byte. I would much rather look at a dump consisting of:

XX XX XX XX
XX XX XX XX
XX XX XX XX

than:

oOO oOO oOO oOO
oOO oOO oOO oOO
oOO oOO oOO oOO

where o is 0, 1, 2 or 3 and O is 0-7.

I find it a lot more “natural” to parse relevant bit masks and so on out of hex than out of octal. 3-bit fields just seem “weird”. Except for character tables, of course, because somebody decided that those would customarily be octal.
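
To make the mismatch concrete, here’s a little Python sketch (the sample bytes are arbitrary): two hex digits cover a byte’s two 4-bit nibbles exactly, while three octal digits split it unevenly into 2 + 3 + 3 bits, which is exactly where the oOO pattern above comes from.

```python
data = bytes([0x00, 0x1F, 0x7F, 0xFF])  # arbitrary sample bytes

print(" ".join(f"{b:02X}" for b in data))  # 00 1F 7F FF    (2 hex digits per byte)
print(" ".join(f"{b:03o}" for b in data))  # 000 037 177 377 (3 octal digits per byte)

# Pulling a 4-bit field out of a byte reads straight off the hex digits:
value = 0xAB
print(f"{value & 0xF0:#04x} {value & 0x0F:#x}")  # 0xa0 0xb
```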

Of course, next I could do my little spiel about how much better off we’d be if we had settled on a 12 bit “byte” …

Karma police, arrest this man.

Is there any indication that the human brain has a biological preference for any particular number system?

You may enjoy this page.

Hmm… I scored better on the fifths than I did on the half! :stuck_out_tongue:

yabob: Sure, hex is great for dumps, and everything that implies.

Octal is for real men doing real mathematics. Threes are natural. They keep a table steady while you’re dining with a date. A menage a trois would be just wrong with 16 people in the room. I rest my case.

Real Programmers write octal directly to core. They hand-hack opcode placement so that binary which is a subroutine on the first iteration can be data on the second, and they aren’t afraid of writing self-modifying code.

:smiley:

…cause we love to watch a pipelined architecture make everything slower. :wink: