Forgive me if this sounds naive, but here’s a Q for all you computer geeks:
Everyone knows that computers read binary (base 2) code. I assume (…and maybe I’m wrong here…) this was done because primitive computers only had the ability to “read” and/or “write” ones and zeros (ons and offs, as it were).
But, hell, that was fifty years ago. Has anyone invented the technology that would let a computer use ternary (base 3) code? Or how about good ol’ decimal (base 10) code?
Binary seems very simple, very elegant, and probably pretty foolproof – but also inefficient (storage-wise) and cumbersome (calculation-wise). I assume it’s sort of like building a house with Legos – sure it can be done, but wouldn’t real bricks work better?
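On the storage point, here’s a quick back-of-the-envelope sketch (in Python, purely to illustrate the arithmetic, not anything about real hardware): the number of digits needed to write a number N in base b is roughly log base b of N, so a higher base does use fewer digits per number.

    import math

    def digits_needed(n: int, base: int) -> int:
        # Number of digits required to represent n in the given base.
        return max(1, math.ceil(math.log(n + 1, base)))

    for base in (2, 3, 10):
        print(f"base {base:2}: 1,000,000 takes {digits_needed(1_000_000, base)} digits")

    # base  2: 1,000,000 takes 20 digits
    # base  3: 1,000,000 takes 13 digits
    # base 10: 1,000,000 takes 7 digits

Of course, each base-3 digit (a “trit”) would need hardware that can reliably distinguish three states instead of two, so fewer digits doesn’t automatically mean cheaper storage.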
Now, before everyone chimes in about the cost, difficulty and undesirability of replacing an entrenched, accepted technology, be warned: that’s not what this Q is about. I just want to know if there are substantial benefits to using a higher-base code, and if anyone has ever tried it. Thanks one and all.