Early computers did have longer bytes, and shorter ones too. There was a CDC mainframe that had ten-bit bytes specifically, like you mention. Six, nine, and twelve bits were other choices.
The eight-bit byte is the standard everyone was settling on by the 1970s, probably because it’s a power of two, and because it’s just large enough to hold the sorts of character sets that were popular at the time.
I think it would. That blunder had nothing to do with binary arithmetic or wrap-around issues. Rather, there were databases and programs that represented years as numbers from 0 to 99, making 2000 indistinguishable from 1900. The underlying representation in binary didn’t factor into it.
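To make that concrete, here’s a minimal sketch (in Python, with made-up values and a helper name of my own invention) of how a two-digit year field goes wrong:

[code]
# Minimal sketch of the two-digit-year problem described above.
# Assumes the year is stored as a number from 0 to 99; the values are made up.
def years_between(birth_yy, current_yy):
    """Age calculation the way a two-digit-year system would do it."""
    return current_yy - birth_yy

# Someone born in 1965, checked in 1999: 99 - 65 = 34. Fine.
print(years_between(65, 99))   # 34

# The same person checked in 2000: the year is stored as 00,
# so 0 - 65 = -65. The underlying binary representation never enters into it.
print(years_between(65, 0))    # -65
[/code]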
If your computers are based on binary arithmetic, you can hardly avoid octal or hexadecimal as handy shorthands. You’d have to use a decimal computer instead, or maybe a ternary one.
Actually if guns were created from scratch today, the firing mechanism would probably be electrical. The whole deal with a hammer and firing pin hitting the base of the cartridge is due entirely to how gun design evolved:
[ul]
[li]hole in breech to apply hot coal or wire to[/li]
[li]pulling a lever moves a “slow match” to the hole[/li]
[li]pulling a lever strikes a spark that sets off a powder charge that ignites the main charge[/li]
[li]pulling a lever strikes a percussion cap that ignites the main charge[/li]
[li]pulling a lever strikes a percussion cap built into the cartridge[/li]
[/ul]
Cetaceans have made a good start on that. Admittedly, they aren’t giving political stump speeches via blowhole, but they have a wide range of acoustic capability that could easily provide the necessary information bandwidth for verbal communication. That is, if they had anything they wished to convey to humanity.
However, the separate orifice and trachea of the cetaceans evolved due to specific environmental pressures, and other aquatic mammals have not seen fit to duplicate the arrangement. (Since cetaceans are the only marine mammals that regularly travel long distances in the open ocean rather than living in littoral zones, this likely increased the utility of a blowhole.) So, while it can be done (albeit not in any way that would be recognizably human), there is presumably no great value in doing so, at least from a selective point of view.
[QUOTE=Lumpy]
Actually if guns were created from scratch today, the firing mechanism would probably be electrical. The whole deal with a hammer and firing pin hitting the base of the cartridge is due entirely to how gun design evolved:
[ul]
[li]hole in breech to apply hot coal or wire to[/li]
[li]pulling a lever moves a “slow match” to the hole[/li]
[li]pulling a lever strikes a spark that sets off a powder charge that ignites the main charge[/li]
[li]pulling a lever strikes a percussion cap that ignites the main charge[/li]
[li]pulling a lever strikes a percussion cap built into the cartridge[/li]
[/ul]
[/QUOTE]
Also, I doubt we’d have revolvers; remember, they only came about because it was pretty much the only practical proposition for allowing a single-barrelled cap & ball firearm to hold more than one shot.
I can’t imagine anyone would be using Fahrenheit if we had to come up with a temperature scale from scratch today, either…
No, the CDC 6000 series had six-bit bytes for characters (no lower case). The 10 was characters in a word – they had 60-bit words, which could hold 10 six-bit bytes. (Or instructions of 15, 30, or 60 bits.)
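For the curious, here’s roughly what “10 six-bit bytes in a 60-bit word” looks like, sketched in Python. The character codes and helper names are made up for illustration; they are not the actual CDC display code table.

[code]
# Rough sketch of packing ten 6-bit character codes into one 60-bit word,
# in the style of the CDC 6000 series. Codes are illustrative only.
def pack_word(codes):
    assert len(codes) == 10 and all(0 <= c < 64 for c in codes)
    word = 0
    for c in codes:
        word = (word << 6) | c      # shift left six bits, drop in the next character
    return word                     # fits in 60 bits

def unpack_word(word):
    return [(word >> shift) & 0o77 for shift in range(54, -1, -6)]

codes = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
w = pack_word(codes)
print(f"{w:020o}")              # each six-bit character is exactly two octal digits
print(unpack_word(w) == codes)  # True
[/code]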
And you are certainly correct about the Y2K problem still happening. That was an economic decision, saving money by taking 33% less space to store each date. (And storage space was very expensive at that time!) And I don’t know that I would call it a blunder. It saved companies many millions of dollars over the years those systems were in use. Probably more than was spent on fixing it in 1998-1999. (And that cost was exaggerated, too. Many companies where I worked at that time charged a lot of overdue maintenance & upgrade work to “Y2K”, because everyone agreed that HAD to be done.)
Really, the Y2K problem happened because the programmers back then built systems much better than anyone (including themselves) expected. Nobody thought that their programs would still be running decades later, but they were. Programs nowadays don’t seem to be written that well. Microsoft operating systems have a lifetime of less than 5 years.
That’s true. Over the years, computer game packaging seems to be sliding downward – slimmer, plain smaller in general. My copy of Sims 2: Seasons (which I gotta return) is a smaller hardcase as opposed to the box for King’s Quest 5, which as I recall was the size of a huge tome.
While it makes sense to fit for existing space, there’s also more pressure to move more volume. Shelving in stores is, or is becoming, more modular.
Well, the Fahrenheit scale was proposed only 20 years earlier than Celsius/Centigrade. I can’t imagine that Fahrenheit became so intrinsically linked with science and industry in that short a time that it couldn’t be replaced.
I wish that the U.S. had embraced the metric system from the start, but we are too deeply entrenched in an awkward and inconsistent system of weights and measures, whose units have no rhyme or reason for their being and bear little relation to one another, for there to be much hope of a full-scale conversion to metric.
I would think that the phone numbering scheme would be different from what it is today if it were redesigned with future expansion in mind. The best fix would be an xxx-xxxx-xxxx system, which would allow for (theoretically) 10,000 prefixes per area code and eliminate the need for the mess of new area codes that have proliferated in recent years.
Likewise, zip codes would make more sense if they contained six digits instead of five. Each state would be assigned one two-digit prefix (e.g. “13” for Idaho, as it’s the 13th state in alphabetical order when you include the District of Columbia) and then the remaining four digits would comprise the rest of the code. This would allow for 10,000 codes per state, which would be enough for most states.
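The arithmetic behind both proposals, for what it’s worth (this assumes every digit combination is usable, which real numbering plans don’t allow):

[code]
# Quick arithmetic for the phone and zip proposals above. Assumes
# (unrealistically) that every digit combination is assignable.
prefixes_per_area_code = 10 ** 4                  # xxxx instead of today's xxx
lines_per_prefix       = 10 ** 4
print(prefixes_per_area_code)                     # 10,000 prefixes per area code
print(prefixes_per_area_code * lines_per_prefix)  # 100,000,000 numbers per area code

state_prefixes  = 10 ** 2                         # two-digit state prefix
codes_per_state = 10 ** 4                         # remaining four digits
print(codes_per_state)                            # 10,000 zip codes per state
[/code]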
Since I will probably miss the edit window I will add this in a new post.
Musical notation could be redesigned, too. The current convention of sharps and flats is daunting and perhaps intimidating to many budding musicians. If we simply assigned each of the twelve tones currently recognized by most Western music its own letter and had a scale of A through L (and why do we start with C at the beginning of the scale?), that would make more sense than a system of flats and sharps, especially when the latter creates duplicate notes (a Bb is the same as an A#, for example). Of course someone who is better versed in music theory will probably find flaws in this proposed method.
TCP/IP would be message/packet oriented rather than byte-stream oriented. Not having any notion of message boundaries makes a lot of things more complicated than they need to be.
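For example, to send discrete messages over TCP you end up inventing your own framing. A minimal length-prefix sketch in Python (the function names are mine, and the sockets are assumed to be already-connected TCP sockets):

[code]
# Minimal sketch of the framing you have to bolt on yourself because TCP
# only hands you a byte stream. Length-prefix framing; 'sock' is assumed
# to be an already-connected TCP socket.
import struct

def send_message(sock, payload):
    # 4-byte big-endian length header, then the payload itself
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_exactly(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-message")
        buf += chunk
    return buf

def recv_message(sock):
    (length,) = struct.unpack("!I", recv_exactly(sock, 4))
    return recv_exactly(sock, length)
[/code]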
DOS could have used forward slash rather than backward slash for pathname separation and made everybody’s life easier.
Having only one of big-endian vs little-endian would be nice.
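To illustrate the headache, here’s the same 32-bit value laid out both ways (Python’s struct module doing the byte-order work; the value itself is arbitrary):

[code]
# The same 32-bit integer laid out in both byte orders.
import struct

value = 0x01020304
print(struct.pack(">I", value).hex())   # big-endian:    01020304
print(struct.pack("<I", value).hex())   # little-endian: 04030201

# Reading little-endian bytes as if they were big-endian gives garbage,
# which is exactly why having only one convention would be nice.
print(hex(struct.unpack(">I", struct.pack("<I", value))[0]))   # 0x4030201
[/code]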
And I wish to God that someone had decided once and for all if things like filenames are case-sensitive or not.
Australia announced its switch to metric in 1970 and had almost entirely switched by 1981. There’s no reason we couldn’t do the same here, especially since so much of our existing supply chain (China and Japan, for example) is almost completely metricized.
Metric conversion has failed in the US (and Britain, somewhat) because people are too lazy to learn it and the government has made no effort to promote it.
That’s only in the key of C. In other keys, you start on other notes at the beginning of the scale. In the key of A, you start with A.
The key of C does seem to be the “default key” on the piano, because it only uses the white keys. Other keys are more logical “default keys” on other instruments, and people sing in different keys, so you do need other keys if you’re going to accompany a singer or other instruments.
I’m not much better versed in music theory, but here’s one: there is no key in which you go through all the notes from A to L in sequence. Your scales (I’ll assume we’re not throwing out the concept of major and minor scales, just changing the notation) are going to be a sequence with some missing letters, and I don’t really see how that’s much of an improvement. In fact, I think it’s worse:
C major scale in current notation:
C D E F G A B C
C major scale in your notation (I’ll assume we’re using A as the first letter):
A C E F H J L A
G major scale in current notation:
G A B C D E F# G
G major scale in your notation:
H J L A C E G H
F major scale in current notation:
F G A Bb C D E F
F major scale in your notation:
F H J K A C E F
In the standard notation, at least the letters generally stay the same (but sometimes have sharps or flats after them). In your notation, the sequence of letters changes from key to key. It also gets rid of quick mnemonics for keys: “G major is all natural notes (no sharps or flats) except F sharp,” or “F major is all naturals except B flat.”
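If anyone wants to check those sequences (or work out other keys), here’s a little Python sketch that maps major scales onto the proposed A-through-L lettering, assuming the new “A” is today’s C, as in the examples above:

[code]
# Maps major scales onto the proposed 12-letter notation, assuming the
# new "A" is today's C (as in the examples above).
CHROMATIC   = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
NEW_LETTERS = "ABCDEFGHIJKL"
MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]     # whole/half-step pattern of a major scale

def major_scale_new_notation(root):
    idx = CHROMATIC.index(root)
    notes = [idx]
    for step in MAJOR_STEPS:
        idx = (idx + step) % 12
        notes.append(idx)
    return " ".join(NEW_LETTERS[n] for n in notes)

print(major_scale_new_notation("C"))    # A C E F H J L A
print(major_scale_new_notation("G"))    # H J L A C E G H
print(major_scale_new_notation("F"))    # F H J K A C E F
[/code]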
It pains me, as a Unix person and oftentimes Microsoft- and Windows-hater, to say this, but I think DOS got this one right and Unix got it wrong. Forward slashes happen in normal writing, back slashes don’t. You don’t want any character as your pathname separator that anyone might be at all likely to use in a filename. I think clueless users are more likely to use / than \ in a filename.
They’d have had to pick something else for escaping characters, though. The classic example is a clueless user who manages to create a file named * (just an asterisk). Since that’s a wildcard character, it causes problems if the user then tries to remove the file *: if they naively do “rm *”, they remove all the files in the directory along with the * file. You can get around that with “rm \*”, which tells the shell that the asterisk is just an asterisk and not a wildcard.
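If you want to see that expansion happen without putting any real files at risk, Python’s fnmatch module is a rough stand-in for the shell’s globbing (the filenames here are made up for illustration):

[code]
# Rough stand-in for what the shell does with * before rm ever runs.
# The filenames are made up for illustration.
import fnmatch

files = ["notes.txt", "data.csv", "*"]

print(fnmatch.filter(files, "*"))     # ['notes.txt', 'data.csv', '*'] -- everything matches
print(fnmatch.filter(files, "[*]"))   # ['*'] -- only the file literally named *
[/code]

The [*] character class plays the same role here that the backslash (or quoting) plays in the shell: it tells the matcher the asterisk is a literal character.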
The date problem was because of the 8 bits per byte. Two extra bits and the date would likely not have had a Y2K glitch; it would have happened later. The processors did 8-bit register loads and shifts. Had they handled ten bits at a time, and later 20, much would have been different. I can remember using every bit in a byte to represent data, which is what the problem was with Y2K. Today you’d use a whole byte even if something only needed to represent on or off.
(other examples omitted) The advantages that you are listing have to do with a large vocabulary, which in my mind is completely separate from the advantages cited by iamthewalrus(:3= in favour of Esperanto: Regular grammar, regular conjugation, regular syntax.