~ (Literally)

The 3-digit numbers are extended ASCII codes and the 4-digit ones are Unicode. If you google for “ascii chart” and “unicode chart” you’ll find complete listings of the symbols with their numbers.
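To make the distinction concrete, here’s a minimal sketch in Python (the choice of the old DOS code page 437 for the 3-digit codes is my assumption; the Alt codes themselves are a Windows input feature, not something Python handles):

    # Sketch: reaching the same letter two ways, mirroring a 3-digit
    # and a 4-digit Alt code. Assumes the 3-digit codes map through
    # the old DOS code page 437.
    from_alt_164 = bytes([164]).decode("cp437")   # Alt+164  -> 'ñ' (extended ASCII)
    from_alt_0241 = chr(0xF1)                     # Alt+0241 -> 'ñ' (Unicode U+00F1)
    print(from_alt_164, from_alt_0241)            # both print ñ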

And how about Alt+0008 (backspace) (although this reply box is making a liar out of me)
Alt+0013 (Enter)
Alt+0011 (Page Break)

And if anyone can explain exactly what it is Alt+0026 does, I’d be happy (some sort of Undo/Redo is the best guess I can make, but it seems to behave a bit oddly here)

026 (control-Z) is officially SUB, for “substitute”. According to the history link I gave above, it was added between the 1963 and 1967 versions of ASCII, and was probably intended as a control for editor functions, as you suggest. Currently, it is the end-of-file character on DOS, functioning like control-D on UNIX.
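You can still trip over that DOS convention today: old text files sometimes carry a trailing Ctrl-Z. A minimal sketch of stripping it (Python; the filename and the CP437 decoding are placeholder assumptions):

    # Sketch: drop a trailing Ctrl-Z (SUB, 0x1A), the old DOS end-of-file marker.
    # "old_dos_file.txt" is a placeholder name; cp437 is an assumed code page.
    data = open("old_dos_file.txt", "rb").read()
    if data.endswith(b"\x1a"):
        data = data[:-1]              # remove the old end-of-file marker
    text = data.decode("cp437")
    print(text)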

Most of the original “control” characters were eventually subverted in various contexts for things other than what their names would suggest. BS, LF, FF, CR, ESC, and DEL have pretty much remained intact (let’s not have a CR/LF vs. NL discussion right now, though).
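For reference, those survivors and their ASCII codes (a quick Python sketch; the codes are the standard assignments):

    # The control characters mentioned above, with their ASCII codes.
    for name, code in [("BS", 8), ("LF", 10), ("FF", 12),
                       ("CR", 13), ("ESC", 27), ("DEL", 127)]:
        print(f"{name:>3} = {code:3d} (0x{code:02X})")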

I keep giving that history link because it contains a lot of details of ASCII’s evolution to the 1967 form, which is essentially our current 7-bit ASCII. Without wading through those details, you should take away that it evolved from a lot of earlier standards, and was designed through a long, drawn-out committee process, with several competing influences fighting over the limited space. Some things WERE introduced because of the wishes of the programming language designers at the time, notably COBOL advocates. Some things, like the tilde and other “diacriticals”, were favored for non-English text representation and markup.

Of course, like the control characters, punctuation marks got appropriated by software designers for a variety of syntactical uses, such as the “modern” uses of the tilde we have listed.

Interesting that the quote I gave mentioned “graphical build up” by producing the character, followed by backspace, followed by its diacritical. That was a solution obviously aimed at paper terminals, which were prevalent in 1967. Things would have been better for translation to CRT screens if the “build up” had been the other way around - diacritical mark first. That way, you would lose the diacritical, not the letter, on a CRT. But people naturally thought of typing the character first, backspacing, and typing a modifier like an underscore on paper terminals. When paper was still common, you would sometimes get text produced that way that displayed as a series of underscores on CRTs. People used to write various filters to change character-BS-underscore sequences in files to underscored characters for various CRTs, or at least swap things around to put accent characters first.
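One of those filters might have looked roughly like this sketch (Python, heavily simplified; the real ones were written for specific terminals and file formats):

    import re, sys

    # Sketch of an overstrike filter: collapse paper-terminal underlining
    # ("X<BS>_" or "_<BS>X") down to the bare character for a CRT.
    def strip_overstrikes(line):
        line = re.sub(r"(.)\x08_", r"\1", line)   # char, backspace, underscore
        line = re.sub(r"_\x08(.)", r"\1", line)   # underscore, backspace, char
        return line

    for line in sys.stdin:
        sys.stdout.write(strip_overstrikes(line))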

Hardware, software and just about anything else I can think of should always be designed for the end user and not for the convenience of the developer/designer. Indeed, as Shalmanese points out, if there had been no tilde on the keyboard, programmers would have simply picked another character for whatever meaning the tilde now has for them. In designing a keyboard, the thought process should have been directed at what characters the millions of end users would want, not what the handful of programmers wanted.

The basic question I see in the OP is “Why the tilde? Why not any of the other possible characters? What were they thinking?”

At the time the design of the computer keyboard was standardized, there weren’t “millions of end users”, but only the “handful of programmers”. Remember that computers were around for quite some time before they became a general household and business tool. The cent sign is perhaps the only symbol you mentioned before that would have made sense to add to keyboards at the time. Generally, graphic additions such as the bullet and the square came along much later.

Yes, remember that when these early standards were being hashed out, computers were mainframes - locked away in back rooms, owned by businesses and universities - and only qualified personnel were allowed access to them. The idea that every secretary, nurse, writer, accountant, student, garage mechanic, and store clerk would have a computer on their desk was ludicrous. They didn’t spend a lot of time on usability studies, since they assumed that every computer user would also be a computer engineer or trained programmer.

Well, when exactly was the “design of the computer keyboard … standardized”? To a degree, that would go all the way back to the early manual typewriters, but for our purposes we could consider it to be back in the '80s, about the time the PC was first being marketed. Prior to that, mainframes had their own proprietary keyboards.

The whole purpose of the PC (Personal Computer) was to put these tools on the desktops of ordinary people everywhere. At that point the keyboard designers should have been thinking about the end users, not the programmers.

It’s worth noting that many of the special characters that are now included on keyboards are pretty arbitrary. Why the pound sign, the at sign or the caret? If you look at the ASCII table, it seems the tilde was included just to fill up a space. They had 128 spaces, and they had run out of the really important characters, but they had to put something there, so the tilde it was.
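For what it’s worth, the tilde does sit in the very last printable slot of the table - a one-liner to check:

    # The tilde is code 126, the last printable slot before DEL (127).
    print(ord("~"))   # 126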

No, but it’s similarly ludicrous to think that a programmer would go out and INVENT a symbol just so that he could use it in his language. I’m not talking about the ASCII standard here; that could well have been influenced by a programming language. I’m referring to the FIRST keyboard which had a tilde present. Obviously, this would have been done for reasons other than programming.

From www.asciitable.com:

What appears to have happened is that ASCII was developed specifically for teletypes and eventually was adopted as a de facto standard. It’s possible that the tilde had an important function for the things that teletypes were used for.

I used to use Extended ASCII when I first began to type diacritic characters.

But then I found out that Unicode was the wave of the future, and switched to exclusively Unicode.

And by now, Unicode is the wave of the present, and everything else belongs in the trashcan of computing history.

UNICODE! UNICODE! UNICODE!

Well, here’s a relevant cite

In 1961, it was grouped with other “math symbols”, then offered as an alternative for the overline, the ^, and the # symbols, then officially added to the ASCII standard as its own character in 1966, at the request of SHARE.

So that’s why it’s in ASCII. As for why it got onto keyboards, I would suggest that, of the original 128 ASCII characters, all of the “printable” characters are present on a modern keyboard. The bullet, the cent sign, etc., aren’t part of that set.
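If you want to eyeball that claim yourself, here’s a quick Python sketch that prints the 95 printable ASCII characters so you can check them against your own keyboard:

    # Print the printable ASCII characters (codes 32-126); every visible one
    # should have a key (possibly shifted) on a standard keyboard.
    print("".join(chr(c) for c in range(32, 127)))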

Also see this page

And SHARE, I should point out, is the IBM users group. So it was included as a character that the end users at the time wanted.

-lv

Oh yeah? You do better without using the IPA.

seh-NYOR

What’s wrong with IPA? There is an ASCII adaptation of IPA for online use, you know. I was objecting to how you transcribed the first syllable as “see,” implying a nonexistent Spanish word *siñor.