But who decided we really need ~`{}<>|/^@_? Or even &#%? None of them are characters you would often need to use in typing text.
It should be noted that the first computer language to use * and / was Fortran, and all subsequent languages basically inherited them from it. It’s not that the other languages were descended from Fortran, but that all programmers were familiar with Fortran’s use of those symbols for those operations.
In Fortran (and many other languages) variables all start with an alphabetic character. Compilers use that rule to parse code – whenever they see a letter following a non-letter in the code, they say (in effect) “OK, here comes a variable.” So using an X for multiplication would mean they’d have to make an exception for that letter. That is, variables would begin with any letter but X. But X is a commonly used variable in math, so this was not a good idea.
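A toy sketch of the problem (my own illustration in Python, not how any real Fortran compiler works), assuming the simple “identifiers start with a letter” rule:

```python
import re

# Toy sketch (not any real Fortran scanner): identifiers are anything
# that starts with a letter, so a lexer can split an expression with a
# simple pattern and treat * unambiguously as an operator.
TOKEN = re.compile(r"[A-Za-z][A-Za-z0-9]*|\d+|\S")

print(TOKEN.findall("Y = A * B"))   # ['Y', '=', 'A', '*', 'B']

# If X were the multiplication operator instead, the same rule would
# swallow it as an identifier, and the scanner could no longer tell
# "A X B" (A times B) from a reference to a variable named X.
print(TOKEN.findall("Y = A X B"))   # ['Y', '=', 'A', 'X', 'B']
```

With * the operator stands out on its own; with X, the scanner would have no way to tell the operator from a one-letter variable.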
The tilde, backtick, and caret characters were only on typewriters for use as diacritics in languages other than English. If you wanted to talk about, say, Carlos Peña, you’d type his last name as “Pen”, hit backspace, strike the “~” over the n, and then type the “a”.
Well, that’s three of them.
The square brackets would be used for editorial comment within a quote, and &, #, and % would be used for the same things we use them for today. | is mostly there for use in screen graphics (it’s not on most typewriters), and that’s also the reason \ was created (though it’s since found extremely widespread use in programming). @ would be used in invoices, and _ is for making underlines or blanks for filling something in. I don’t know what the angle brackets and curly braces would be used for, though.
You Forth HeardOf? You backwards WriteForces.
Also, the reason the caret is used for exponentiation in some languages (BASIC and derivatives, but not FORTRAN*, BASIC’s direct ancestor) is that it was originally an up-arrow in early versions of the ASCII standard. Therefore, 2^3 would have looked like 2↑3 on systems that implemented that early ASCII in their display hardware (either a teletypewriter or a cathode ray tube, likely on a terminal).
*(Why did FORTRAN use the two asterisks? Probably because it was implemented on many different computers, many of which had their own character set, character encoding, and, therefore, their own unique repertoire of characters they could handle. The asterisk, due to its presence on typewriters, was more likely to be supported than the caret.)
FORTRAN predates ASCII if I remember correctly. When I learned it, we were restricted to the EBCDIC character set, which didn’t have a ^.
It does, as does COBOL.
EBCDIC was popular because IBM invented and used it. (They still use it. Old software never dies in IBM-land.) I don’t think it gained much traction outside of IBM’s immediate sphere of influence, though.
I just noticed this. It was written by Tom Jennings, one of those obscure historical characters who’s played a larger role in your life than you likely imagine:
That last one is particularly interesting: Since he wrote a popular BIOS, his code is in a huge number of personal computers around the world, and it gets run every single time they’re turned on.
So that’s your historical moment of Zen for the day.
When I took a computer course, the prof mentioned “the whole world uses ASCII except IBM” and everyone laughed, because at that time IBM had over 95% of the computer market. Almost EVERYTHING computer-wise - electricity bills, registration cards, government forms, etc. - was either punch cards (EBCDIC punched) or forms that got punched onto those 80-column cards.
Ten years later, the same comment drew a laugh because - thanks to the IBM PC using ASCII - it meant IBM was a dinosaur and behind the times with their mainframes.
Remember, punch card technology preceded computers - it was designed for a census around 1900 and used by businesses for decades. IBM dominated that market before computers. They had sorters - you can sort a bunch of cards by a big number by sorting on the lowest digit, then the next lowest, and so on. You could add a pile of cards by running them through a tabulator. (Provided there were no hanging chads - I never saw an IBM punch that left a hanging chad.)
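That card-sorter procedure is what we’d now call a least-significant-digit radix sort. A rough sketch in Python (my own illustration, obviously not how the machines actually worked):

```python
# Rough sketch of the card-sorter procedure: a least-significant-digit
# radix sort.  Each pass drops every "card" into one of ten pockets by
# a single digit, then restacks the pockets in order.
def card_sort(cards, digits):
    for place in range(digits):              # lowest digit first
        pockets = [[] for _ in range(10)]
        for number in cards:
            pockets[(number // 10**place) % 10].append(number)
        cards = [n for pocket in pockets for n in pocket]
    return cards

print(card_sort([802, 24, 170, 45, 75, 90, 2], digits=3))
# [2, 24, 45, 75, 90, 170, 802]
```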
When computers came along, cards were the ideal input, and programs were written and punched onto cards that could be fed into the computer. Many a scream was heard back then from a programmer who dropped his deck of cards. (Or set it on the lid of the printer, which started to open automatically when the box of paper was empty!)
The character set available to programmers was, by default, the one that business customers had needed over the years and that could be represented on punch cards. A character or number could be put in each of 80 columns on the card. Numbers were holes punched in rows 0 to 9. Characters were a hole in one of the top two rows (unnumbered) or 0, plus a second hole in 0 to 9, giving 29 characters; other combinations covered punctuation. IBM devised a denser format, where a column could represent a byte, but the cards were much more fragile.
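For what it’s worth, here’s a rough sketch of that zone-plus-digit letter code as I understand it (the standard Hollerith letters only; real card codes varied by machine, so treat this as illustrative, not a spec):

```python
# Rough sketch of the classic Hollerith card code for letters:
# one "zone" punch (row 12, 11, or 0) plus one digit punch.
ZONES = {"12": "ABCDEFGHI",   # zone 12 + digits 1-9
         "11": "JKLMNOPQR",   # zone 11 + digits 1-9
         "0":  "STUVWXYZ"}    # zone 0  + digits 2-9

def punches_for(letter):
    """Return (zone_row, digit_row) for an uppercase letter."""
    for zone, letters in ZONES.items():
        if letter in letters:
            # zone-0 letters start at digit 2, the others at digit 1
            offset = 2 if zone == "0" else 1
            return zone, letters.index(letter) + offset
    raise ValueError(f"no letter encoding for {letter!r}")

print(punches_for("A"))  # ('12', 1)
print(punches_for("S"))  # ('0', 2)
```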
Cards were needed because memory was ***&& expensive. A colleague of mine remarked once that he learned assembler to squash down his programs, since the mainframe he worked on would not compile his bigger COBOL programs with only 40K of RAM.
IIRC, wasn’t ^ used for exponentiation in the original BASIC because it was simpler than trying to parse “*” vs “**”?
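A toy illustration of that parsing point (my own sketch in Python, not anything from Dartmouth BASIC): a single-character “^” needs no lookahead, while “**” forces the scanner to peek at the next character to tell multiplication from exponentiation.

```python
# Toy scanner showing the lookahead cost of "**" versus "^".
def tokenize(expr):
    tokens, i = [], 0
    while i < len(expr):
        ch = expr[i]
        if ch == "*":
            # need one character of lookahead to distinguish * from **
            if i + 1 < len(expr) and expr[i + 1] == "*":
                tokens.append("POW")
                i += 2
                continue
            tokens.append("MUL")
        elif ch == "^":
            tokens.append("POW")      # no lookahead needed
        elif not ch.isspace():
            tokens.append(ch)
        i += 1
    return tokens

print(tokenize("2 ** 3 * 4"))  # ['2', 'POW', '3', 'MUL', '4']
print(tokenize("2 ^ 3 * 4"))   # ['2', 'POW', '3', 'MUL', '4']
```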
Also note that on the original crappy dot matrix printers, and even the big IBM line printers, “*” tended to be ***big*** and plunk in the middle of the line, not politely superscripted to indicate footnotes. Thus it was in the right place and looked like a multiplication symbol, just with six arms instead of four.
They are not used in place of those signs - it’s not as though computer keyboards were designed afresh and the correct symbols for mathematical operators deliberately omitted. Rather, those symbols are used (out of those provided on the typewriter keyboard) because they are the best available characters for the job.
Multiplication is indicated in algebra by the X, but using that in a formula is confusing as X is a common variable, so the * is the next-best symbol available.
For division the / works well because it correctly indicates that the numerator is over the denominator; if you imagine taking a properly-written fraction and moving the denominator to the right and up onto the same line as the numerator, the fraction bar (vinculum) would indeed end up looking like the forward slash (solidus).