We could do with a bit of a programming history lesson here.
As others have said, it goes back to the days of FORTRAN, which in turn seems to have borrowed the usage from common algebraic practice.
There’s more that could be said, for you software history buffs.
The most extensively used version of FORTRAN for a long long time was FORTRAN IV. But before that there was FORTRAN II, which is what I first learned. There was no FORTRAN III.
All versions of FORTRAN had a neeto-keeno :rolleyes: feature that seemed like a great idea at the time, but has since been universally recognized as a major clusterfuck: You never had to declare variables (although you could if you wanted).
The syntax was clear enough that the compiler could tell variables from keywords by context, even without the variables being declared before they were used. The keywords of the language weren't even reserved words, and could be used as variable names (with just a few exceptions). As the compiler read the source code, line by line, the first time it encountered any mention of a variable, that variable was automatically declared. This saved you the hassle of having to declare them all.
So there was a default assumption about names and types: Any variable beginning with I, J, K, L, M, or N was automatically made an INTEGER; all others were automatically made a REAL (what we now commonly call a float). You could, however, explicitly declare any variable with any name to be of any type you wanted. (I don't even remember whether that possibility first appeared in FORTRAN II or IV.) Note that FORTRAN II didn't have very many data types; I think INTEGER and REAL were pretty much it.
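To make that concrete, a fragment like the one below would compile with no declarations anywhere in sight. (My FORTRAN is decades rusty, so take this as a sketch rather than gospel; the variable names are made up just for illustration.)

C     NO DECLARATIONS ANYWHERE; THE TYPES COME FROM THE FIRST LETTER.
C     KOUNT STARTS WITH K, SO IT IS AUTOMATICALLY AN INTEGER.
C     TOTAL STARTS WITH T, SO IT IS AUTOMATICALLY A REAL.
      KOUNT = 10
      TOTAL = 3.5 * FLOAT(KOUNT)
      WRITE (6, 100) KOUNT, TOTAL
  100 FORMAT (1X, I5, F10.2)
      STOP
      END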
FORTRAN IV added more data types – LOGICAL (what we now call Boolean), DOUBLE PRECISION (which I believe first appeared in FORTRAN IV, not II, and meant a double-length float), and even COMPLEX. (Yes, COMPLEX was a native data type.) Variables of these types all had to be declared, as no names automatically defaulted to any of them.
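Those you had to spell out yourself, along these lines (again, working from memory, so treat the details loosely):

C     NONE OF THESE TYPES HAD A FIRST-LETTER DEFAULT, SO EACH
C     VARIABLE NEEDS AN EXPLICIT TYPE STATEMENT.
      LOGICAL DONE
      DOUBLE PRECISION BIGSUM
      COMPLEX ROOT
      DONE = .TRUE.
      BIGSUM = 0.0D0
      ROOT = (1.0, -2.5)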
So what was the clusterfuck? Have you figured it out yet?
The problem was that any name you accidentally mis-spelled as you keypunched your program onto a deck of cards was automatically defined as a distinct new variable, different from the one you meant, with no warning whatsoever to the programmer. Some compilers tried to be helpful by warning you about variable names that were used only once, or that appeared on the right-hand side of an = before ever appearing on the left-hand side. But that was always a hit-and-miss approach.
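Here's a made-up example of the kind of thing that would bite you; the names are invented purely for illustration:

C     THE PROGRAMMER MEANT TO ACCUMULATE INTO VELOC, BUT ONE CARD
C     GOT PUNCHED AS VELOS.  THE COMPILER QUIETLY CREATES VELOS AS
C     A SEPARATE REAL, VELOC NEVER CHANGES, AND THE PROGRAM PRINTS
C     0.0 WITH NO ERROR AND NO WARNING.
      VELOC = 0.0
      ACCEL = 9.8
      DELTAT = 0.1
      DO 10 I = 1, 100
      VELOS = VELOC + ACCEL * DELTAT
   10 CONTINUE
      WRITE (6, 100) VELOC
  100 FORMAT (1X, F10.3)
      STOP
      END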
So programs were always full of subtle errors, arising from mis-spelled variable names that you probably never noticed. Proof-reading your programs was more important than ever before or since, and that still wasn’t fool-proof. That is why rocket ships blew up on launching pads, why submarines sank, why medical devices killed their patients, why the Great Depression happened, and why Alexander the Great never got farther than India.
Algol was the first language to come into widespread use that required ALL variables to be declared. A lot of programmers (this one included) thought that was a total pain in the ass at the time, and totally unnecessary bullshit. But in the larger scheme of things, that simple rule was probably as important as the invention of the digit 0.
Forget about that IMPLICIT statement in later versions of FORTRAN. That was just an afterthought. And, as you can see, it just defeated the purpose (well, one of the purposes) of requiring declarations, which programmers of the time were just beginning to appreciate.
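For anyone who never ran into it, IMPLICIT just let you remap which first letters defaulted to which types, something like this (again, a rough sketch from memory):

C     NOW EVERY UNDECLARED NAME STARTING WITH A-H OR O-Z DEFAULTS
C     TO DOUBLE PRECISION INSTEAD OF REAL.  MISSPELLED NAMES STILL
C     SAIL RIGHT THROUGH WITHOUT A PEEP.
      IMPLICIT DOUBLE PRECISION (A-H, O-Z)

(The much later IMPLICIT NONE, which actually forces you to declare everything, is a different and far saner animal, but if I remember right that didn't become standard until Fortran 90, long after the damage was done.)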
So you can see, what we later came to know as Hungarian Notation had its earliest roots in FORTRAN, although Charles Simonyi (who was a Computer Science grad student at UC Berkeley at the same time I was a freshman or sophomore there) took that idea and built extensively upon it. (I vaguely knew who he was, but I’m certain he didn’t know me, or even know of me. He wrote the Berkeley SNOBOL compiler/interpreter, and some of my classes were done using his version of the SNOBOL language.)
So, you think Hungarian Notation is perverted? You don't know the half of it! Hungarian Notation, as it popularly became known, is indeed a perverted and garbled version of the scheme as Microsoft (and presumably Simonyi himself) originally meant it to be. For an interesting history of this, and a general essay on using variable naming conventions to help with software reliability, see Making Wrong Code Look Wrong by Joel Spolsky, May 11, 2005. His discussion of Hungarian Notation, and how it got perverted, begins about two-thirds of the way through the essay. Recommended reading!