Extremely basic programming question.

The other way around would be more sensible. Languages that have a defined Boolean data type along with strict typing to maintain that “wall of separation” between data types – these could represent TRUE and FALSE any way they want internally (with no need to be consistent with anyone else).

FORTRAN IV had such a type, and every compiler had its own way of representing it. One popular and sensible method was to represent FALSE as a word of all-zero bits and TRUE as a word of all-one bits. A word of all 1-bits, if seen as an integer, would have a numeric value of -1 on most (but not all) computers.
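A minimal C sketch of that two's-complement point (the variable names here are mine, just for illustration):

#include <stdio.h>

int main(void) {
    unsigned int all_ones = ~0u;   /* a word with every bit set */
    int as_int = (int)all_ones;    /* -1 when reinterpreted on a two's-complement machine */
    printf("%d\n", as_int);        /* prints -1 */
    return 0;
}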

A few computers stored integers in “one’s complement” form instead of the more usual “two’s complement” form, with the result that negative integers were different. In one’s complement form, a word of all-zero bits was “positive zero” and a word of all 1-bits was “negative zero”, which behaved like positive zero for arithmetic purposes but was still a word of 1-bits for logical or boolean operations.

C was the first language (that I know of) to officially proclaim that logical values were simply the integers 0 and 1, to define which was which, and to decree that 0 is false and anything non-zero is true. It seemed like a good idea at the time (some people thought), but programming experience over the last 40 years has shown what a BAD idea this really was.
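A quick C illustration of that convention (just a sketch, nothing more):

#include <stdio.h>

int main(void) {
    int x = -7;
    if (x)                     /* any non-zero value counts as true */
        puts("non-zero is true");
    printf("%d\n", 5 > 3);     /* comparisons yield exactly 1 ... */
    printf("%d\n", 5 < 3);     /* ... or exactly 0 */
    return 0;
}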

This leads to a corollary: Being a good programmer has something in common with being a good teacher:

To be a good teacher, you have to be good in TWO areas: You have to know how to teach, and you have to know a lot about some specific field that you are going to teach.

Likewise, to be a good programmer, you have to be good at programming, and you also have to be good at programming SOMETHING. If you are going to write financial software, you have to be good at programming AND you have to know all about accounting or whatever financial specialty you are into. If you are going to write software for physics applications like Chronos, you have to be good at programming AND you have to know your math and physics.

Chronos is here describing the common case of someone being good at his specialized field but only a good-enough programmer to get the basic job done. (He doesn’t say if this is himself he is describing.)

I shared an office with a computer scientist once upon a time. She told me of a study she had done for her Master’s Thesis: She collected samples of large application programs written by professionals in their fields, but who weren’t really professional programmers, and studied the structure and coding of those programs. What she found, in summary, was that it was all shitty programming.

See what you started? :stuck_out_tongue:

(Are you following all this, Sicks Ate?)

Time was, programming languages were simple, and there weren’t that many to choose from. It was FORTRAN or COBOL.

COBOL was big and messy, but FORTRAN was relatively simpler. My first language was FORTRAN II for the IBM 1620. (I’m giving away my age here.) The ENTIRE programming reference manual was a booklet about 40 pages long, which I read in its entirety in one sitting, in a couple hours, and I knew FORTRAN through-and-through.

Just try that with Java today!

Today you need to know ten or more languages, and they all have simple teach-yourself books that are 600 to 1000 pages long, which you can buy for $50 to $100.

You might teach yourself a few of these and get into free-lance programming. But if you want to, you know, get an actual JOB, employers might not give your resume a second glance unless you know C++/C#, Java, JavaScript, PHP, SQL, HTML5, CSS, Perl, Ruby, Visual Basic, Everything.NET, INTERCAL, Befunge, and Malbolge.

Slightly more seriously, consider getting into web programming, which you might well be able to do free-lance. That seems to be a really big shew these days.

C is extremely weakly typed by modern standards in that there’s no notion of type safety for the language whatsoever. You can cast almost anything to anything else.
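For instance, here is the sort of reinterpretation C lets you do with an explicit cast (a sketch; viewing an object through an unsigned char pointer happens to be one of the few casts the standard actually blesses):

#include <stdio.h>

int main(void) {
    double d = 3.14159;
    unsigned char *bytes = (unsigned char *)&d;   /* view the double as raw bytes */
    for (size_t i = 0; i < sizeof d; i++)
        printf("%02x ", bytes[i]);                /* dump each byte in hex */
    putchar('\n');
    return 0;
}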

Well, it sort of is me. I can write an interface, and I occasionally do, but I usually don’t bother, because again, it’s just me who will be using the program. Or rather, I do create interfaces, but they’re optimized for use by me specifically, with no regard for ease of use by anyone else (for instance, writing one program to output source code that I’ll cut-and-paste into another program which I then recompile).

In any event, I’m still not nearly skilled enough to be an actual professional programmer.

Of course not, IANAP :slight_smile: But it’s fun to read and fill in blanks contextually when I can.

Speaking of Java, though, I have seen jobs on Careerbuilder etc. that specifically say ‘Java programmer.’ I assume from what you said that it’s implied that a successful candidate would know quite a bit more than ‘just Java.’

Additional required reading for anyone who envisions becoming a Real Programmer.

Real Programmers Don’t Use PASCAL by Ed Post, 1983.

It’s FORTRAN and TECO all the way down.

I laughed when I saw that the answers to an “extremely basic” question had gone on two pages.

I’m the one who originally posted the link, so I think it’s a really interesting topic of discussion. The linked study definitely matches with my experience in school and around people trying to learn to write code. Some people get the idea almost immediately, and all you have to do is point them at a reference and help them decipher the more cryptic compiler errors and they’re off to the races. Others… really struggle. And never really figure it out very well. The difference seems much starker than in any other subject I’ve ever studied (except maybe music).

From what I’ve been picking up on (and admittedly, I’m not in the thick of the industry these days), Java hasn’t really taken the world by storm like it was supposed to. JavaScript (which isn’t Java at all) is much more extensively used, due to its prominence in web development. In fact, Object Oriented Programming in general, for all its hype, has also generated a lot of criticism. While it’s apparently here to stay, its future direction isn’t all that clear. C++ and C# in particular have come in for a lot of criticism, because they (well, C++ at least) retain too much baggage from plain-old-C that doesn’t belong in an OOP language.

JavaScript suffers all the defects of a dynamically-typed language (and more) – it doesn’t care what type of data a variable contains, and doesn’t even care if you try to use a variable that has never been defined or initialized. Data structures aren’t defined, but simply contain whatever fields you happen to stick into them.

Some languages are just TOO EASY, in the sense that they let you get away with all kinds of shitty programming, without too much discipline on your part. Then you spend the next 50 years trying to debug it.

Other languages are the opposite: VERY STRICT, require extensive programming self-discipline just to get the damn thing to compile, but the compiler catches or prevents all sorts of errors for you that would be easy to make and hard to find or fix otherwise. Even so, somehow, programmers STILL write programs full of bugs. (ETA: This is exactly the kind of language that no Real Programmer would be caught using! :stuck_out_tongue: )

Just to throw fuel on the fire:

Strong or weak typing means almost nothing in particular and has nothing to do with the quality of code produced. Strong typing generally refers to declaring data so that a compiler can optimize the space and processing requirements, and/or produce errors for mixed-type usage. These declarations have little use in interpretive languages and may make the code less efficient because of the additional type checking required at run-time. I’ve seen terrible code written in every language, and it has nothing to do with the mode of data typing.

I’ll also point out that physicists, chemists, engineers, salesmen, and virtually anybody else could be a skilled programmer given sufficient experience and background knowledge, and professional programmers can produce some really shitty code. It’s just that people who spend most of their time in another field aren’t as likely to acquire the experience and background necessary to produce elegant code.

We all have our preferences, but classifying one mode of programming as better than others is usually just an expression of that preference.

It’s been said that a real Fortran programmer can write Fortran code in any language, and a real C programmer can write C code in any language. I have discovered that it is likewise true that a bad programmer can write bad code in any language.

So! To paraphrase Ed Post (cited a few posts above): The determined Terrible Programmer can write Terrible Programs in ANY language. [And routinely does.]

Strong compile-time typing, and other strong compile-time strictness, can help the programmer avoid or catch many kinds of technical errors. The really strict stuff, like Java and C++ (sort of), prevents you from setting a reference to an object that doesn’t have a field that you thought it did, or from referring to an object or function or method from someplace where you aren’t supposed to know about it, and a lot of stuff like that. Strict run-time type checking can catch data errors (doing an operation on data of the wrong data type), and in particular can catch things like out-of-bounds array references (perhaps one of the greatest things that a programming system can catch, given the amount of grief this has caused, accidentally or deliberately, and especially deliberately!). But that kind of stuff comes at great expense in terms of software speed and efficiency.
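Here is a tiny sketch of the kind of run-time bounds check described above, done by hand in C (which performs no such check on its own; the helper name safe_get is made up for this example):

#include <assert.h>
#include <stdio.h>

#define LEN 10

int safe_get(const int *arr, size_t len, size_t i) {
    assert(i < len);              /* fail loudly instead of reading past the end */
    return arr[i];
}

int main(void) {
    int data[LEN] = {0};
    printf("%d\n", safe_get(data, LEN, 3));   /* in bounds: fine */
    /* safe_get(data, LEN, 42); */            /* would trip the assert at run time */
    return 0;
}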

This says NOTHING about a programmer’s ability to, you know, actually design an algorithm! Computer science professors got so enraptured by the ideology that a programmer could DO NO WRONG, given only a sufficiently strict language, that they forgot to teach basics like: you should initialize your variables.

True story: Circa 1971, when Pascal was all the rage (because even the veriest beginner could do no wrong, so strict and marvelous was Pascal!), I helped a beginning student write a simple assignment: read a sequence of numbers and add them up.

Okay, so the student understood that he should declare a var, and read numbers, and add each new one to the variable. He also managed to figure out that he had to do something different with the FIRST number: He should simply set that into the variable rather than add it to the current value. He didn’t know how to test if his read-number loop was in its first iteration, though.

Solution: Just pre-set the variable to 0 before beginning the read-number loop. Somehow, apparently, his professor didn’t mention things like that. The point of this story: With the advent of Pascal, professors got so into teaching the Gospel of Wirth that they forgot to teach actual programming along with it! A whole generation of college-age new programmers got raised up in that environment – and some of those even became the next generation of professors!
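The fix, in miniature (written here in C rather than Pascal, just to sketch the idea):

#include <stdio.h>

int main(void) {
    int sum = 0;                   /* pre-set the accumulator before the loop */
    int n;
    while (scanf("%d", &n) == 1)
        sum += n;                  /* the first number needs no special case */
    printf("total = %d\n", sum);
    return 0;
}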

No notion whatsoever? That’s a bold claim.

I fully agree that C casting, and in particular pointer casting, eliminates any guarantees about the contents of an object. But for the most part, C will not do any dangerous implicit casting without a warning or error. This will warn, for instance:
int pi = 3.1415;

This will give you an error:
void f(int *a) { }

char arr[100];
f(arr);

I’ll buy that C is much less strongly typed than some other languages, but at the same time it is much stronger than some others. If you go for an all-or-nothing definition, you also have to include C# and Java as weakly typed languages, since both allow unsafe code.

The determined Real Programmer can write FORTRAN programs in any language. (Ed Post, loc. cit.)

Even a real C programmer can write FORTRAN code in any language. Heck, all a Real C programmer is doing is writing assembly language in C. To cite one of the worst abominations, straight from Kernighan and Ritchie, the following example is given of the complete executable code needed to copy a character string from src to dst:

while ( *dst++ = *src++ ) { }

This works. But if you think you know why, you’re probably wrong (unless you aren’t). I had to stare at it for a long time to figure out why it works. On the surface, this loop continues to copy bytes as long as they are not zero. Why does it copy up to and including the 0-byte that terminates the string? Once properly understood, this code is seen to be the worst kind of violation of the very principles of structured programming that were considered so important – all the worse because it looks like such proper structured programming on the surface! (Exercise for the reader: Can you see and explain why it’s a violation of structured programming, and all the worse because it looks so right?)
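For anyone still staring at it, here is one way to unpack that one-liner (spoiler for the exercise above); this longhand version is my own sketch, not K&R’s, and the name my_strcpy is made up:

#include <stdio.h>

void my_strcpy(char *dst, const char *src) {
    char c;
    for (;;) {
        c = *src++;        /* fetch the next character; the assignment's value IS that character */
        *dst++ = c;        /* copy it, the terminating 0-byte included */
        if (c == '\0')     /* the test comes AFTER the copy... */
            break;         /* ...an exit-in-the-middle loop dressed up as a tidy while */
    }
}

int main(void) {
    char buf[32];
    my_strcpy(buf, "hello");
    puts(buf);             /* prints "hello" */
    return 0;
}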

I wrote two cross-compilers for FORTRAN programmers to be able to easily adapt to C and old interpretive BASIC. I created numerous similar systems to enable cross usage and projection of multiple languages. Anything that can be done in one complete programming language can be done in another. I really don’t understand syntax worship.

C compilers have gotten much better about this in recent years (meaning the last 10 to 20 years or so?) than they were previously, as the importance of compiler error-checking has come to be understood.

Earlier C compilers were notoriously lax about things like this. And earlier specifications of the language didn’t help. For example, early versions of C didn’t have the “void” keyword, and ints and pointers could be used interchangeably almost as recklessly as ints and booleans are used interchangeably today. Programmers used to copy pointers around with wild abandon without even bothering to cast them. At least today, by requiring an explicit cast, the programmer is required to at least think about what he is doing, which wasn’t formerly the case.
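A small sketch of the difference (the old behavior is shown commented out; a pre-ANSI compiler would have accepted it silently; intptr_t is the C99 integer type for holding a pointer value, where available):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    int x = 42;
    /* int *p = x;                  old style: implicit int-to-pointer, now a constraint violation */
    intptr_t addr = (intptr_t)&x;   /* modern style: the cast makes the conversion deliberate */
    printf("%p stored as %jd\n", (void *)&x, (intmax_t)addr);
    return 0;
}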

By the way, Sicks Ate, this happens every single time!

Somebody starts a thread with some more-or-less simple programming question, and the thread goes off the deep end like this one has. Like the annual flooding of the Nile, it happens every single time! We’ve had a bunch of threads go like this!

:smiley:

Spoken like a true Assembly Language Programmer! In fact, spoken like a true Brainf*ck Programmer!

(Now there’s a language the world really needed! Although I recall we discussed a substantially similar hypothetical language in class way back circa 1970, long before Brainf*ck’s appearance in 1993!)

ETA: The above-cited Wiki page even shows a set of C #define macros that will cross-compile Brainf*ck programs into C, which you can then compile, as well as a sample ROT13 program!

It’s great! Keep it going…it’s on its way to being one of my more successful threads, which is funny because I was expecting it to be over after about post #4.

I have started looking up some things so I don’t have to ‘Grrrrwhatretheytalkinbout’.