What's the best language to learn programming with?

I’ve seen that people who learn OO programming early on do much better with it than people who had a lot of experience before its emergence. That doesn’t mean they understand OO, or any other aspect of programming, any better, though.

Jragon, I’m curious: what do you think the value of strong typing is? It’s very easy to explain that the string “123” and the number 123 may have different representations and so cannot always be directly compared. Is that really a reason to use one language over another?

I agree with your recommendation of Pascal as a good teaching language.
Your analogy between ‘C’ and Latin is quite useful and I will be stealing it in the future.
:slight_smile:

It teaches good thought habits.
Your design should be typed: in general, you want to add apples to apples and oranges to oranges.
Pascal enforces what is at most a user convention in C; C allows one to freely mix fruit, vegetables, meat, cheese, and anything else.
C doesn’t complain about types, and it usually doesn’t get things right unless the user specifies the conversion explicitly.

I would also suggest that a coder in training would be well-served to learn object oriented programming earlier rather than later. I can’t tell you how many well-seasoned and competent coders have let their procedural backgrounds creep into object oriented code. Even I <gasp> have been guilty of doing this. But I still ‘think’ in C and quite a bit of my pseudocode is written in a stripped-down shorthand version of that wonderful old language.

I don’t have a huge bias against dynamic typing; my problem is with automagic type comparison and operation. Okay, so “123” == 123, great, but is “123”+123 defined? What’s the result? String or int? Is it the same as “123”+“1”, or is that “1231”? If “123”+123 = “246” but 123+“123” = 246, then suddenly addition is no longer commutative, because “123”+123+“123” = “246123” and 123+“123”+123 = 369. Maybe for an int and an int-string it always becomes an int? It’s a lot more confusing than the straightforward “these are different”. I don’t mind float+int, because at least they’re both mathematical types, but with anything else it gets wonky.

Though we may be arguing different things, I’m not against the language letting you compare “123” and 123; I just prefer it in a learning language if it returns false every time. That makes it absolutely clear that everything is just numbers, and that the computer can’t magically infer what you meant to do just because you knew what you meant. I believe that as a beginner you should learn that AddTheseNumbers(a,b) and FXYinnfasdlkfjGDSGJ(a,b) are the same, and that 123 and “123” are different because one’s (probably) a pointer and one’s a number, and those two numbers aren’t the same.

And then, after that initial hurdle (which shouldn’t take very long), you should be slowly shown all the nice syntactic sugar like “123”==123. Then, when you try to compare or manipulate apples and oranges and things do go wrong (and they will, eventually), you have a frame of reference for understanding that in this case the computer is being its normal dumb self and the language creator didn’t think to implement that feature, rather than flailing about trying to figure out why in this case the computer didn’t automagically divine what you meant, even though it’s analogous to a case where it did.

And I’m referring mostly to things like “==” and “+”. I wouldn’t have any problems letting beginners have dynamically typed tuples or typeless lists.

ETA: And yes, I know that in Python and Lua 123 != “123”, so they’re cool :). There are languages where 123 == “123” is true, though. I should have been a bit clearer about that.
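
For example, here’s roughly how Python behaves (a quick illustrative sketch, nothing more):

# Strings and ints never compare equal, and mixing them with "+" is an error
# rather than a silent coercion.
print("123" == 123)       # False - no automagic conversion
print("123" + "1")        # 1231 - string + string is concatenation

try:
    print("123" + 123)    # str + int has no defined meaning
except TypeError as err:
    print("TypeError:", err)

print(int("123") + 123)   # 246 - the numeric sum, but only when you ask for it explicitly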

Yes, that’s what PHP does, and it works neatly. In a numeric context, a string is always interpreted as numeric. There’s really only one situation where that causes a problem with consistency, and that’s because PHP’s boolean type is not numeric:

“a” == 0; // true
“a” == true; // true
0 == true; // false

There’s a strict comparison operator === and various type conversion functions available as needed. And of course good coding standards dictate when they ought to be used.

PHP is not a good example of a well thought out language. But it does demonstrate that automagic type conversion can work fine without causing major issues.

In the bad old days, I’d have said BASIC (actually, I’d have said Logo, then BASIC). Nowadays I’d say Python. Not for any highfalutin CompSci reason; I just like Python. It’s simple but has a large library set, and there are lots of open-source projects to learn from. GitHub is your friend.

You think Hello World in C is too long and complicated for teaching to a beginner?

Try writing Hello World in COBOL.

I think the solution to the “magic” problem is to just live with it. Show your students the program, tell them to type it in exactly as is, all the magic words included, and promise that you’ll get around to explaining each one in due time. Then, for starters, just focus on the one line that actually does something.

In contrast, I had an utterly awful textbook once, the authors of which felt that they just couldn’t have the student writing a function call like printf("Hello World\n"); without fully explaining all about functions. So the book started with teaching functions, which was way over the head of any beginner at that stage.

I had a friend back in college days, a CS grad student, who taught FORTRAN classes (that being the dominant language of the day). He took that approach with the dreaded FORMAT statements. For the first several weeks of the quarter, he simply hand-fed them the FORMAT statements to use in their programs, promising that the fateful day would arrive soon enough that he would actually teach them about it.

I think PHP is sort of an abomination to teach as a first language. It’s kind of a crappy language altogether. The syntax and collection of functions seem to be a rather thrown-together ad-hoc hodge-podge. My book has pages of tables listing all the special rules about what data types and values you can compare to what, and what the results will be.

You’re right that it’s good to have a language that doesn’t hide too much, for a beginner. As you know, I’ve railed about languages that hide too much, especially about how recursion works. Even C keeps that kinda-sorta mysterious. You have to teach your student about stacks and local variables. Well, you have to teach about that anyway.

I first learned recursion in assembly language – and on a CDC 6400, which didn’t have any native support for stacks or stack operations! That’s how you learn how things really get done! Maybe beginners should be learning assembly language! And maybe even preferably on an old-architecture machine.
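
For anyone teaching it today without a CDC 6400 handy, the same point can be made visible in a few lines of Python (just an illustrative sketch): each call gets its own copy of the locals, pushed onto the stack and unwound in reverse order.

def countdown(n, depth=0):
    # Each call has its own n and depth; the indentation shows the stack growing.
    print("  " * depth + "entering, n = " + str(n))
    if n > 0:
        countdown(n - 1, depth + 1)
    print("  " * depth + "returning, n = " + str(n))

countdown(3)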

Pascal was the worst abomination of a programming language ever to defile the face of an IBM punched card, and especially so for teaching beginners. Don’t even get me started. Kernighan pretty much covered the subject anyway with his classic essay Why Pascal is Not My Favorite Programming Language.

I was surrounded by Pascal snobs at Berkeley back in the Pascal heyday. They claimed it was a teaching language, but once they graduated, they couldn’t program 2 + 2 in any other language and refused to even try, being fully brainwashed to believe that Pascal was His Holiness Wirth’s godly gift to the world, the majesty of which no other language could begin to approach. So they went out into the Real World, taking Pascal with them, each writing their own Pascal compilers with their own extensions to the language, to cover for all the glaring defects in the language, so you had a proliferation of dialects, no two of them compatible or cross compilable.

But those guys could sure bad-mouth FORTRAN and any other language that someone might have tried to use.

Pascal set back and retarded the development of Computer Science by 500 years.

(ETA: And before Pascal came along, I thought Algol was bad.)

This can be another debate someday. I’ll just say that you’re showing your personal preferences here. There’s nothing special to learn from static typing. Every language has its own rules for mixed types, and none are better than the others.

I think you’re wrong about “none are better than the others”. Jragon is right that PHP-style rules are a fetid breeding ground for programming errors, they are so loose. Pascal’s mixed-type rules were totally anal-retentive and way too rigid, to the point that they made the language actually unusable in some ways. (See the Kernighan essay I linked, above.) Algol, C, and FORTRAN had rules that were within a reasonable range. In particular, they had no automatic conversion between strings and numeric types. (ETA: What the hell, Algol and FORTRAN didn’t even have strings, or they had only some rudimentary form of strings that you couldn’t do much with!)

You could have some automatic string / numeric conversions, as long as the language doesn’t get too wildly loose about it, as PHP does. SNOBOL seemed to have reasonable rules.

One useful fact about PHP and SNOBOL is that the string-concatenation operator is NOT the same symbol as the numeric addition operator, so a certain class of ambiguities is avoided.

ETA: I think it’s in comparing string values with numeric values that PHP gets especially weird and awkward.

I think the point, as far as a learning language is concerned, is that strong typing is very useful in that you’re not spending a lot of time and effort chasing down whatever weird implicit conversion is making your program act strange, when the problem at hand isn’t typing and conversions but rather linked lists, strings, arrays, or something along those lines.

Like he said, it may let you do “123” + 123, but do you know what that result will be if you’re a rank novice?

Better to make you convert “123” to an integer before you add, so that there’s not any ambiguity that would confuse you.
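
Something like this, to put it in Python terms (a hedged sketch, just to show the idea):

# The conversion is spelled out, so a rank novice can see exactly what will happen.
as_number = int("123") + 123     # 246: numeric addition
as_text = "123" + str(123)       # "123123": string concatenation
print(as_number, as_text)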

For professionals, I don’t think it really matters; I find it annoying both ways. I don’t always like having to convert explicitly, but I don’t like tracking down unexpected implicit conversions either.

I understand what you’re saying. But it’s a minor detail that really doesn’t materialize as an important issue in the end, and tracking down those problems is part of the learning process, if they are there. I use a dynamically typed language all the time, and there’s no issue at all in the learning process; it’s totally unseen, and there are no anomalies. A programmer will have to learn the details of typing as it applies across languages at some point, but IMO it doesn’t really affect the learning process to work with any of the variety of typing methodologies.

I bring this up because I believe the dynamic languages which operate easily with a command-line interface are the best for the learning process, allowing the student to interact quickly without dealing with intricate structural syntax and compilation-related quirks. I think the most important aspect of learning to program is developing a fluency in the expression syntax of a language. It’s like learning to say “Computer” before any command given to a Starship computer. Beginners will get stuck saying “How many Klingons are on the ship?” over and over, and without saying “Computer” first it won’t respond. Sometimes, when switching systems, even experienced users will make a mistake like trying to talk into the mouse. In more serious terms, little things like learning the difference between “=” and “==” are more important than worrying about what kind of typing issues come up.
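
A first session at a Python prompt, for instance, makes that “=” versus “==” lesson cheap to learn (hypothetical transcript):

>>> x = 5       # "=" stores a value in x
>>> x == 5      # "==" asks a question and answers True or False
True
>>> x == 6
False
>>> x + 1       # the result comes back immediately, no compile step
6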

Command-line interface, my ass. Back in the day, you wrote your entire program, or at least substantial portions of it, all ahead of time, on paper, first. Then you sat at a keypunch machine and punched your program onto a deck of cards, and likewise any data. Then you handed your deck of cards and data to the computer operator (or more likely, put your deck into a box on the counter in the input room).

Then you went off and found something else to do for six hours, or maybe until the following day. Then you went to the output room to retrieve your deck and your printed output. Then you got to see all your compiler errors. Repunch erroneous cards. Take deck to input room. Lather, rinse, and repeat.

You might only get one or two runs a day. So you made the most of each run. You not only wrote out the whole program on paper, but you also “desk checked” it as thoroughly as possible before submitting it. You could get a printed listing of the deck from the tabulating machine, which you carefully proofread for errors. You sat and traced through all the paths of the program, following the values of variables on paper, to verify your logic.

Interactive programming ruined all that. Now programmers sit at a live terminal, typing, testing, and debugging a line at a time. It gives programmers tunnel vision: they focus too much on each line in isolation and less on the flow and logic of the whole program. Programs end up being a thrown-together hodge-podge, and often it shows.

That’s the way I learned. I also used to have to get up and walk across the room to change the channel on the TV set. Programmers need the skills of focusing not only line by line, but one expression element at a time. IMO this is where it all starts. Without that more basic level of understanding, they end up coding nonsensical structures, just like the old-style programmers did when they had to learn more structured languages.

ETA: If you want to outdo me in the old fart category you’ll have to tell me about your patch-board programming experiences.

There’s a big perception that programmers who learn OOP as an afterthought have a great deal of difficulty thinking in an OOP mindset. While they can write classes, they don’t think in terms of encapsulating a business idea into a class; rather, they decide what specific operations to do and then write dozens of utility classes with methods. Instead of creating a class called Employee with properties called IDNumber, FirstName, LastName, HireDate, and Salary, and methods called Load, Save, and ProcessPaycheck, and then processing paychecks by iterating over the EmployeeCollection and, FOR EACH Employee currentEmployee, calling currentEmployee.ProcessPaycheck(PayPeriod currentPeriod), they iterate over a loosely typed array of strings that supposedly hold EmployeeIds and call PayrollModule.ProcessPaycheckFor(string EmployeeId, int payPeriodFormattedAsInt). If an employee changes their name, instead of Loading the applicable Employee object, changing the value of the LastName (or other name) field(s), and then calling Save, they call EmployeePersonalInformationManager.DoNameChange(string EmployeeId, string newLastName…)
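
Roughly, the contrast looks like this (a minimal Python sketch; the class and function names are only illustrative, not anyone’s actual payroll system):

# Object-oriented habit: the Employee carries its own data and behavior.
class Employee:
    def __init__(self, id_number, first_name, last_name, hire_date, salary):
        self.id_number = id_number
        self.first_name = first_name
        self.last_name = last_name
        self.hire_date = hire_date
        self.salary = salary

    def process_paycheck(self, pay_period):
        print(f"Paying {self.first_name} {self.last_name} for {pay_period}")

employees = [
    Employee(1, "Ada", "Lovelace", "1843-01-01", 100000),
    Employee(2, "Grace", "Hopper", "1952-05-01", 110000),
]
for employee in employees:
    employee.process_paycheck("2024-06")

# Procedural habit dressed up in classes: utility functions shuffling raw IDs around.
def process_paycheck_for(employee_id, pay_period_as_int):
    ...  # look up the record by string ID, reformat the period, do everything over here

for employee_id in ["1", "2"]:
    process_paycheck_for(employee_id, 202406)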

^^^ I agree that learning to use the object model, rather than simply encapsulating data, was a stumbling block for me.
But it’s because nobody was really saying it that way. I hope that’s changed by now.

I’ve got an EPROM programmer sitting right here that uses toggle switches.

EPROMs? You kids are so funny. We didn’t need no stinkin’ EPROMs, we had core.

My high school had a project the physics/computer/electronics teacher had built from industry donations. It had plug-in logic cards in the rear cage (about 3x5-card sized; each was a flip-flop, a pair of gates, etc. - about what went into the first generation of general logic chips). They were wired to a patch panel that you programmed with plug-in leads and then pivoted into contact with a whole square of flexible contacts. Switches and lights up top gave you I/O. It had about the same capability and range as the biggest digital trainer boards of the early 1980s - you could build counters, scalers, and even simple games out of it, besides just making all the digitals do their dances.

I mastered all the principles of digital logic on that exceedingly well-built trainer, including a lot of troubleshooting, basic maintenance and even repairing the cards when there were no more spares of that type. It was a while before I moved to chips, which were just coming into my price range of a few dollars each.

I’d still have it but for a slight miscommunication. I came in to find the teacher hauling out the dismantled unit, and he took one look at my face and said, “Oh… you DID want it.” Too late; the first load was already gone to the scrap metal recycler.

Sigh. I think I still have a couple of the cards and patch cords here somewhere.

I haven’t thought about desk checking for years. This was why programmers stayed up all night - the queue was shorter at 1 am. When I TA-ed PDP-11 assembly language programming, all the best students were there with us in the middle of the night. The really best ones brought Southern Comfort for the TAs.

However, I disagree about the line-at-a-time paradigm. Structured languages and OO languages force you to think in blocks at least, not a line at a time. Interactive environments are great for not having to worry about typos and for doing work in natural chunks, but anyone who tries to code a line at a time will run into trouble.
When we taught assembler the first program we assigned was a simple formatter in Pascal, which we graded very harshly. (And threw out the grade later, though they didn’t know that.) The reason was to force the kids to get ready to write structured assembly code, which we found was much easier to debug.
I do most of my work now in Perl (which I would not recommend as a first language) but I always write out what I’m doing in English and pseudocode first. I find lots of logic errors early that way.