Learning Java Programming

For a while it was popular to use Pascal to teach programming, probably because its inventor, Niklaus Wirth, designed it for exactly that. Then I think colleges switched to C++, and some use Java now. I was not a CS major, but 30 years ago they used PL/I to teach the CS majors.

I vote for Visual Basic. At least I find it fairly intuitive, and it conveys the basic ideas of OOP.

Its main virtue to me is that you can get something done. It’s not elegant or fast, but it works.

And I’m talking about VB6, the pre-.NET version. The user-friendliness took a big step backwards with .NET.

This is my opinion only - I know there are others who would strongly disagree.

I haven’t programmed in VB for well over a decade. What did the .NET version introduce that made it less friendly? (I’m currently a big fan of C#.)

From what I understand, VB.NET is basically C# with a BASIC-like syntax. YMMV, I guess.

I couldn’t disagree more. C and especially assembler have way too many ways to shoot yourself in the foot in hard-to-debug ways. That’s a very bad property for a teaching language to have, as new programmers are always going to trip over those things.

That’s called “learning.”

In VB6:

    Private Sub Text1_KeyUp(KeyCode As Integer, Shift As Integer)

The same thing in VB.NET (VB8):

    Private Sub Text1_KeyUp(ByVal eventSender As System.Object, ByVal eventArgs As System.Windows.Forms.KeyEventArgs) Handles Text1.KeyUp

Everything is longer: more words, and less of the information I actually need.

Maybe there’s a way to shorten the syntax, but it’s certainly not the default.

Keep it simple…get the job done.

Yes, but you’ll find that people learn a lot quicker when their program catches what they did wrong and gives a useful error message, rather than crashing randomly or worse, producing incorrect results. There’s no good reason to inflict a language that doesn’t check array bounds, for instance, on new programmers.
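
To make that concrete, here’s a minimal Java sketch (the class and variable names are invented for the example) of what bounds checking buys a beginner:

    public class Bounds {
        public static void main(String[] args) {
            int[] scores = new int[3];  // valid indexes are 0, 1 and 2
            scores[3] = 42;             // Java stops right here with a clear
                                        // ArrayIndexOutOfBoundsException; the same
                                        // mistake in C silently scribbles over
                                        // whatever memory happens to sit next door
        }
    }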

That’s not a very good rebuttal; you could say the same about starting physics with quantum mechanics. The idea is to keep people motivated and to be efficient in the order you present material, etc. Peel the onion; don’t start by making a working onion out of pure protein.

I agree with those who say start with something basic. It gives you a foundation (in many different areas) that helps when you learn assembler.

I recommend Thinking in Java to anyone who will listen. Bruce Eckel is a great writer and manages to teach good design without getting lost in the details of a particular API.

It’s interesting that Eckel actually hates Java but still managed to write a fantastic book teaching it. If Eckel had his way, you would learn Python instead. FWIW, despite his fame in C++ circles, he does not like C++ any more.

I would not recommend that a beginner start with Java either. Ruby or Python are much better choices. Apart from being dramatically simpler, they inculcate good design habits. Java (and C# and C and C++ and Pascal and Basic) often get in the way of good design.

I taught myself programming with Basic a long time ago but would not recommend it to anyone. It’s an ugly language even in its modern VB.Net form and imparts terrible habits that are hard to shake off.

I made a living writing C for several years, but I wouldn’t recommend it to anyone. It’s akin to recommending that someone forgo power tools in favour of a hammer and chisel because that’s what they used in the old days.

These days, I use Ruby for labours of love and C# for labours of money. I agree that C# is a little better than Java, but it’s not enough better to advocate picking it over Java. Choose the one that best solves the problem you need to solve. It’s not like they are too hard to pick up. People learn them every day.

If you don’t have an immediate problem to solve and are just learning for your own betterment, learn Ruby. Whichever language you learn, it’s easy to learn the second language. Most software craftsmen recommend that you learn a new language every year so that you are exposed to new ideas. Most software craftsmen are competent (if not fluent) in dozens of languages.

NITPICK: C# is no more proprietary than Java (although most of the .NET libraries are).

I’m curious about the bad design habits you see re: Java, C, C++, Pascal and VB. (Not that I necessarily agree or disagree, just curious about your opinion).

I’ll just give my opinion FWIW.

Pascal is hard to fault WRT design, except that it’s too basic; it misses OO and useful (IIRC) functional constructs. Standard Pascal seems to me to be stuck in the early 80s. Not as basic as C, but not all that much better either. In the industry it’s always been a latecomer, and today it’s probably more or less dead as far as jobs go.

C is a good language to know, but/because it shoves all the difficult problems off to the programmer. IMHO this, plus its portability and performance characteristics, makes it a very good language to know, but I would probably recommend it as a 2nd or 3rd language. Certainly not as the first.

C++ is a C dialect with OO and many powerful abstractions added. Given that, it’s ironic that it’s actually a lot harder to do well than C. There’s just too much rope. Great for certain applications where speed and power of abstraction are both extremely important (graphically intensive apps and games tend to be programmed in C++ these days), but really you should probably learn both C and some more forgiving OO capable language first.

VB (not VB.NET) I hardly know. It looks like a pretty standard modern interpreted language, except that it has a lot of BASIC cruft. I would recommend either Python or Ruby instead.

Python seems to be pretty rigidly designed for a dynamic language, with an emphasis on OO (without forcing you to use OO everywhere), and outfits like Google and certain educational institutions seem to prefer it to most of its competitors. It never really clicked for me, but I’ve not heard any really relevant negative stories about it either.

Ruby seems to be a free-form Python with more emphasis on a blend of OO and functional idioms. There’s a lot of interesting stuff going on in Ruby, and also a tendency to abuse (IMHO) the OO model to do things that OO programming just shouldn’t do. Personally I prefer Ruby over Python, but if I had to use just one of this list, I wouldn’t mind if it was Python.

Java is like Python in that it enforces a standard idiom. In Java’s case, the standard is “traditional” class-based OO (like C++, but without templates and with a lot more safeguards). In my opinion traditional OO is just too simplified to really reduce common problems by much. Java (and C++) “patterns” are a good example of problems that are usually much easier solved by using first-class functions/closures like the ones available in Ruby, Python (IIRC), Perl, most Lisps and (again IIRC) C#. Java also suffers somewhat from having a huge standard library, parts of which are obviously slammed together without too much thought. This IMHO shows that Java’s claims about the simplicity of its programming model are just incorrect.
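
As a rough sketch of the patterns point (all the names here are invented for the example): the classic Java Strategy pattern needs an interface plus a class just to hand one behaviour to a method, where a language with first-class functions passes the behaviour directly.

    // One interface and one class, all to pass a single behaviour around.
    interface Discount {
        double apply(double price);
    }

    class HalfOff implements Discount {
        public double apply(double price) { return price * 0.5; }
    }

    public class Checkout {
        static double total(double price, Discount d) {
            return d.apply(price);
        }

        public static void main(String[] args) {
            // Ruby says the same thing inline: total(100) { |p| p * 0.5 }
            System.out.println(total(100.0, new HalfOff())); // prints 50.0
        }
    }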

Wow, I didn’t think I would get this many replies. Out of curiosity, does it get easier to pick up other programming languages once you master one? I’ve heard that it gets easier with spoken languages, but I’m not sure about computer languages.

For those who picked up languages on your own: was it harder or easier to learn without instructors? I ask because I’m taking Java as a class. The general consensus here seems to be that Java is not the best way to start. As far as I know, there aren’t any classes that offer easier languages like Python or Ruby; I would have to go to a community college for that, and it would hinder me a bit.

Yes, it gets far easier. That’s because so many are so similar. Which is why, to really be a top-notch programmer, you shouldn’t necessarily worry so much about just picking up lots of similar languages, which is trivial, but rather about exposing yourself to lots of different paradigms and ways of thinking about programming, preventing you from calcifying in just the particular approaches you’re used to. It’s a little harder but not so much, and it’s well worth it.

And while Java may not be the academically best starting point, it’s also not irredeemably terrible; if you’re already taking it as a class and have no other options, then keep taking it. You’ll be fine; everyone starts somewhere, almost no one ideally. Just expose yourself to other languages afterwards; like I said, in the ways they’re similar, they’ll be easy to pick up, and in the ways they’re different, they’ll help rid you of whatever undesirable biases the initial Java education instilled.

Definitely easier to pick up additional languages.

Most of my languages were self-taught (BASIC then assembly, etc.). When I have a person I can just ask to clarify something, it goes much, much faster. I would vote for instructor.

I wouldn’t really worry too much about whether Java is the optimal choice or not, if that’s what they are offering then that’s what you’re going to learn on. If you are motivated, you will learn. And it just takes time and practice.

Superfluous Parentheses, that sounds like a fair assessment. I definitely have my opinions on languages, but I’ve noticed that even the crappiest languages I have used (RPG) have redeeming qualities. It’s certainly not a “one size fits all” situation.

It does get easier. And because different languages stress different topics, you also tend to get better at most languages you actually use when you learn a new one.

I’ve learned a few languages with instructors, but mostly I learned by reading good texts and trying hard to build stuff. A good programming language instructor (and a good intro book) will give you an overview of the language “way” and its core structure, and then give you enough knowledge to learn more yourself, using reference material - which is generally freely available online. Learning to write programs (i.e. how to structure code and how to pin down solutions to problems) is different, and I think that’s where learning and using many languages gets really helpful. Different languages have different design philosophies and different levels of abstraction and learning a bunch of them really helps you appreciate the pros and cons of each strategy.

Learning one or two languages that have large open-source ecosystems will probably help too, since those contributions are usually not as strictly adherent to the core language, and you get some extra insight from that (plus, releasing open-source libraries encourages you to try to package your smaller or larger pieces of code in a genuinely reusable format).

On a different topic: does anyone have advice on Discrete Mathematics? From friends, I’ve heard it is a very ugly area of mathematics and pretty difficult. I tried it a little bit and didn’t understand it too well; the De Morgan laws confused me. I also don’t understand what algorithms are. My teacher said an algorithm is instructions for how the program runs, but does that extend to mathematical formulas for how a program should decrypt things too?

“Algorithm” is just a fancy word for “program”, “unambiguous instructions”, etc. Programming is writing algorithms. But you don’t need a computer to talk about algorithms; for example, someone taught you, way back when, rules for how to mechanically carry out long division. That was an example of an algorithm. I might teach you rules for how to use a certain machine, rules which unambiguously and specifically tell you what to do in response to everything that might come up, leaving you no choice in the matter, so that you just robotically follow the rules. That would be an algorithm.
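
To tie that to code: here’s the grade-school long-division procedure written out as a Java method (a sketch; the names are made up), just to show that an algorithm is the same thing on paper and in a program:

    public class LongDivision {
        // Digit-by-digit long division, exactly as taught in school:
        // bring down a digit, divide, carry the remainder to the next step.
        static int divide(int dividend, int divisor) {
            int quotient = 0, remainder = 0;
            for (char digit : Integer.toString(dividend).toCharArray()) {
                remainder = remainder * 10 + (digit - '0'); // "bring down" a digit
                quotient = quotient * 10 + remainder / divisor;
                remainder = remainder % divisor;
            }
            return quotient;
        }

        public static void main(String[] args) {
            System.out.println(divide(1234, 7)); // prints 176 (the remainder, 2, is discarded)
        }
    }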

Discrete mathematics, in a very broad sense, covers the parts of math that you can actually model easily in computers. Computers cannot (easily, or at all, depending on the subject) model the real world or abstract mathematics with perfect accuracy.

Very simplified example: say you’ve got some natural analogue system providing a voltage potential across a pair of wires (this may be a simple chemical battery cell). Over time, the potential difference decreases. Using a digital mechanism (i.e. a computer), you cannot get infinite precision in the amount of voltage, and the time between measurements can’t be infinitely small either.

You have to break both measurements up into finite steps. This is both a problem and a help. Some problems are actually easier to solve with finite measurements; others (the more interesting ones, apparently) are not. In any case, whatever mechanism you can propose, we - as a species, given everything our experience can provide - cannot get infinitely precise measurements anyway, no matter how hard we’d like to; AFAIK it’s just not possible, though that’s more a philosophical position.
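
A toy sketch of what that looks like in code (the decay model and the 0.1 V resolution are invented for the example): a program samples the voltage at fixed time steps and rounds each reading to a finite precision.

    public class Sampling {
        public static void main(String[] args) {
            double v0 = 9.0;            // starting voltage (invented)
            double decayPerStep = 0.05; // exponential decay rate (invented)
            for (int step = 0; step <= 5; step++) {
                double exact = v0 * Math.exp(-decayPerStep * step);
                double measured = Math.round(exact * 10) / 10.0; // 0.1 V steps
                System.out.printf("t=%ds exact=%.4f V measured=%.1f V%n",
                                  step, exact, measured);
            }
        }
    }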

All that said, De Morgan’s laws, which I just learned about since you brought them up, appear to be “just” laws following from the postulates of basic propositional logic. They don’t really have any more relevance to discrete math than 2 * 2 = 4.
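
If it helps, De Morgan’s laws are easy to check mechanically; this little Java program (just an illustration) tries every combination of two booleans:

    public class DeMorgan {
        public static void main(String[] args) {
            boolean[] values = { false, true };
            for (boolean a : values)
                for (boolean b : values)
                    // De Morgan: negating an AND gives an OR of negations, and vice versa
                    System.out.printf("a=%b b=%b  !(a&&b)==(!a||!b): %b  !(a||b)==(!a&&!b): %b%n",
                                      a, b, !(a && b) == (!a || !b), !(a || b) == (!a && !b));
        }
    }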

ETA: The best analogy for “algorithm” is “recipe”. But algorithms are (or should be) completely defined, so that no ambiguity is possible. This also means that algorithms aren’t really possible outside mathematics and certain parts of computer science/programming.