Yeah, applets were … unfortunate. And they have shaped a lot of people’s view of the Java platform for years. Not people who actually code in Java, mind you, but other people.
How does a person remember and learn how to use every statement, method, class, library, OS of any t
Yeah, second that. Also, the fact that an entirely unrelated scripting language was called “JavaScript”.
BTW, two applications you may have installed Java-based clients for:
Dropbox
CrashPlan
Plus, where I learned Java was in doing servers and server-side stuff. IMO, that’s where it is ideal. And as a user, you aren’t really aware of that.
The language has always had an “enterprise” mindset, too, which tends to scare people off. You try to learn it, and run into a lot of material that’s really “best practices”, standards and packaging for large-scale enterprise development. Valuable and necessary for that sort of development, but not really part of the language.
Just to underscore how clueless this rant is, every single app for Android is written in Java.
This is an example of what is called subsetting. Almost everyone who uses a language is comfortable with a subset of its capabilities. If you came to it from another language you often find the subset that lets you write in your old language effectively, and then start adding stuff to make use of capabilities of the new one.
Yeah, you can always look it up, but you only do that when writing the new section of code is such a pain that you hope there is some better way.
Subsetting happens in lots of places beyond programming. I know of EDA tool users who went through all kinds of contortions to do stuff that could be accomplished by a single, obscure command. Many of us use only a subset of the features on our phones. I read an article on usability that said the main requests IBM got for new laptop features were all for things already implemented but hidden under layers of menus.
According to this page Java is the #2 language after C (not C++). Maybe you were being sarcastic?
No. I said serious application. How many word processors, spreadsheets, databases, web browsers are Java based?
I know plenty of packages have a Java plug-in but don’t require it for core operation.
I and just about anyone else can go through their entire life never using Java except in minor applications like Android Apps. They may be common but they are simply not core computing products.
Just to amplify my comment, here are some of the major applications that aren’t written in Java, for very good reasons:
Apache Web Browser
MS Internet Server (IIS)
MS SQL Server
MySQL
MS Access
MS Visual Basic
All of the .net crap
Microsoft Word/Excel/ etc
Open Office
Sendmail
Exchange Mail Server
Postgres
GIMP
Linux
Windows
i.e. just about the entire Internet
All the Javanuts can come up with is minor apps that run on Android and some relatively obscure and certainly inefficient services that run on the back of C++ front ends.
In terms of security, Java sucks. There is a new zero-day exploit almost every day. This is why, on Windows at least, it constantly needs to be upgraded to fix these - usually every few days.
Java is Inefficient, bug ridden playware for mediocre programmers doing not very clever things.
Hell! Even Visual Basic runs rings around it!
I don’t know what “Apache Web Browser” is; I assume you’re talking about Apache httpd. Since you brought up the Apache project, let’s look at some of their offerings.
[ul]
[li]Apache ACE - Java[/li]
[li]Apache Abdera - Java[/li]
[li]Apache Accumulo - Java[/li]
[li]Apache ActiveMQ - Java, C, C++, Ruby, Perl, Python, PHP, C#[/li]
[li]Apache Airavata - Java[/li]
[li]Apache Ambari - Java, Python, JavaScript[/li]
[/ul]
Shit, that’s just from the “A”'s, and it’s not even nearly all of them. Why don’t you go look at the ASF project index yourself and report back to us.
If by “minor apps” you mean “every app, for the most widely-used mobile platform,” then you would be correct.
Really, this just displays a significant amount of ignorance about the history and nature of the various ecosystems.
Microsoft prefers its home grown language C# for new work. Back in the day Word was written in assembler. Apple uses its Objective-C for almost everything. GNU/Linux tends to use C++. Java is Sun/Oracle.
Interestingly, all bar one of these are high-level object-oriented languages.
C++ is the odd man out. It is a hack that presents some vague semblance of OO built on top of C. As a language it is an abomination that should have been strangled at birth. It lacks even the most basic features one would expect, things like automatic storage management, type safety, reflection, and the like. It has compile time hacks that attempt to create a 90% solution that a lot of programmers mistake for the real thing - that is until the inherent limitations come to bite them. Tragic bolts on like the template system are no substitute for the real thing. About all they do is provide desperately difficult to track down compilation errors. C++ was a bad idea, and it hasn’t got any better.
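To make the “reflection” complaint concrete for anyone following along, here’s a minimal Java sketch of the kind of runtime introspection being referred to - purely an illustration of the feature, not code from any real project:

[code]
import java.lang.reflect.Method;

public class ReflectionSketch {
    public static void main(String[] args) throws Exception {
        // Ask an object at runtime what it can do - the sort of built-in
        // introspection the post says C++ lacks.
        Object target = "hello";
        for (Method m : target.getClass().getMethods()) {
            System.out.println(m.getName());
        }
        // Look a method up by name and invoke it, instead of wiring the
        // call in at compile time.
        Method toUpper = target.getClass().getMethod("toUpperCase");
        System.out.println(toUpper.invoke(target)); // prints HELLO
    }
}
[/code]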
I code in C++ most days. I have written many tens of thousands of lines of code (if not hundreds of thousands) in it. Similarly, I have written huge amounts of code in many other languages: Fortran 95, Python, C, Java, Modula and Pascal. They all suck in their own ways, but some more so than others. C++ sucks the most.
If I was to return to teaching my choice would be to teach C and Python. Not C++, C. C has the advantage that what you see is what you get. It makes no pretence of being much more than a glorified assembler. But it allows people to get down to the metal, and see what the heck is really going on. Python allows you to write complex things fast, and with the availability of a massive number of highly useful libraries and bindings, you can do serious stuff fast. It makes no pretence about what it is. If you want the OO paradigm, you get it. It doesn’t hide how it works or tell lies. What you see is what you get here too.
I too have coded in almost all the languages you mention, and then some you haven’t, like Algol. My code output? Probably over 500K lines over the years. I was in the business long before C was even invented, hacking away in Fortran and assembler.
My specific area at the moment is real-time high-reliability systems - somewhat difficult on Linux and Windows for the real-time part, but I manage.
I would never in a screaming fit consider Java for that application. Nor, in fact, C, for different reasons. And yes, I’ve coded in both, though much more in C, including writing an RTOS in it.
C++ written in resource management style - like most Microsoft products are - is efficient, concise and thoroughly predictable. Java on the other hand is a hack that’s suited to low grade programmers at best and certainly not efficient nor predictable.
The Microsoft .net crap is just that. Their attempt at Java and an equal failure.
I’ve run large teams of programmers and can say with absolute certainty that the Java teams had woeful productivity despite having all the ‘aids’ provided by the language.
In terms of what can get the best result: anything hard iron has to be C++. Anything to do with interfaces to databases should be Visual Basic (yes, surprisingly, it’s a damned good product). Science? Forget Fortran. Use C++, or maybe MATLAB if you aren’t trying hard. C is useful only for interface definitions. Pascal and the like are an academic joke.
The plethora of scripting languages are really not very mature though obviously heavily used.
For the gent who picked up on my Apache typo: yes, it’s a server, not a browser. Right now I’m using it for a risk-of-life service using a C++ ‘CGI’ back-end to do fancy things with hardware.
And yes, there may well be Java extensions to Apache, but are they actually useful? I can guarantee the vast majority of sites, including probably StraightDope, don’t use them. The very brief experience I’ve had with Java web server extensions involved huge delays in starting up and appalling performance.
For all the flak it gets, Visual Basic.NET is surprisingly useful. One of its biggest advantages is that it is so loose that you can get away with a lot of stuff that would cause compile-time errors in many other languages. You can throw together an executive dashboard application that draws data from two SQL Server databases, an Oracle database, and an Excel spreadsheet, mashes it up into a few charts, and throws it up onto the screen or a webpage with surprisingly little effort. It’s also great for prototyping - throw up a few screens and buttons in an afternoon without spending lots of time defining a structure (but don’t dare try to use that prototype as version 0.1 of your final app - start from scratch! I’m serious!). You don’t need to worry as much about type conversions or other “details”. That advantage can quickly become a disadvantage if you are trying to write a complex enterprise application that is sending and receiving data from all over the place and needs to be robust. The very looseness that the platform tolerates can cause your application to blow up at runtime (crash) when it sees something unusual. Other platforms (e.g. C#, Java) wouldn’t even have allowed you to compile something that loose.
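Just to illustrate the strictness on the other side of that trade-off, here’s a tiny Java sketch (the variable names are made up): the loose assignment simply won’t compile, and the conversion has to be spelled out.

[code]
public class StrictnessDemo {
    public static void main(String[] args) {
        String ageField = "42";               // e.g. text pulled from a form or a spreadsheet cell
        // int age = ageField;                // rejected at compile time: incompatible types
        int age = Integer.parseInt(ageField); // the conversion must be explicit
        System.out.println(age + 1);          // prints 43
    }
}
[/code]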
There’s an old saying that Fortran programmers can write Fortran code in any language, and C programmers can write C code in any language. There’s a lot of truth to that. The classic Numerical Recipes series, for instance, consists entirely of Fortran code written in various languages, and I once met a fellow who wrote C++ code in the dinky little scripting language of an IRC client.
Expedia.com, Zillow.com, and TripAdvisor.com (to name just a few) are all major websites that use Java for their user-facing pages. Apache Tomcat is a pure-Java web server that literally serves thousands of pages at high volume. It is very common for new languages to be designed to run on the JVM (e.g. Scala and Clojure) instead of being stand-alone compilers. I’m not sure why you think Java is such a bad solution.
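For the curious, this is roughly what a page served by Tomcat looks like on the Java side - a minimal sketch, with the class name and the /hello path invented for illustration, using the servlet API (javax.servlet here; newer Tomcat versions have moved to jakarta.servlet):

[code]
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Tomcat maps the /hello URL to this class and calls doGet() for each
// request - no C++ front end involved.
@WebServlet("/hello")
public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><body><h1>Served by pure Java</h1></body></html>");
    }
}
[/code]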
Often that’s a bad thing. Sometimes it is good. I TAed a class teaching PDP-11 assembler, and the first thing we did was teach them Pascal (it was a long time ago) so that they’d write assembler like Pascal - in the sense of writing structured code. It worked pretty well.
I don’t quite understand all of the technical details, but what I understand is this:
Learning programming languages isn’t the main focus; programming is.
- Programming is preparation. (Beginning)
- Programming languages are the means/process. (Process)
- The program is the end result. (End)
The MOST important underlying concept to learn is the preparation stage, writing pseudocode, filling out a set of ordered instructions, picturing the process and the end result of the program…THAT is programming! THAT is the most important thing to know!
For example, if I wanted to build a treehouse, I need to do the following…
- I have to draw an outline of what it’s going to look like and divide it into smaller parts. Finding a good tree, making sure it stays in the tree, building the solid roof, the walls, the ladder, and anything else I want my treehouse to have - I must outline them all. (Programming)
- I must figure out how I’m going to build each part, writing out the process or set of steps to take in order to complete each part. I must find the tools that are useful for building each part and make sure I understand how to use them. If I don’t know, I use tutorials, books, and manuals as references. (Programming language)
- The treehouse is complete and all that’s left is to inspect it for any hazards or problems. Polish+paint to give it a finishing touch, and then it’s finally ready for use! (Program)
To me, programming is steps 1 and 2, while programming languages are step 3. You have the end result, the smaller parts, and the process written as a set of ordered instructions. Imagine that this outline is universal and can be carried out in ANY programming language. All you have to do is figure out WHICH language allows you to do these processes, to build and complete these smaller parts, and to build the program that you want to build!
Then learn the language that suits your needs! There is NO BEST LANGUAGE, because every language allows you to do different things. You must understand what each language allows you to do, or essentially…categorize them into programming areas. This will make it easier to make programs!
Example:
I want to make a project that has characteristics of T, Y, U? Learn Java, C++, or C#!
I want to make a project that has characteristics of G, H, J? Learn Python, Ruby, or PHP!
So when you got it all figured out, you learn the basics of the chosen programming language, and then use reference resources(books, tutorials, courses, manuals, etc) to learn how to use the code you NEED to use to build your program.
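As a rough illustration of that outline-first idea - the plan written as comments, the language just filling it in - here’s a small Java sketch; the treehouse parts and the helper method are made up for the example:

[code]
import java.util.Arrays;
import java.util.List;

public class OutlineFirst {
    public static void main(String[] args) {
        // Plan (pseudocode): for each part of the project,
        //   1. check whether we know how to build it
        //   2. if not, note that we need to look it up
        List<String> parts = Arrays.asList("platform", "roof", "walls", "ladder");
        for (String part : parts) {
            if (knowHowToBuild(part)) {
                System.out.println("Build the " + part);
            } else {
                System.out.println("Look up how to build the " + part);
            }
        }
    }

    // Hypothetical helper standing in for "do I understand this step yet?"
    private static boolean knowHowToBuild(String part) {
        return !part.equals("roof");
    }
}
[/code]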
Boy, I think I wrote too much, but you guys REALLY got my blood boiling! Is there something else I’m missing or not understanding correctly? Please don’t pull any punches, I want to know everything I can!
That’s a pretty good summary, Mr.Insaneo. I don’t know if I necessarily like the building-a-treehouse analogy, but you certainly seem to understand the basics of what’s involved.
C, not C++, and this is over-simplified to the point of being wrong anyway. The whole Unix world has always used multiple languages, with C (not C++) being a common thread among the various Unix platforms.
OK, sure, RAII is a nice idiom and sometimes it would be nice if C had it. However, it is more heavyweight than the equivalent in C, because it inherently requires the compiler to emit more code, and more complex code, than C compilers have to generate.
This is ignoring the value of the libraries written in Java, which allow working programs to be created in a fraction of the time and with a tiny amount of code compared to doing the same in C++.
BTW, once you realize memory leaks are a major source of inefficiency, Java has a big advantage over C++ in being garbage collected by default, whereas garbage collection is a bolted-on hack in the C++ world.
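A small sketch of what that looks like in practice, with an invented file name: the list below is never freed by hand, the collector reclaims it, and try-with-resources covers the non-memory cleanup that RAII handles in C++.

[code]
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class GcSketch {
    public static void main(String[] args) throws IOException {
        List<String> lines = new ArrayList<>();
        // try-with-resources: the reader is closed automatically, the Java
        // analogue of RAII for non-memory resources.
        try (BufferedReader in = new BufferedReader(new FileReader("notes.txt"))) {
            String line;
            while ((line = in.readLine()) != null) {
                lines.add(line);          // allocated, never freed by hand
            }
        }
        System.out.println(lines.size() + " lines read");
        // When 'lines' becomes unreachable, the garbage collector reclaims it;
        // there is no delete/free to forget, so that class of leak can't happen.
    }
}
[/code]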
C is the language to use to write systems programs, where stability and efficiency are both paramount and tight OS integration is the whole point. And, of course, hardware-level code (such as the majority of operating systems) is usually best written in C unless specific opcodes are required such that you’re forced to write in assembly.
I won’t argue with the need to prepare. “Program in haste, debug at leisure” is my motto. But you are leaving out something very important, namely data structures. Knowing the best data structure for a problem can transform a long multiday nightmare into a five minute breeze. (I’ve taught data structures, so I’m a bit biased.)
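One concrete (and made-up) example of that kind of transformation: finding a name in a list is a linear scan, while a HashMap gives you constant-time lookups without changing anything else about the program.

[code]
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class LookupDemo {
    public static void main(String[] args) {
        List<String> names = List.of("ada", "grace", "dennis");

        // Linear scan: every lookup walks the whole list in the worst case.
        boolean foundByScan = names.contains("grace");

        // Hash map: lookups take (amortized) constant time, however big it gets.
        Map<String, Integer> index = new HashMap<>();
        for (int i = 0; i < names.size(); i++) {
            index.put(names.get(i), i);
        }
        boolean foundByHash = index.containsKey("grace");

        System.out.println(foundByScan + " " + foundByHash); // true true
    }
}
[/code]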
Most of the comments from people here are about knowing several languages and moving between them. Even if you know a bunch of languages, you may not be free to move between them, even if another language is somewhat better for a specific job. Also, while language B might be better for a particular task, you might be so much more proficient in language A that you’ll be better off using it.
Even if you do, you can confine the machine-dependent stuff to small modules.
HLLs versus assembler for systems programming was controversial when I was an undergrad and my professors defended their decision to write Multics in PL/1. No more.