Why aren't languages like ML taking over from C++ in the IT industry?

It seems to me that C++ tries to be everything yet excels at nothing. Its grammar isn’t particularly friendly, especially with regard to polymorphic types and their definition; its compilation model is outdated; and the standard definition moves incredibly slowly, limiting innovation in the language (that, and the obsession with keeping backwards compatibility with C). Not that there is a [mainstream] compiler out there that implements the full standard, anyway.

Now, I know that languages like C# and Java are taking market share away from C++, but both of these languages are really just refinements on C++.

Why aren’t languages like ML, which offer some advantages over C++-derived languages, being considered? What will it take for a serious amount of market share to go to non-C++-derived languages? Will we still be using C++’s derivatives in 50 years’ time?

Is the paradigm shift too great? A lot of developers will have been in contact with a functional language during their CS degree. Would it take a big company to take the language and push it, like Sun did with Java and MS did with C#, for it to catch on?

Inertia. Money. Bottom line.

What advantages does ML offer to a business? What possible motivation would the VP of engineering at any company have to make this gigantic switchover? C++/Java do the jobs that need to be done; they are known quantities; I’ve already invested in the software (compilers, editors, etc.); I have a staff that knows the language; I have existing software in that language that needs to be maintained and enhanced. Switching to something completely different would require a huge investment in new software, new personnel and/or new training for existing personnel, and the ungodly headache of maintaining old software in one language while writing new software in the new one. Not to mention that during the time we’re making the switch (retraining, hiring, installing and learning new software, etc.), my competition is moving ahead with their “old-fashioned” C++/Java development and beating me to market. So I’ve lost doubly: I’ve spent a bunch of money to gear up on this new language, and I’ve also lost income because I didn’t get my next version or product out.

And what have I gained? Absolutely nothing.

That kind of change, to ML, is going to take a long time in the real world.

Unless something else comes along that the businessmen and engineers can agree on.

Reduced development costs and reduced type errors, for a start?
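To make that concrete, here’s a small sketch in OCaml (an ML dialect); safe_div is just an illustrative name, nothing standard. The compiler infers every type, and “no result” is part of the type itself ('a option), so there is no null to forget to check:

    (* A sketch of "reduced type errors": the compiler infers every type,
       and failure is encoded in the type rather than hiding in a null. *)
    let safe_div x y =
      if y = 0 then None else Some (x / y)

    let () =
      match safe_div 10 0 with
      | Some q -> Printf.printf "quotient: %d\n" q
      | None -> print_endline "division by zero handled"

Leave out the None branch and the compiler flags the non-exhaustive match before the program ever runs; the equivalent forgotten null check in C++ typically surfaces as a crash at runtime.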

The rest of your post implies that businesses never switch languages, which isn’t true. Every problem you stated could equally be applied to Java or C#. My question is why languages which aren’t derived from C++ aren’t being considered when they do have benefits over those which are.

You may be interested in the thread currently going in comp.lang.c++.moderated, “Why do you program in C++?”

They are. I know a lot of IT people (especially C++ programmers) scoff at the likes of Visual Basic and Active Server Pages, but VB (and ASP) account for a great deal of application development.

Why not? Folks who know “dead” languages like COBOL and FORTRAN are in high demand these days, because there aren’t a lot of folks left who can maintain decades-old legacy software in those languages.

C/C++ has hung on for as long as it has primarily due to legacy effects – there’s a big bumper crop of code already written in those languages that can be easily lifted and adapted to new needs. Java (which IMO is better for object-oriented development than C++) benefits from this as well, since porting C++ to Java is a relatively minor effort. A “new and improved” language that doesn’t leverage this backwards compatibility will have a hard time gaining a foothold as a result, IMO.

And speaking as a long-time coder, while C/C++ are a hassle to program in, I don’t think it’s fair to lump Java into the mix. Java fixes a lot of the nitty-gritty problems C++ inherited from C, and is a very clean language to do development in. The only thing that hampers Java vs. C/C++ is the perceived performance hit of the JVM.

No debate here as I’m not much of a programmer myself, but after browsing this thread, I quite coincidentally stumbled upon a pretty cool historical timeline of programming languages.

Hopefully, my two cool links make up for the slight off-topicness of my post. Carry on.

That’s not quite true. The non-deterministic garbage collection is a big issue for a lot of folks. Additionally, as I understand it, there’s no equivalent to C/C++ linker optimizations, so for those of us who care about writing really small code, it’s not so hot either.

That said, Java’s a very good language for a lot of applications and is definitely worth knowing.

Yeah, but what’s the overall impact this quarter? Remember, the whole Y2K issue only existed because there was no short-term benefit to converting all that old COBOL code to use four-digit dates until it was a crisis.

We played with ML many years ago. People say that functional languages are intuitive, but I never found that to be the case. Is anyone writing significant amounts of real-world code in ML?

BTW, heard about object-oriented Cobol? It’s called “Add 1 to Cobol”.

How much of a programmer’s job is maintaining existing code?
What language is the existing code written in?

Reduced development costs? Maybe if measured by how long it takes an experienced programmer in each language to build something useful. But there’s a lot more to the cost of developing an application than just how efficient a language happens to be. There’s the existing infrastructure, measured in compiler licenses, tools, libraries, and most especially the collective knowledge of the staff. Then there’s the fact that you can find C++ programmers off the street, whereas new languages have to be taught. Java actually had this problem for a long time: inexperienced programmers having to learn as they go. That’s an expensive way to develop.

Languages like C++ and Java have a huge amount of support infrastructure around them. The internet is an invaluable resource to me. I can find answers to questions in seconds that would take hours if I had to dig around in textbooks. The set of tools available for these languages is immense. We have a pre-existing library of code that we’ve developed over decades that we can draw from when writing new apps.

So for a new language to be worth adopting, there have to be significant productivity gains in the reasonable future to pay back the transition costs.

The languages that are really making a fast entrance into the marketplace are the ones that allow you to do things you couldn’t do before without a great deal of effort: ASP, JSP, and XML, written against RAD application servers like IIS and Tomcat. They’ve put complex programming tasks within the grasp of many companies, greatly expanding the software industry.

Java took off because it promised true platform independence with its JVM, and that excited a lot of people. In the end, I think Java somewhat failed at that, but once it had a foothold people found it was pretty good for other things, without a lot of the annoying scaffolding and boilerplate needed for C++ programming.

I think that’s right, and the relative closeness of Java syntax to C++ is a very big deal. The barrier to entry into Java is very low.

Also don’t underestimate the value of the free Sun compilers and all the free tools available for Java. The Sun JDK and Eclipse as an IDE make for a pretty strong development environment, all free.

Not so much the perceived performance hit of the JVM, but the managed garbage collection. One of the big advantages of the new .NET languages is that they allow you to mix managed and unmanaged code, and that they come out of the gate with a huge infrastructure around them from Microsoft.

Our company has moved into .Net in a big way. We’re abandoning Java.

Which is great for an all Microsoft shop, not so great for a heterogeneous environment.

Sam, you previously stated that Java failed at the portability thing and I can’t quite figure out how that is the case.

I write Java code and run it at work on my PC, the AS/400, and our Sun box, unchanged. Seems to be doing the portability thing OK. I assume you must be looking at it from some other angle; GUI apps, maybe?

One big, big thing is that companies are unwilling to write code in languages that aren’t guaranteed to exist in 20 years, and a language has no chance of existing in 20 years if nobody writes code in it. Code survives for mind-bogglingly long periods of time relative to the rapid pace of computer development, and the vast majority of a programmer’s time is spent in maintenance mode, not development. Nobody wants to be stuck with a piece of code which is no longer supported and has no known compiler for current machines.

Another thing is that while some good computer science courses teach you how to learn how to program, graduates of poor courses and the majority of self-taught programmers only know how to program. The course I went through was very big on teaching the fundamentals of the profession. This meant lots of linear algebra and discrete math, algorithm design, object-oriented philosophy, good style, etc. The goal was to produce a student who could sit down with a completely unfamiliar language and be able to do moderately complex tasks after 3 weeks and exploit the more powerful features after a few months. OTOH, the necessary trade-off is that you don’t have the time to teach XML or Tomcat or IIS or .NET or any of the other flash-in-the-pan technologies. Meanwhile, the college down the road is teaching all this stuff, and their graduates go out and earn $10,000 more than ours straight off the bat, which is immensely appealing to a young high school student choosing university degrees.

There are actually some wonderful languages out there which do some really cool stuff. I feel privileged to have been able to program in some of them because, even though I don’t use them regularly when I program, just knowing about them has improved my programming dramatically. However, you just can’t teach a language like that unless you have a clear expectation that your students can jump in the deep end and learn the fundamentals of the language in a few weeks without excessive handholding. Which is a real shame.

A lot more than it should be, at least in my experience. Maintaining (or, more precisely, adding new features to) software that’s over 20 years old is not uncommon. Open source/freeware kinda makes it worse, as some managers decide it’d be easier for you to take an existing open source project and modify it for your needs instead of writing something from the ground up.

Depends on the industry. I’ve seen legacy COBOL code at an insurance company, legacy FORTRAN code in mathematical/scientific research, and a truckload of legacy C/C++ code in aerospace and defense. Off the top of my head, I’d guess C/C++ followed by FORTRAN would be the biggies.

The best choice of language (assuming a completely trained staff) is highly dependent on the task at hand.

I personally have used, in the last 6 months, about 7 different languages (including scripting languages, etc.), each one chosen for its applicability to the task at hand.

For business apps, I can guarantee you that there are more lines of COBOL out there running on mainframes than anything else (40 years of accumulation will do that).

C/C++ certainly is tops in any kind of consumer PC software, which may exceed COBOL, but they are kind of two different markets.

Only assembly can do certain hardware-specific things, and C, more reliably than most (all?) other languages, can interact with assembly code at a low enough level that only a very minimal amount of code need be in assembly.
Thus, OSes and drivers are written, at their core, primarily in C. Any language which maintains compliance with C can interface directly into the APIs of the OS it is expected to run on. Hence C/C++ for Windows, C and C-based scripting languages for the Unix family, and C/Objective-C for the Mac.
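As a rough sketch of what that “compliance with C” hook looks like from outside the C family, here’s how OCaml (to pick an ML) declares a binding to a C function. The stub name caml_demo_getpid is hypothetical; a real binding needs the small C wrapper shown in the comment compiled and linked alongside:

    (* Hypothetical C stub, compiled separately and linked in:

         #include <unistd.h>
         #include <caml/mlvalues.h>

         value caml_demo_getpid(value unit) {
           return Val_int(getpid());  /* wrap the raw OS call */
         }
    *)

    (* OCaml side: declare the foreign function, then call it
       like any ordinary OCaml function. *)
    external getpid : unit -> int = "caml_demo_getpid"

    let () = Printf.printf "running as pid %d\n" (getpid ())

Either way the point stands: the OS speaks C at its boundary, so every other language pays a toll like this to get in the door.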

Java has only been able to become applicable because, beyond being simply a language, it requires that the VM also provide an entire API, effectively a full virtual OS. Most languages do not do this, so you’ll automatically be forced to go through some hoops to do anything on your platform toward making a real application.

And of course, for the same reason that C became big, C++ with its lack of safety nets provides an environment that allows anything to be accomplished. .NET, COM, and scripting languages can only exist because someone went about doing things that would generally be considered scary, and thus not allowed in most existing languages.

So, simply put, the C family will continue to be a powerful force because, at heart, everything else is dependent on the existence of such a language.

Another reason: functional languages just aren’t taught as much, nor are students willing to learn them. (Don’t blame me, I’m just the messenger; I like functional languages and think they should be a major part of the curriculum.) I TA’d an introductory CS class for two years that used Scheme. There was only a handful of people who didn’t complain endlessly about it and how they’d rather be using C. Of those students, I think about half actually got functional programming.

Next semester, the university is changing the intro class to use C; the students will get no exposure to functional languages outside of maybe some Lisp in an AI course.