Why aren't languages like ML taking over from C++ in the IT industry?

Dead my ass. FORTRAN is still heavily used in science and engineering. As an aerospace engineer, all the important work-related codes I use are FORTRAN, as is most of the new code I see being produced. Heck, the 32-year-old engineer sitting next to me uses a lot of FORTRAN 77, since that’s what a great deal of his early codes were written in, and the time to convert them even to 90 would be too much of a pain. When you have thousands upon thousands of lines of highly technical code that works well, there is no impetus to mess with it.

I’m probably the only person in the world who could say this (at least I hope so). My company, specifically the project we’re working on right now (the one the VPs are all in a tizzy about), would pay serious money for a top-notch Smalltalk developer. You have no idea how depressed that makes us.

Enjoy,
Steven

This is unproven. Seriously, you may strongly believe it to be true, but it hasn’t been proven in the marketplace, by a real business doing real-world development. Now don’t get me wrong, I am not claiming that it’s not true; I’m saying that it has not yet been proven to be true, in a way that would convince a VP of software that the extraordinary cost of switching would be worth it.

Well, in the first place, a true switch from one language to another happens less often than you seem to think. But really, you answered your own question in the OP: C++, C#, and Java are not that much different from C, certainly not compared to how different ML is from C or C++ or Java. So if you’re at a company that’s been doing all its development work in C, adopting C# or Java for new development isn’t such a big change; your engineering staff can still leverage a lot of their C experience. Making the switch to ML, as you yourself point out, would mean the developers would have to be fully retrained in the new language and new ways of thinking and developing. There’s just a huge, huge difference between moving from C++ to Java, and moving from Java to ML.

In our engineering group we’ve almost completely switched from C/C++ to Python for all new programming projects. As one of my colleagues who was the first to switch put it, “Anything that needs to be done in C, I can write four times faster in Python, and it will have fewer bugs.” He gradually convinced the rest of us.
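
Just to make the claim concrete (this is my own toy example, not anything from our codebase): something like tallying word frequencies is a handful of lines of Python, where the equivalent C would need a hand-rolled hash table and manual memory management.

    # Hypothetical example of the kind of task he meant: count word
    # frequencies from stdin and print the ten most common.
    import sys
    from collections import Counter

    counts = Counter(sys.stdin.read().split())
    for word, n in counts.most_common(10):
        print(n, word)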

I only use C/C++ for legacy code nowadays.

Ed

I’m expected to retrain myself in new languages, as needed, if I expect to keep my job.

Ed

On what basis do you think the need is decided?

Speaking as someone who’s done some commercial work in ANSI Common Lisp (the best programming language ever in the entire past, present, and future of the universe), and thus knows a bit about the niche language scene…

Check with the Smalltalk vendors (even the ones whose product you aren’t using) to see if they have a resume database and/or know of some good contractors. Check to see if there are any local Smalltalk users’ groups, and go to one of their meetings or check out their email list or blog. Hang out at any Smalltalk cultural hot-spots (for Lisp it was comp.lang.lisp and CLiki; for Smalltalk I’d probably start at www.squeak.org and www.whysmalltalk.org and go from there). Find out the “big” Smalltalk open source projects and ask on the mailing lists. Look up the people involved in organizing Smalltalk conferences and ask them if they know of anyone good who is looking for work.

Smalltalk may not be as popular as C, but it’s not that obscure a language. It’s not like you’re looking for a Simula or SNOBOL developer. :wink:

When the boss says, “You need to do code in __________ starting tomorrow,” I’d wager.

And when a Lisp programmer talks to you about obscure languages, you listen. :smiley:

Yeah, that’s the point. No one’s going to be ordered to start programming in ML until the suits see enough benefit to outweigh the cost of conversion.

I’m feeling very old because I got that. :frowning:

Not a developer but I’ve used it in the past week. :frowning: :frowning:

I notice that nobody has mentioned the main reason for not programming in machine language: it is the very opposite of portable. If Intel changes the next generation of processors too much, your code is a bunch of worthless ones and zeros.

Psst!

God DAMN it! If people are going to go around naming languages the same as existing languages… (sputtering too much to make even as little sense as I usually do)

I haven’t been this mad since IBM commandeered the term “Personal Computer.”

Exactly right. And that’s the point I’ve been trying to make: The answer to the OP is that Those Who Make The Decisions (aka the suits) need to see and believe that there is a clear benefit – in dollars and cents – to making such a switch that outweighs the costs of making such a switch. And believe me, the costs are very, very high. Yes, suranyi, you are expected to retrain when needed, but no one with an IQ of at least two digits believes that such retraining has no cost to the company you work for, either in direct dollars for the classes, or in time for you to “get up to speed”.

It’s no more complicated than the old cost/benefit ratio. Go to a VP of engineering or an IT manager and convince him that switching to ML puts that ratio well below one, and it’ll be done.

That is so unbelievably cool. :cool:

Speaking as a person who hires, employs, and fires programmers (and who does a fair bit of coding herself) - the programmer is much more important to me than the language. We went through a phase of throwing money at hiring C++ programmers because C++ was going to be the Wave O’ the Future. But the experienced, tried-and-true programmers were the ones who got the shit done - in FORTRAN. While the C++ gurus were fighting over specifications of polymorphic pointers and waging a “braces war” that cost us untold amounts of money in re-work, the FORTRAN programmers on a parallel track were delivering code that, while “inelegant”, ran, and made us money.

Now we have an evolution of that. The older, experienced C++ coders are the ones who are getting the shit done, and the new Wave O’ the Future coders are the ones fucking things up and costing me money. So I’m not interested in any major new language change, unless and until we have a driving, overwhelming need to do so.

Thankfully, some new languages have little overhead to learn - like PHP, for example, which we did have an overwhelming need to learn. Although it took a lot of work to convince people to try it, our PHP coders now run circles around the ActiveX and Java “gurus” who keep saying “but PHP is open source, you can’t trust that! Open source code is only used by script kiddies and hackers!” :rolleyes:

And thinking that makes you almost as big a loser as me! :wink:

But for what it’s intended for, it gets the job done. Me, I usually use it to strip high bits out of old WordStar files.
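
For anyone wondering what that actually involves: old WordStar document files set the high bit on certain characters as formatting markers, so stripping the high bits just means masking each byte with 0x7F. Here’s a minimal sketch in Python rather than machine code:

    # Mask off bit 7 of every byte to recover plain seven-bit ASCII from an
    # old WordStar document (WordStar used the high bit as a formatting marker).
    import sys

    data = sys.stdin.buffer.read()
    sys.stdout.buffer.write(bytes(b & 0x7F for b in data))

Run it as a filter, e.g. python strip7.py < old.ws > clean.txt (filenames hypothetical, of course).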

This is the absolute key here, and the mark of a good programmer is understanding that what I’ve quoted is true for any language. C, C++, Java, and every other language we’ve discussed are not perfect. They all have significant problems. But for each one of them, I can point to an application for which that language is the best.

Let’s see, I maintain and support a system written in C on a Unisys UNIX System V box; I also work with a system written in COBOL on a Unisys 2200 mainframe; and the new system that’s been in development for 5 years is written in VB 6.0.

These, of course, are not scientific or engineering systems, but just the software that runs a multi-million-dollar company with hundreds of employees.

What’s ML and what’s C++?

Ugh. I hate to say it, but I’ve found that this often costs more in the long run. I’ve seen companies where people were praised for “getting the job done.” They were applauded as heroes for their fast work, but after a while, their hasty patchwork began to fall apart. From what I’ve seen, this often happens after just a few months – not necessarily years later.

Mind you, I realize that there are times when quick and dirty (“inelegant”) work is needed. That should be the exception rather than the rule, though. I also realize that your experience with braces wars may have been a bit extreme. Still, I’m reluctant to offer praise to programmers who “get the job done,” since I’ve seen how these quick results can often be a ticking time bomb.

This is why I think the term “software engineer” is much abused. It’s a common job title, but most of the “software engineers” I’ve known are really just programmers. A real software engineer approaches coding in a systematic and disciplined way – with careful planning, an understanding of design trade-offs, and an eye toward elegance and maintainability. A lot of companies scoff at this, preferring quick coding and seat-of-your-pants design – and I think they pay for it before long.

The thing is, they might not even realize that they’re paying for it. When it comes time to wrestle with bugs or performance issues, they might never realize that these problems occurred because of a cowboy approach to programming. All they know is that their programmers get the job done, without realizing that these issues could have been avoided with a slower, more fastidious initial design. This sort of thing can make management nervous, since it often appears that nothing is being accomplished. In reality, though, it can save months – if not years – of heartache down the road.