What programming language would most likely be used these days?

I think the issue is that, even in C++, operator overloading isn’t a natural part of the language – it’s something they would have had to put time into working on, and if it’s not going to be used often, and when it is used, frequently not used well, why do it? Most Java classes are not algebraic entities, so wouldn’t benefit greatly from operator overloading anyway, so why complicate the language to support it?

(My point about Scala is that something equivalent to most of operator overloading flows naturally from the standard rules of the language, so it is more sensibly supported.)

“Used” is not quite the same thing as “loved”, though. And arguably using “+” for a non-commutative operation is itself an abuse.

The problem with features like operator overloading is that they make maintenance harder. Often in business, you end up having to debug code written by other people. When I’m looking at some generic code like:

object1 + object2

I don’t have any intrinsic idea of what ‘+’ means in that context. I have to go look up the source and see what it does. But if the code was:

object1.append(object2)

Then I get an idea of what the method is supposed to do. I can more easily read through unfamiliar code and get an idea of what it’s supposed to be doing, so I can home in on where the bug is.

I find it hard to debug Python for similar reasons. For example, method parameters aren’t typed, so I waste a lot of time trying to figure out what is getting passed into the methods.

Often when I hear a language is ‘powerful’, I take it to mean that there are lots of tricks and nuances which make maintenance more difficult. Also, it will be harder to find accomplished programmers since there is more to learn.

Unless you have a very good reason to use a niche language or a loosely typed language, it’s better in the long-run to use a language which is common, simple to understand, strongly typed, and compiled. You’ll have fewer bugs and it will be easier to maintain.

Yeah, the bit about LISP was kind of a joke – but a semi-serious joke. My point was that the foundations of language design are very old. Many beautiful, expressive, and useful high-level abstractions were discovered long before Java came on the scene, and Java benefited from almost none of them – it just regurgitated a wordier, albeit better structured, C.

Haskell supports “operator overloading” with similar elegance, even allowing you to define binding power for infix functions. It’s one of my favorite features of the language. :slight_smile:
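For instance, a user-defined operator can be given its own precedence and associativity with a fixity declaration. A minimal sketch (the operator name `(<+>)` is purely illustrative):

```haskell
-- Sketch: a user-defined infix operator with an explicit fixity.
-- The operator name (<+>) is chosen purely for illustration.
infixr 5 <+>  -- right-associative, precedence 5 (same as ++)

(<+>) :: [a] -> [a] -> [a]
xs <+> ys = xs ++ ys

main :: IO ()
main = print ([1, 2] <+> [3] <+> [4, 5])  -- prints [1,2,3,4,5]
```

Because of the `infixr` declaration, the expression parses as `[1, 2] <+> ([3] <+> [4, 5])` without any parentheses in the source.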

The software company I was working for (admittedly three years ago) was using Visual Basic.

Is there any language on Earth that doesn’t overload the plus sign operator? Adding two floats together is a different operation from adding two integers together, but I’ve yet to see the language that used anything other than “+” for both.

Huh, maybe I ought to learn LISP. I’ve programmed HP calculators, and while I found it terribly idiosyncratic (HP engineers seem to love re-inventing the wheel), it also had some very handy features.

Hey Stealth Potato. LISP be cool. Let me throw a log on the fire. FORTH. Yeah baby…

Jragon. Are you aware of any Go forums that talk about using Go with any of the Internet of Things protocols?

Yes. I can’t remember what language it was, maybe Prolog, but I used one once where to add floats you had to explicitly use the +. operator. Same with other operations, multiplying floats was *. and so on.

OCaml. (+) is integer addition, (+.) is float addition.

Haskell’s type classes also solve the problems with ad hoc polymorphism that filmore was complaining about. When I see the Haskell code:



  foo :: Monoid a => a -> a
  foo x = x `mappend` mempty


I know quite a bit about what the overloaded mappend function and overloaded mempty value are doing, because the Monoid type class has a set of algebraic laws associated with it. I know, for instance, straight from the monoid identity laws, that foo is equivalent to:



  foo' :: Monoid a => a -> a
  foo' x = x


which in turn is just a type-restricted identity function:



  foo'' :: Monoid a => a -> a
  foo'' = id


The same holds for most type classes in Haskell. You identify a repeated pattern in code, you abstract it into a type class, and you associate algebraic laws with the type class which should be satisfied by all instances. You can even tell the Haskell compiler to optimise code using those rules, by informing it of the laws with rewrite rules.
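A sketch of what such a rewrite rule might look like, using the monoid right-identity law (the rule name is arbitrary; note that rules matching class methods like mappend are fragile in practice, because the method can be inlined to its instance definition before the rule fires):

```haskell
-- Sketch: encoding the monoid right-identity law as a GHC rewrite rule,
-- so the optimiser may rewrite `mappend x mempty` to plain `x`.
-- The rule name is arbitrary; rules over overloaded methods like
-- mappend may never fire if the method is inlined first.
{-# RULES
"mappend/right-identity" forall x. mappend x mempty = x
  #-}

foo :: Monoid a => a -> a
foo x = x `mappend` mempty

main :: IO ()
main = print (foo "abc")  -- behaviour is unchanged either way: "abc"
```

Whether or not the rule fires, the program’s result is the same; the rule only tells the compiler it may skip the redundant mappend.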

And introduced garbage collection to Algol-style languages. And created the first mass-market virtual machine. And enabled object-oriented programming in the C++ style without the overhead of C++.

I won’t be too full-throated in my defense of Java, but I don’t see the point of all the Java hate. It does what it does, and what it does is something a lot of people want to do.

No doubt the connection is that Scala’s purpose is to be able to do Haskell-like things in a Java-like language.

There was a recent article on the topic of the failure of weak typing.

No, you’re right – all those things are to Java’s credit. The main reasons I dislike Java are its clunky and uninspired syntax and its abject lack of high level abstractions. Also its standard library is quite poor, owing perhaps to the lack of abstraction in the language.

Agreed. I’m going with Go.