The Best Programming Language Ever

Well, if you’ve managed to do “Hello World”, you’ve at least made it past the usual “How’s this input/output system work?” stumbling block. :slight_smile:

(Nearly) 20 years is impressive; have you been playing with the language since it was first developed? At any rate, my usual response to people’s initial difficulties with Haskell is to blame the way things are taught rather than the language itself. But then, of course I would say that, since I rarely put myself to the test of actually attempting to teach it myself…

It’s worth noting that C, Java, and indeed the vast majority of mainstream languages all use parentheses for exactly the same thing Scheme does: function application [though with very slightly different syntax; in C one writes f(x, y, z), whereas in Scheme one writes (f x y z)]. The only real difference is that Scheme programming typically involves a lot more nested function applications. But this is not fundamentally a syntactic issue.
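To make the parallel concrete, here’s a minimal sketch (all names are hypothetical, not from any poster’s code) that models Scheme-style prefix applications as nested Python tuples and evaluates them recursively — the parentheses mark function application either way:

```python
import operator

# Hypothetical sketch: a Scheme-style application (f x y ...)
# modeled as a nested Python tuple (f, x, y, ...).
def apply_sexp(expr):
    """Evaluate a nested (f, x, y, ...) tuple recursively."""
    if not isinstance(expr, tuple):
        return expr  # a literal, e.g. a number
    f, *args = expr
    return f(*(apply_sexp(a) for a in args))

# (+ 1 (* 2 3)) in Scheme; add(1, mul(2, 3)) in C-style syntax.
nested = (operator.add, 1, (operator.mul, 2, 3))
print(apply_sexp(nested))  # 7
```

The nesting is the same in both notations; only the placement of the operator and the commas differs.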

Just as C, and a whole class of similar programming languages, is essentially fully used and exhausted by the time you finish learning the standard teaching example of printing “Hello world!” to the screen…

Same as in Unix, except in Unix the data was born human-readable and doesn’t have to be converted.

So, it’s really no different from Unix, then.

But it’s a lot more enjoyable reproducing a DNA program than any other computer language!

It’s completely different from Unix, because constantly having to extract data from a user-formatted representation of it is a stupid task.

It’s a subtle change, but f(x, y) is a lot more readable than (f x y). I don’t understand why all the greatness of Scheme can’t be kept while the syntax is completely redone.

You mean replicating DNA …or… splitting+recombining DNA?

The first happens without involving another person.

The second requires a lab technician to get a frozen sperm sample and implant it into an egg in a dish. Looks more tedious rather than enjoyable work even with today’s fancy microscopes.

I’ve always felt the old-fashioned, manual way works fine (one man, one woman, dark room, candles, background music, etc.).

The best time I ever had programming was in LabVIEW. The LabVIEW language, G, was easy for me to understand; I drew my programs and they matched the way I think. Then I made control panels for input and output.

Such a pity that the LabVIEW system was extremely proprietary to National Instruments, and therefore very expensive. LabVIEW never accreted an ecosystem of independent users, developers, and extenders around it: there was the expense of acquiring it, and there was no real way to save ‘source code’ so that a free workalike could be constructed, so it never really made its way out of the niche of instrumentation programming.

DNA is self-documenting; it just takes a couple billion years to finish.

I’m inclined to agree - Pascal has the language features that make BASIC popular (i.e. being readable like English), but structural features in common with many serious languages (explicit types, code blocks, ignored whitespace, etc.).

shrugs. It could be, I suppose [indeed, it’s a trivial front-end you could manually place on any existing Scheme interpreter], but people grow accustomed to one syntax or another and see no need to change. Indeed, I see no need to change; I find (f x y) just as readable as f(x, y), or even more so in some contexts [one might just as well complain about commas as about parentheses; either way, it’s such a trifling thing to take issue with]. Of all the issues to judge a language on, this particular one is way, way down there, I think.

[Though my truest love is for a syntax in which mere juxtaposition is function application, with parentheses only necessary for explicit tupling. You can guess which language does this…]

Absolutely not. Syntax is extremely important. User interface is extremely important. GUIs are extremely important. There’s a certain subset of people, often from academia, who completely fail to see the inherent truth of this. Those people cause all the evil in the world.

Yes, even dead kittens.

There’s a difference between user-formatted and human-readable. The primary advantage of the Unix method, though, is debuggability: binary blobs can’t be inspected when something is going wrong upstream, while text wins out because you can, in extremis, dump it into a text file and use a known-good editor (as opposed to a possibly-broken special-purpose inspector) to see what’s going on.
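The “dump it to text” move can be sketched in a few lines of Python (the record and formats here are hypothetical illustrations, not anyone’s actual data): the same record is opaque as a binary blob but inspectable with any known-good editor once serialized as text.

```python
import json
import pickle

# A hypothetical structured record.
record = {"id": 7, "status": "ok", "payload": [1, 2, 3]}

# Binary form: opaque without a special-purpose inspector.
blob = pickle.dumps(record)

# Text form: readable (and diffable) with ordinary editors and tools.
text = json.dumps(record, indent=2)
print(text)
```

Both forms round-trip to the same data; only the text one survives when the special-purpose tooling is broken.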

To you, maybe. I find it’s easier to edit code that is syntactically homogeneous: My editor always knows how to indent things and can prevent me from making nearly any syntax error.

Because then macros would be a lot harder. You really can’t understand any Lisp (Scheme included) without understanding the power of being able to modify code in arbitrary ways using Lisp functions and the whole Lisp toolkit. Things like C++ templates and AspectJ are, at best, a workable imitation of some subset of what can be accomplished easily with Lisp macros.

Macros make code readable by folding both common boilerplate and complex but tedious transformations into simpler syntactic forms.

For example, it’s possible to implement your own object system without any help from the language. People do it in C every so often. However, when you have macros you can do all the tedious stuff (method dispatch, polymorphism, inheritance, etc.) once and reuse that code anywhere it’s needed.
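As a hedged illustration of “an object system without any help from the language” — a toy sketch, not CLOS or anything a poster wrote, with all names invented — here is single-inheritance method dispatch built from plain dicts, much as C programmers do with structs of function pointers:

```python
# Hypothetical sketch: a tiny hand-rolled object system.
def make_class(name, parent=None, **methods):
    """A 'class' is just a dict holding a parent link and methods."""
    return {"name": name, "parent": parent, "methods": methods}

def lookup(cls, selector):
    """Walk the inheritance chain to find a method."""
    while cls is not None:
        if selector in cls["methods"]:
            return cls["methods"][selector]
        cls = cls["parent"]
    raise AttributeError(selector)

def send(obj, selector, *args):
    """Method dispatch: find the method on the object's class, call it."""
    return lookup(obj["class"], selector)(obj, *args)

Animal = make_class("Animal", speak=lambda self: "...")
Dog = make_class("Dog", parent=Animal, speak=lambda self: "woof")

rex = {"class": Dog}
print(send(rex, "speak"))  # woof
```

The tedium is exactly the point: with macros, the `make_class`/`send` boilerplate could be folded behind a `defclass`-like form and written once.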

Macros allow you to name and abstract away code that can’t be named and abstracted away into a function. Macros do to code what functions do to values.
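The “functions are to values as macros are to code” idea can be mimicked (not reproduced — Python has no real macro system) with the standard `ast` module: a sketch of a pass that rewrites code before it runs, folding literal multiplications at “compile time”.

```python
import ast

class FoldMul(ast.NodeTransformer):
    """A macro-like pass: operates on code, not values, folding
    x * y when both operands are numeric literals."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold inner expressions first
        if (isinstance(node.op, ast.Mult)
                and isinstance(node.left, ast.Constant)
                and isinstance(node.right, ast.Constant)):
            folded = ast.Constant(node.left.value * node.right.value)
            return ast.copy_location(folded, node)
        return node

tree = ast.parse("result = 6 * 7")
tree = ast.fix_missing_locations(FoldMul().visit(tree))

ns = {}
exec(compile(tree, "<macro>", "exec"), ns)
print(ns["result"])  # 42
```

Where a function would receive the value 42, the transformer receives the expression `6 * 7` as a tree — which is the distinction the post is drawing.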

(Oh, and they tried to do what you said. They called the result Dylan, which is a nice enough language but doesn’t have the great macro system all Lisps do.)

I agree fully. That’s why I love Common Lisp and its macro system. (Scheme’s macro system is nice, and just as powerful in its own fashion, but it doesn’t mesh with my brain as well.)

It’s been done. More than a few times. People hate it.

Lisp takes some getting used to, but for me it had the exact opposite effect: I can’t stand languages that have stupid unnecessary syntax for no good reason. I’m looking at you -> Erlang end.

shrugs again. I’m not saying syntax in general cannot be important. I’m not opposed to, or unappreciative of, syntactic sugar. I’m not even wedded to the overwhelmingly dominant paradigm of having syntax be text-based [one could easily imagine something else being more appropriate for many purposes, as you perhaps hint at].

My position isn’t “This issue is unimportant because it’s mere syntax”. My position is that this particular syntactic issue happens to be of negligible importance, regardless of the value of other interface issues.

Perhaps code should be stored merely as some encoding of its abstract syntax tree, and then particular editors could render it in whichever fashion one prefers. [Perhaps Scheme code already is essentially such an encoding, and if you’d like to manipulate it in some syntax beyond the default, one could easily slap on a front-end to do so, just as with any other customizable component of an IDE…]. Then we could put this sort of argument to rest and spend time concentrating on the differences between programming paradigms which involve conflicts more fundamental than superficial window dressing [again, not all syntactic issues are matters I would dismiss as superficial window dressing, but this particular pair of contrasts is].
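Python’s standard `ast` module already permits a small version of this experiment: parse idiosyncratically formatted source into a tree (the “stored” form), then let a renderer — here `ast.unparse`, available since Python 3.9 — re-emit it in its own preferred surface style. A sketch, with a made-up input:

```python
import ast

# Hypothetical sketch: treat the AST as the canonical stored form and
# the printed text as just one of many possible renderings of it.
messy = "x=f( 1,2 ,  3 )"   # idiosyncratic surface formatting
tree = ast.parse(messy)      # the structural "stored" form
print(ast.unparse(tree))     # one renderer's choice of surface syntax
```

A different renderer attached to the same tree could just as easily print `(f 1 2 3)` or draw boxes and wires — which is the point: the dispute is over renderers, not over the program.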

You read my mind exactly.

Boo does very advanced macros, and it doesn’t need maddeningly nested parentheses. Imagine Python with all the capabilities you’re talking about. Perhaps there are more rules to writing a macro than with Lisp, but you get a better result.

CTS objects are not binary blobs. You can examine them fully (in the debugger or programmatically). Debugging PowerShell is easier, not harder. And because it’s statically typed, etc., throughout, you make fewer mistakes in total. For various obvious reasons, text-based interchange is downright primitive.

Anybody care to give me a link to information about D?

My choice, btw, is Java. I have coded in C, C++, AWK, Perl, Prolog, CLIPS, Metropolitan English Language (MEL) and who knows what all, but Java is the language where my program is most likely to work with relatively little debugging, and still be maintainable a year later.

D programming language

I wondered if LabVIEW was going to come up. I didn’t list it in my comparisons because, while using it does amount to programming, I am still undecided on that debate about whether or not it is a language.

But I did spend much of the last couple years programming with that glistening turd of a hack job, and still awaken sobbing at night because of all those hours I will never get back.

It’s buggy as hell:

- The development environment keeps crashing while you’re just editing code.
- It spontaneously rearranges “clusters” (its version of structs) and breaks the code; this has been a known issue, under various CAR numbers, for the last two major versions.
- It has the most daffy idea of variables I have ever handled (numeric variables, for example, have a font).
- It is designed for data acquisition, but uses one gigantic half-gigabyte driver, which doesn’t support some modes depending on the communications bus in the background.
- Some subroutines you write will work fine by themselves and the first time you use them, but using them in a second location breaks the behavior of the first instance (due to the odd defaults and documentation for reentrant code).
- Moving assemblies with feedback nodes in them often breaks the wiring inside the assembly (like scrambling the identities of variables when you cut and paste text code referencing them).
- Minor version releases come out several times a year, each introducing new little glitches and quirks, so we keep having to jump between versions depending on which glitch is more troublesome in which way.

LabVIEW’s success may be due to its low starting threshold: you can learn to write some simple working programs in a few minutes, and without any instructions you might guess how to write a “Hello world” program just by staring at the IDE. But this is a double-edged sword; as things get more complicated, it becomes very inefficient, with far too much mouse fiddling. Flow charts were intended to clarify things, but try drawing a flow chart representing every single program detail and you’ll get the idea (not that LV “programs” equal flow charts, but there are similarities). Or its success may be due to NI’s habit of buying competitors and smacking them with a giant hammer. Either way, I hear the LabVIEW shop contains many programmers unhappy with the low standards for stability and the overeager schedules that push buggy software out the door far more often than anybody needs. Programmers probably always wish for more time to debug, but I don’t know of any other programming environment this troubled.

So, anyway, I hope this doesn’t tread on anybody’s feelings, but if another thread about “The Worst…” ever comes along, at least I’m ahead of schedule.