easy + eclectic programming language - why no?

jally offers the example:

Actually, I believe that this example is valid in every context-free language ever devised. (I say “I believe” to try and eliminate the cries of triumph that would arise should someone find an example from 1962 in which it is not valid. I also note that optimizing compilers may not give the expected results). So, if we limit ourselves to this example as the touchstone of intuitiveness (and, of course, we ought not to), HLLs are intuitive. Of course, programming in this fashion is also a pain very low in the back. Much easier to write:


do i = 1 to 1000 by 3;
   *some stuff*
end

than it is to write:


i = 1;
inc = 3;
do while (i <= 1000);
   *some stuff*
   i = i + inc;
end;

(Finally, a use for the <code> tag!)
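The same contrast carries over directly to modern scripting languages. Here is a sketch of both forms in Python (my own translation, with the bounds borrowed from the examples above); the loop values are collected into lists so the two forms can be compared:

```python
# Counted loop: bounds and step on one line, as in "do i = 1 to 1000 by 3".
counted = []
for i in range(1, 1001, 3):    # range's stop is exclusive, hence 1001
    counted.append(i)          # *some stuff*

# Manual equivalent: the loop machinery is spread over several statements.
manual = []
i = 1
inc = 3
while i <= 1000:
    manual.append(i)           # *some stuff*
    i = i + inc

# Both produce the same sequence: 1, 4, 7, ..., 1000.
```

The counted form keeps the start, limit, and step in one place, which is exactly the intuitiveness argument being made above.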

Akatsukami:

You might just as well do this:

reset N TO 0
do till N=102
Print “N,”
Add 2 to n
end loop

I think that’s clearer than:

do i = 1 to 100 by 2;
some stuff
end

To the guys who suggested AppleScript:

How come I never heard of it? Is there a good online tutorial for AppleScript? (For example, Barta wrote an online tutorial for Java; is there one for AppleScript?) You say it’s slow. Would it be suitable, say, for programming a forum script (such as this one, or wwthreads, or ubb)? Actually, I have in mind more of a pen-pal type of script, such as http://users.wantree.com.au/~designer/cgi-html/contact_list.html EXCEPT more flexible, in that it would allow users to input more info. I.e., it would need to have a “see-more” link (as in the full mode of http://www.beholder.net).

Finally, is there anything online which can teach how to adapt (hack?) existing forum scripts in order to tailor one of them to one’s needs, such as an “enhanced” pen-pal script?

jally,

You wrote:

This just re-emphasizes the point made earlier about intuitiveness. I think most people would find the second example more intuitive. It keeps all of the looping control structure on one line, so it’s obvious what’s going on. In your example, it’s not always obvious that the “Add 2 to n” instruction is part of the loop control (especially since you changed the looping control variable from ‘N’ to ‘n’ [wink]). You had to artificially add 2 to your loop-limiting variable so that you would print all of the numbers. This is easy and not too difficult to understand in this fixed example, but in the more general case, where your step size might not be known at compile time, your method gets more complex.
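The fragility described above is easy to demonstrate. A sketch in Python (the function names and the safety cap are my own additions, purely for illustration): a loop that terminates on equality breaks as soon as the step stops dividing the limit evenly, while a `<=` test does not:

```python
def count_until_equal(limit, step):
    """Terminate when n == limit, as in the 'do till N=102' example.
    If step does not divide limit evenly, n skips past limit entirely;
    a safety cap stops the runaway loop so the overshoot is visible."""
    n, printed = 0, []
    while n != limit:
        printed.append(n)
        n += step
        if n > limit + 10 * step:   # safety cap for the demonstration only
            break
    return printed

def count_while_less(limit, step):
    """Terminate when n exceeds limit: safe for any positive step."""
    n, printed = 0, []
    while n <= limit:
        printed.append(n)
        n += step
    return printed

# With step 2 both agree, but with step 4 the equality test never
# fires (0, 4, ..., 100, 104, ... skips right over 102).
```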

I’m guessing you don’t have a Macintosh. AppleScript is part of the Mac OS and can be used to automate many OS-level tasks. One nice thing about AppleScript is that it has a sort of macro capability, or automatic script generation. You can turn on a recorder and just do things in the OS, and all the while the AppleScript editor is recording your actions in a script. You can then come back and make minor modifications to the script. In some cases, it truly is “programming for dummies”.

I don’t think AppleScript would be up to that task on its own. It doesn’t make a good CGI language. AppleScript is intended as more of an OS control language (I can’t think of a Windows analogy). AppleScript can work with other applications, so it’s probably not impossible to do what you want, but I don’t think it would be your first choice as an implementation language.

All computer languages are very simple and intuitive, even C and C++. What’s not intuitive, and probably will never be, is using the features of a language to efficiently perform tasks.

Anyone can learn to hammer a nail into wood, use a saw to cut a piece of wood, etc. But that knowledge does not qualify one to build a house.

In programming we are asked to instruct a machine to perform tasks that are often quite complicated. Unlike human beings, however, machines have no “preconceived ideas” that we can reference and exploit to perform the task. Therefore each tiny step must be explicitly and correctly specified. Therein lies the professional skill of the programmer.

Some languages and development systems “embed” knowledge of very specific tasks. For instance, Perl embeds many techniques of string manipulation. The problem is that either the techniques are very specific, and thus have narrow applicability, or they are general, and thus again require “high-level” knowledge of how to use them to perform tasks.

In short, it is not a deficiency in the languages that makes programming “non-intuitive”; it is the implementation of the tasks that people wish machines to perform, and the precision necessary to implement those tasks without reference to a human knowledge base, that are fundamentally non-intuitive.

Joey:

I think I have an idea of what you mean by a language that works via “recording”. (WordPerfect had something like that, wherein a series of keystrokes allowed you to record various functions, such as “user-input” or merging a primary file w/a secondary file, & much more - enclosed within macros)

Re:

Which language would be most suitable (while at the same time being the simplest one that can efficiently get the above-stated job accomplished)? In other words, there’s no purpose in my reading a lot of jargon that abstract-minded guys created; rather, ONLY the particular commands (no more & no less) that are required for my purposes.

As to the Big Debate, I think the reason programming languages are so complex is probably because the guys who have the types of abstract minds suited to creating a language from a bunch of binary bits lack the simplicity necessary for making it user-friendly. Furthermore, they (and, apparently, many of you :wink:) can’t even understand that the average person would consider it unfriendly. (I didn’t understand the reason why you think most people would consider my “DO TILL” example less intuitive.)

I don’t know how valid this conclusion is. Look at Perl. It was created by Larry Wall, who was (ready for this) a linguist by trade before he began designing the language. His philosophy was to allow Perl to grow as needed, and evolve like a natural language. Result? A very flexible and powerful language–easy to learn the basics, hard to master.

Of course, that could also be said of many natural languages.

jally,

You wrote:

Well, the obvious answer is to start with the code for something that is similar - if you can gain access to it. It’s much easier to ‘hack’ something that already exists to ‘mutate’ it into something else. If I were starting from scratch, I’d probably use perl. However, and this is a key point that you seem to keep missing, each language has its strengths and weaknesses. The absolute best approach looks at a multitude of project parameters to arrive at the optimum solution. Considerations include: What kinds of data structures am I going to use? What kinds of access methods do I need? What kind of execution speed do I need? How bulletproof does the application need to be? Is rapid development important? Is a language I already know suitable? Programming is a game of tradeoffs.

I don’t mean this to be cruel or condescending, however it’s clear to me that you are not grasping the fundamentals. Understanding these abstract ideas is critical. If I had to break down and rank the elements of effective programming, I would do so thusly:

20%   Syntax (understanding the language)
30%   Development process and methodology
50%   Data structures and algorithms

If you attack only the 20% portion of the problem, you can guess how effective your solution is going to be.

NO!!! The point I’ve been trying to make is that programming languages are NOT complex! They are abstract! And complex and abstract are NOT the same thing. Fundamentally, programming languages are extremely simple; because they’re made up of logical laws and ONLY logical laws, they don’t have much to do with the real world, so they appear alien. But so does mathematics. Yet there are some people who “get” mathematics and some who don’t. These tend to be abstract-thinking people who can apply abstract concepts to real-world situations. I believe that ability can be learned (though some people obviously have it more than others), but learning it requires sacrificing the notions of “regular” language (which are EXTREMELY complex) for simpler notions. Many people’s brains don’t want to do this, I have found.

Larry Wall had a degree in something like “Natural and Artificial Languages.” One of the reasons Perl has become so great is its inherent abstractness mixed with context-sensitivity.

Oh, and regular expressions. Those things rock. :slight_smile:

Of course, regular expressions were around long before perl was invented. The unix utilities ‘grep’ ‘awk’ and ‘sed’ use regular expressions and have been around for a long time. It’s possible that unix stole the concept from somewhere else…

      • Yes, unix stole it - from me. You all owe me ten dollars each. No checks, you have the face of a deadbeat.
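Quite true: the lineage runs back through Ken Thompson’s grep and ultimately to Kleene’s work on regular sets in the 1950s; Perl’s contribution was mostly syntax and integration. A minimal example using Python’s `re` module, which borrows much of that Perl syntax (the sample text is invented):

```python
import re

# Pull every dollar amount out of a line of text.
# \$        literal dollar sign
# \d+       one or more digits
# (?:\.\d\d)?  an optional two-digit cents part, in a non-capturing group
line = "paid $10 on Monday and $3.50 on Friday"
amounts = re.findall(r"\$\d+(?:\.\d\d)?", line)
# amounts is now ['$10', '$3.50']
```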

-quoted-
(case insensitive)
reset N TO 0
Reset Incr to 2
do till N=102
Print “N,”
Add incr to n
end loop

  • jally, your example is inconsistent. What is the difference between N and “N,”? -Or rather, what would the difference be? As is, there is nothing to differentiate between printing variable values and characters. - MC
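MC’s objection is the one every real language has to answer with a quoting rule: some marker must separate literal text from a variable to be evaluated. A sketch of the distinction in Python (the variable name `n` is mine):

```python
n = 4
literal = "N,"           # quotes mean: the literal characters N and a comma
value = str(n) + ","     # no quotes around n: use the variable's value

# Without a quoting convention, a line like `Print "N,"` cannot
# say which of these two the author intended.
print(literal)   # N,
print(value)     # 4,
```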

Here are some of my thoughts on why it hasn’t succeeded.

First, obviously, it’s difficult. For most of us programmers, writing a program to do a particular task isn’t particularly tough. But getting the idea for how to do it, that’s the really hard part. Given a little time, any professional programmer could probably write a usable spreadsheet from scratch. But if you go back to (?can’t remember his name?), the guy who wrote Visicalc, the very first one, he probably had a really tough time doing it. You see the same principle at work regularly: somebody writes a killer program with a new concept, and within a few weeks or months there are several clones of it out. Anybody want to start a new thread on what such a language or programming system would look like, how you’d interface with it, what core functions it would need, etc.?

Second. It’s a truism in the programming business that about 10% of code actually does the program’s real job; the other 90% is bulletproofing (those numbers may be a bit of an exaggeration, but the idea holds). What that means is that the remaining code is there to handle exceptions. In the case mentioned a while ago of

	Do 10 times
           xxx yyy zzz

In order to make this program work acceptably, you’d need to add additional code to handle situations like: the data being handled inside the loop is defective or missing and we have to jump out early, or the person executing the program pressed a key so we have to stop and decide what to do about that, or the operation being attempted there failed for some reason (maybe a write-protected disk or a disk-full), or a thousand other things that might happen. The upshot is that if we wrote a stripped-down program that assumed everything worked properly, it would be fairly intuitive to write, but nobody would want to use it. As an aside, this is one of the root causes of complaints about buggy software. If everything went perfectly, just as the programmer expected it to, the bugs probably wouldn’t happen. You seldom hear somebody complain that “every time I type the letter E, my word processor crashes.”
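A concrete, if contrived, version of that split, sketched in Python (the function and its rules are invented for illustration). The happy path, the one line doing the “real job”, is marked; everything else handles the missing, defective, or wrong-typed data the paragraph describes:

```python
def process_items(items):
    """Add 1 to up to ten items; all but one line is bulletproofing."""
    results, errors = [], []
    if items is None:                        # no data supplied at all
        return results, ["no data supplied"]
    for i, item in enumerate(items[:10]):
        if item is None:                     # defective record: skip it
            errors.append(f"item {i} missing")
            continue
        try:
            results.append(item + 1)         # <- the program's 'real job'
        except TypeError:                    # wrong kind of data
            errors.append(f"item {i} not a number")
    return results, errors
```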

Third, if we were able to write one, it would probably be extremely inefficient and slow at execution. In order to function, a high-level language (HLL) program like COBOL or C++ or any of the thousands of others has to use the lower-level language components available to it that are built into the hardware, what is called machine code, which is a programming language unto itself and is far more complex than the high-level languages. During compilation, each HLL instruction is translated into one or more (usually more, and sometimes many more) machine code instructions. The more layers of interaction or hidden functionality you build into the program, the more instructions are needed to execute it. More instructions means more time needed for the hardware to execute them. In general, the more closely a language is tied to the machine-level instructions, the more complex it is, but also the faster and more efficient it is. That’s essentially why a program written in C is so much faster than a program written in BASIC to do the same job.
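This expansion is easy to observe. Python’s `dis` module shows the bytecode (not true machine code, but the same one-to-many relationship) that a few source lines become:

```python
import dis

def loop():
    total = 0
    for i in range(10):
        total += i
    return total

# The four source lines inside loop() expand into a much longer
# sequence of low-level instructions; a native compiler performs
# a comparable (and larger) expansion down to machine code.
instructions = list(dis.get_instructions(loop))
print(len(instructions))
```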

Finally, writing a program is a way of solving a problem in a logical and orderly fashion. It requires a certain mindset to do it effectively. For the majority of people, I suspect it will always remain a mystery, like programming a VCR. Likewise, for me it’s pretty much a mystery how a car works. While I’m a pretty good programmer, I don’t have the mindset for mechanical things, so I rely on professionals for that. The auto industry could probably build a car that’s easy to fix, but the tradeoffs they’d have to make in the process would be unacceptable to the vast majority of customers.

So far everyone has pointed out that programming is essentially difficult, and no amount of simplification can hide that difficulty.

However, the answer is not to make programming easier (which is impossible), but to make it unnecessary.

For example, imagine the HAL 9000. After he is delivered, running, why would anyone need to program him? You want him to balance your checkbook? These days you have to write a checkbook-balancing application. But with a sufficiently advanced AI (with the resources available to it), just say “HAL, balance my checkbook”. The AI fills in all the details. If it has any questions, it can ask you, or make educated guesses.

Another possibility: artificially evolved programs. There are many who think this is the next big thing. Such programs have been shown to come up with quite difficult algorithms on their own, with no one telling them precisely what to do.
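The simplest member of that family is easy to sketch in Python. Below, a mutate-and-select loop evolves a bit string toward a target with no step-by-step instructions about how to get there. This is a toy stand-in for genetic programming, not a serious system; the target, the population size of one, and the mutation rule are all arbitrary choices of mine:

```python
import random

random.seed(0)                           # fixed seed so the run is repeatable
TARGET = [1] * 20                        # the 'problem' to be solved

def fitness(candidate):
    """Count positions that match the target."""
    return sum(1 for a, b in zip(candidate, TARGET) if a == b)

def mutate(candidate):
    """Flip one randomly chosen bit."""
    child = candidate[:]
    i = random.randrange(len(child))
    child[i] = 1 - child[i]
    return child

best = [random.randint(0, 1) for _ in range(20)]
for _ in range(2000):                    # keep any mutation that doesn't hurt
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child
    if best == TARGET:
        break
```

No one told the loop which bits to flip in which order; selection pressure alone drives it to the answer, which is the whole appeal of the approach.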

Actually, Microsoft is working on what they call “Intentional Programming”. I’ve had it explained to me, but can’t remember much of it… but it was quite interesting.

At any rate, I think in about 20 years no one will program anymore. It’s error-prone and expensive. There may be “program designers” but not “programmers”.

Sure there will be! The programmers will just be other computers, instead of humans. Mind you, I don’t expect this to happen any time soon, especially not if Microsquish is heading the effort.

Avumede suggests:

Yeah, that’s what they said about 4GLs, too.

When HAL 9000 is running, we won’t need to program him to balance our checkbooks. (Don’t have him monitor an interplanetary expedition, though.) The programming effort to get such a system running is non-trivial, though.

Joey:

Thanks for the tip; I’ll see. I wonder if there’s a good online tutorial for Perl.

…so, there’s no e.z. way out, huh? Even with a hypothetical million-buck incentive, no genius programmer can create a language that would help people to help themselves? <-- rhetorical

friedo:

OK, you made your point, so I’ll use your terminology:

I think that my “DO TILL” example was both simple and non-abstract, whereas the respondent’s example, while perhaps simple, was nevertheless abstract.

The larger question (& the point of this thread) is whether any geniuses who have dabbled in many languages (and thus have seen the overall picture) would be capable of sticking to this same non-abstractness in creating a complete language (with all it entails).

And it seems the majority consensus of respondents to this thread is that it’s improbable.

MC:

oops, I goofed. Make that:

Print N", "

(maybe I goofed again ;))

Aramis:

The clarity + thoroughness of your response tells me that you might be a good candidate for creating a user-friendly language. Since, on the one hand, you imply that you have an innate programmer-mindset, and on the other hand your response was so “HLL”, well, how about it? Anyway, thanks! & may I suggest you take up Dynamic AutoDesign 101 next… :wink:

First of all, I wish to thank everyone who took the effort to try & clarify things for me. I don’t mean to be rude (since I started this thread) but since I may be otherwise occupied the next few weeks, I doubt I’ll have time to respond much more within this thread (since it’s time consuming). I may read and post, but not necessarily consistently.

I guess everyone who wishes can continue the discussion/debate with each other. And I hope the problems of all Intelligent Dummies everywhere in the Universe get resolved (sooner rather than later! :wink:)