Explain the concept of "computer language" to me.

Users have different needs: Ada was created for the military, SQL is for database access, COBOL is ‘business oriented’, C was used to write Unix (and most other things since 1970), Java is designed to be machine independent, Brainfuck (really, I’m not making this up, go Google it) is just someone being [Graham Chapman Colonel Voice]very silly[/GCCV]. Just leave a computer scientist some free time and they’ll write a new one.

Er? No, the nearest analogy I can make is that you can “talk” to a Unix or DOS prompt using various incantations and have the machine tell you something back. Probably the commonest type of command/request is “tell me all the JPEG files in this directory”, which in DOS-speak is DIR *.jpg and in Unix-speak is ls -l *.jpg. It’s no good shouting DIR! DIR! DIR! at a Unix box; it doesn’t know what you mean.

Depends who you ask. Most people would admit that Java and C# are distinct improvements on C (and some would say C++ too). But old languages don’t get totally superseded because of all the old code out there.

Yup. C is pretty bad at string manipulation, where Basic and Java are good. However, you can write an operating system in C; you can’t in Basic or COBOL or Java.
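To give a flavour of what “bad at string manipulation” means, here is a minimal C sketch (my own illustration, with made-up variable names) of gluing two strings together. In Basic or Java you would just write something like first + " " + last; in C you have to count the bytes and manage the memory yourself:


#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    const char *first = "Hello";
    const char *last  = "World";

    /* In Java this whole dance is just: String greeting = first + " " + last; */
    size_t len = strlen(first) + 1 + strlen(last) + 1;  /* word + space + word + terminating NUL */
    char *greeting = malloc(len);
    if (greeting == NULL)
        return 1;                 /* out of memory */

    strcpy(greeting, first);      /* copy the first word in */
    strcat(greeting, " ");        /* tack on a space */
    strcat(greeting, last);       /* tack on the second word */

    printf("%s\n", greeting);
    free(greeting);               /* and clean up after ourselves */
    return 0;
}


Get the byte count wrong and the program may crash or silently corrupt memory, which is part of why C is a fine language for operating systems and a clumsy one for text-heavy work.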

For an alternative language that does permit you to write poetry, or any kind of free verse, take a look at Whitespace.

It’s evil. Truly, admirably evil. And it would suit Microsoft far better than C#.

You have it EXACTLY right, there. Altho’ calling it “somewhat nerdy” could be an understatement. :slight_smile:

One of the great thinkers (someone will be along momentarily to remind me who) said:

Which is to say that what a computer does when it “understands” language is not at all what you do. The computer will never understand “blue” like you do (nor will a person blind from birth, or a brain-enhanced dog who sees in black & white).

Actually, it is not completely understood how humans understand anything. The question may even be unanswerable.

It may clear things up - or it may confuse everything - but computers “think” in binary. Everything to them is expressed in long strings of 1s and 0s.

These 1s and 0s represent the answers to long strings of yes/no questions. These yes/no questions are strung together with what is called “Boolean logic”, which is expressing concepts by connecting yes/no questions with AND, OR or NOT. Which questions are being answered by each of the 1s and 0s is a convention, chosen just because we need a standard way of expressing things to the computer that it will expect and can decode.

Computers are electronic. Thus current flows thru them, or not, based on the Boolean logic of each bit of code being a 1 or 0. Current flowing means a “true” condition; current not flowing means “false”.

Thus the computer encounters three bits of data. The Boolean logic of this in “machine code” might say:

IF bit1 = 1 AND
bit2 = 1 AND
bit3 = 1
THEN consider condition1 = true.

Then condition1 (whatever it is) would be true IF all three bits were = 1. The question being controlled by condition1 could be anything. Maybe the question controlled by condition1 is “Should I send this file to the printer?” The file would only be printed if all three bits were set to 1.

Each bit could control a different question as well. Bit1 controls “Is there a printer connected to this computer?” Bit2 controls “Is the printer turned on?” Bit3 controls “Is there paper in the printer?” Therefore all three subconditions need to be true to be able to print.

A crude over-simplification of a “computer language” is a set of conventions where you can refer to the conditions instead of the bits.

Thus the “machine code” above would be expressed in a “computer language” like Java or COBOL as:

IF ThereIsAPrinterConnected not equal true
move true to ThereIsAPrinterConnected.

IF ThePrinterIsTurnedOn not equal true
move true to ThePrinterIsTurnedOn.

IF ThereIsPaperInThePrinter not equal true
move true to ThereIsPaperInThePrinter.

IF ThereIsAPrinterConnected AND ThePrinterIsTurnedOn AND ThereIsPaperInThePrinter
perform PrintRoutine.
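
And just to show that the same Boolean skeleton wears different clothes in different languages, here is roughly the same printer check sketched in C (my own variable and function names, purely for illustration):


#include <stdio.h>
#include <stdbool.h>

/* Hypothetical flags -- in a real program these would be filled in by
   asking the operating system about the printer. */
bool printer_connected = true;
bool printer_turned_on = true;
bool paper_in_printer  = true;

static void print_routine(void)
{
    printf("Sending the file to the printer...\n");
}

int main(void)
{
    /* All three subconditions must be true, exactly as in the
       three-bit machine-code example above. */
    if (printer_connected && printer_turned_on && paper_in_printer)
        print_routine();
    else
        printf("Cannot print.\n");

    return 0;
}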

Computer languages vary widely in how much like English they are. Currently, there are many GUIs, or Graphical User Interfaces, which attempt to make it easier to program in various languages by allowing the programmer to click on icons and thus assemble pre-defined functions into usable programs. To an old mainframe dinosaur like myself, this is similar to using COBOL to say “ADD VARIABLE-A TO VARIABLE-B” instead of a string of “shift left logical, add the contents of register 3 to the contents of register B and store the result in register D” as I used to do in BAL. You are simply making big blocks of code available so you don’t have to rewrite everything from scratch, and to make it easier to remember what the computer expects to be told.
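
To make that COBOL-versus-BAL contrast concrete, here is a rough sketch of my own (in C, not real BAL) of the same addition in a high-level language, with the register-level steps a compiler might generate shown only as comments:


#include <stdio.h>

int main(void)
{
    int variable_a = 2;
    int variable_b = 3;

    /* High-level: one readable statement, like COBOL's "ADD VARIABLE-A TO VARIABLE-B". */
    variable_b = variable_a + variable_b;

    /* Roughly what the machine actually has to be told underneath
       (pseudo-assembly, not any particular instruction set):
         load a register with variable_a
         load another register with variable_b
         add the first register to the second
         store the result back into variable_b */

    printf("VARIABLE-B is now %d\n", variable_b);
    return 0;
}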

Computer geeks such as myself pat ourselves on the back for being smart, but learning computer languages is not as hard as learning a foreign language. It is easier, because the syntax is much more regular than that of a human language. Computers aren’t smart enough to ignore very small changes, or understand from context, or do things that humans do almost without thinking. For a computer, the difference between “variable A” and ‘variable A’ is too great to cope with unless it is explicitly told which is which.

Computers are very stupid, and very fast. Humans are very smart, and very slow. Computer languages are an attempt to meet in the middle.

Regards,
Shodan

::twiddles thumbs::

ping!

Told you!

And I thought Brainfuck was mental. Was Caml also thought up by drunk CS students?

Check out Befunge

Bingo on the last one. As others have already said, the diversity of programming languages comes from the diversity of needs people have had for them. Also, the steady advance of computer performance and resources has encouraged new kinds of languages. (For example, although Java could theoretically have been implemented on an old VAX, it would have been intolerably slow to run, and it might have consumed the machine’s entire memory and disk space just to execute a simple program.)

For some examples of the canonical “Hello World” program written in various programming languages, take a look at these. Some of the languages mentioned there are the popular ones used today, and some are quite obscure or archaic.

To address the question of “nationalities”: nearly all programming languages for professional use have been in English — that is to say, they have taken their keywords from English. There have been some exceptions, however. I know the Russians used to use a teaching language that was essentially Pascal but with the keywords translated into Russian, and with Cyrillic as the character set, of course. A computer scientist, however, would not be fooled and would still call that “Pascal” in all its essence.

Conceivably we might see more foreign-language based programming languages now that Unicode is becoming widely supported. More likely though the dominance of English will continue for quite a while. (The Japanese invented Ruby a few years ago, for example, but they chose traditional English keywords and punctuation for its syntax.)

Almost. It was thought up by French people.

Putting aside the snarkiness (though it is bushels of fun), I happen to like Caml and OCaml, and think they’re excellently designed languages. I’m sure I couldn’t persuade anyone at work to take a look at it, but I’m hoping to use OCaml for some of my own hobby projects, after I’ve learned some more.

Viva la France.

I second that!

I think I’ve more or less got my head round this now, although I might have to let it sink in and peruse some of the links in more detail. I’m actually really liking the idea of this, especially the part about creating languages to suit particular needs and/or make it easier on the programmer. That just seems like it would be fun (if not easy) to play with, and I can really understand how you could get quite into all of that. The philosophical side of the matter, which I am slightly less new to, is an interesting avenue as well, IMO.

Very interesting stuff all round.

For a truly demonic language, :wink: you should look at Malbolge. It is so difficult to program in that the first program for it was not written until two years after the language was created. Here is the code for “HEllO WORld”:


 (=<`$9]7<5YXz7wT.3,+O/o'K%$H"'~D|#z@b=`{^Lx8%$Xmrkpohm-kNi;
 gsedcba`_^]\[ZYXWVUTSRQPONMLKJIHGFEDCBA@?>=<;:9876543s+O

I’d probably compare computer language to something like chefs writing recipes in their shorthand. It’s more or less a set of directions.
If you don’t know what Tsp, Tbsp, or 1/4 cp. is, or the shorthand for sift, stir, saute, bake or 350 degrees, 20 min., you wouldn’t understand the language.
Can you write poetry with it?
You’d have to ask a chef.

I doubt it. Like it or not, there needs to be a lingua franca in programming, and English looks to be it. There would be no point in creating separate languages for each nation, as it would just splinter the programming community. However, there have been notable instances of foreign words being integrated into CS.

Someone mentioned HTML a while ago. It’s pretty simple, but doesn’t do much more than create webpages.


For example, this <i>text</i> right <u><font color="red">here would</font> show</u> up <br> like this:

(BTW, I used vB code to make this post, which is similar to HTML, but much more limited. Click the reply button at the bottom right to see how that works.)

Anyway, HTML, like any language, tells the computer what to do. <i> means to start making text italicized, and </i> means to go back to normal. I think HTML isn’t usually considered a computer language, though technically I think it is.

Well, now you’ve got to get into a discussion of what constitutes poetry. In one sense, poetry is the use of language to convey a concept in a particularly artistic way. This has analogues in the programming world, too. I can write:



if( (p==0 && q!=0) || (q==0 && p!=0))
{
    /* meaning: either p is zero and q is not, or q is zero and p is not */
}


which has a very specific meaning in C, but I have a clever (some might say artistic :)) way of writing the same thing (also in C):



if( !p != !q )
{
    /* meaning: p's zero-ness is different than q's zero-ness */
}


Poetry, if I do say so myself. Note, of course, that this doesn’t mean it’s good code to write. Just as you wouldn’t write a business proposal in obscure language and use elements such as foreshadowing or difficult metaphor, you don’t necessarily want to write your computer programs in a way that other programmers have difficulty understanding, even if the code is particularly clever. But a programmer may be inclined to appreciate the construction for its cleverness, much like a literary scholar appreciates a well-turned phrase, even if it takes a minute to decipher its meaning.

It’s also interesting to consider what’s different about programming languages from natural languages like English. The big difference that sticks out in my mind is that natural languages are very loose, because they’re used by humans, who are intelligent enough to infer meaning when it’s not clear, and to generally deal with the inherent imprecisions. If I say my girl has lips of honey, you know that’s not literally true. If I say that the integer x=1, you can be pretty sure I’m not using a metaphor. Good programming languages try not to allow much ambiguity, because it makes programming harder, and there really isn’t much use in ambiguity to a computer. Of course, the ambiguity in natural languages makes learning and understanding them harder, but we deal with it because we’re pretty smart (whereas computers are not).

And this difference is precisely why I can appreciate a clever piece of code, even if I have a difficult time figuring it out, whereas poetry annoys me. With code, there is generally a precise meaning which can be figured out. With poetry, you never know if you got the author’s point, because of all the ambiguity. Besides, he’s probably dead, so you can’t ask him (if you get stumped on the meaning of some code, run it in a debugger :)).

And by the way, see http://www.brunching.com/adcode.html

In the end, computers are very stupid machines. Their only value comes from doing what they’re told, and allowing humans to tell them to do a lot of things.

You may not be able to write poetry with most computer languages, but a well-written computer program is a work of art. :wink:

In COBOL the following ‘poem’ is a perfectly legal statement:

MOVE CLOSER TO ME

Unfortunately, COBOL is only a little better for writing good programs than it is for writing great poetry.

You mean like this? :D.

A small point. A “programming language” and a “computer language” are two different concepts. Often the latter is used as a sloppy term meaning the former. And both are very different concepts from human language.

The philosophical-level distinction between programming language & human language is that programming languages are all prescriptive, not descriptive. They’re an agreed-upon symbol set for issuing orders to be executed within a pre-defined realm of facts and environment, not for idly discussing or describing anything, much less everything.

Beyond defining terms and then directing manipulations of the terms just defined, programming languages have no reach. They can speak of the meaning of “blue” about as well as your big toe can see. There is no vocabulary, no semantics, no nothing that goes that far.

Compared to the human / colloquial definition of language, that’s a very different, and impoverished, domain.

So what’s a “computer language”? I’d suggest it’s a framework for communication and comprehension between computers, or between parts of an individual computer. Your browser and the SDMB speak particular languages (HTTP, etc.) to each other to get the one machine to do the other’s bidding.

They have mutual comprehension, if you define “comprehension” at the very functional level of “Machine2 did what Machine1 wanted, so apparently Machine2 was able to parse the pseudo-nouns and pseudo-verbs of Machine1’s request, act on it, and formulate a reply that itself was parsable by Machine1 in return. In addition, the reply was (somehow) appropriate, semantically as well as syntactically meaningful.”
Jumping from “comprehend” to “understand” is a whole 'nother kettle of fish.

The term “understand” is very fraught when applied to computers. Much research is ongoing, and I personally side with the folks who hold that fancy enough computers can think in the same way that you can, and that although they don’t yet exist, we may well get there in my lifetime.

Whatever philosophical position one takes on machine thinking, it remains that programming languages are a different class of thing from human languages.

Many a would-be philosopher has gotten derailed at the beginning by various people using the same word for vastly different concepts. Locking down your terminology and its definitions is key to any clear thinking on complex issues.

It’s worth mentioning that there are two different views as to the purpose of a programming language. One holds that a programming language exists so that a human may tell a computer what it should do. The other holds that a programming language is for one human to tell other humans what he wants the computer to do.