The Straight Dope


  #1  
Old 06-10-2004, 05:14 AM
PookahMacPhellimey PookahMacPhellimey is offline
Guest
 
Join Date: Jul 2001
Explain the concept of "computer language" to me.

Whenever someone mentions they can "speak" this computer language or that, I've always scanned over it and filed it away under "clever computery things". I have absolutely no idea at all what they are or what they do.

I mean, in what way exactly are they languages? Are there words? Grammar? Poems? Or is "language" just jargon here?

To whom or what do they speak? I mean, if I speak Dutch I can speak to the Dutch (and Belgians and maybe South Africans, but let's keep it simple). Computer language allows you to speak to, er... computers? How? What are they saying? Or are you just ordering them around?

Also, why are there so many different computer languages? Do computers have different nationalities? Or is one language newer, better and improved than others, or better suited to certain topics?

Okay, I know I'm a bit silly here and I wouldn't expect anyone to address these "questions" one by one. However, I would really like to find out what this is all about. So could anyone explain the basic concept in layman's/idiot's terms?
Reply With Quote
  #2  
Old 06-10-2004, 05:24 AM
Skeptico Skeptico is offline
Guest
 
Join Date: Jul 2002
Quote:
Originally Posted by PookahMacPhellimey
I mean, in what way exactly are they languages? Are there words? Grammar? Poems? Or is "language" just jargon here?
Yes, yes, yes. They are considered "languages" in just the way you're thinking. Computer languages are classified as "context-free" by linguists, whereas human languages are in an even more complex category, I believe.

Quote:
To whom or what do they speak? I mean, if I speak Dutch I can speak to the Dutch (and Belgians and maybe South Africans, but let's keep it simple). Computer language allows you to speak to, er... computers? How? What are they saying? Or are you just ordering them around?
Yes, we use computer languages to boss them around. There was a group of BASIC (that's an old language) programmers once who thought it'd be funny to use only BASIC to order each other around at a convention. I think they mostly succeeded.

Quote:
Also, why are there so many different computer languages? Do computers have different nationalities? Or is one language newer, better and improved than others, or better suited to certain topics?
Languages are like cars, as the metaphor goes. Sometimes you need a fast car, like a Ferrari. But sometimes you need a dump truck, which is slower but can carry things in bulk. Different tools for different jobs.
Reply With Quote
  #3  
Old 06-10-2004, 05:27 AM
Colophon Colophon is offline
Guest
 
Join Date: Sep 2002
I do not "speak" any computer languages, but I think I understand what they are and so I'll jump in before the programmers come in, as I might be able to explain it better, layman to layman.

Yes, they're for ordering computers around. A computer on its own just sits there - to make it do useful stuff you have to tell it what to do. And, computers being dumb, you can't tell them what to do in English. Instead you have to use a defined set of instructions. It's no good saying "When someone clicks on the "Submit Reply" button I want you to take all the text they've entered in the box and submit it to the server (or eat it, as the case may be)"; you need to use computer code, with lots of mystical brackets and suchlike.

Yes, there are words, and grammar - in fact the grammar is explicitly set to allow no room for ambiguity, and from my early tinkering with Sinclair BASIC I know it will result in plenty of error messages if you don't get it right.

I'm sure someone will be able to tell you more about them....
Reply With Quote
  #4  
Old 06-10-2004, 05:27 AM
Capt. Ridley's Shooting Party Capt. Ridley's Shooting Party is offline
Guest
 
Join Date: Jul 2003
There are all sorts of computer languages. Basically, a computer language is a way for a programmer to instruct the computer to perform some task. Some are highly specialised, like SQL, while some are general purpose, like C++.

In the beginning, programs were written by manually flicking switches on the front of the computer. This was a pain, so simple assembly languages were invented. These take the form of simple three or four letter mnemonics with some operands. This was better, but still a bit of a pain to write in (although some people still do: Chris Sawyer, author of RollerCoaster Tycoon among others, writes all his games in assembly language).

Next came even higher level languages. These were starting to look a bit like natural language. You got languages like Ada, Fortran, C etc. These are still widely used (especially C), but many see them as unsuitable for creating huge programs. Object oriented languages were then developed, Simula being the first (I think?). Examples include Java, C++, Smalltalk etc. These attempt to map computer code onto objects in the real world. For example, you can define a dog class with some state (say age 15, heading north, colour brown) and provide means of altering that state.
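
To make that concrete, here is a minimal sketch of such a dog class in C++ (the names and details are invented purely for illustration): the object bundles some state together with the means of altering it.

Code:
#include <iostream>
#include <string>

// A Dog object bundles some state (age, heading, colour)
// with the means of altering that state.
class Dog
{
public:
    Dog(int age, std::string heading, std::string colour)
        : age(age), heading(heading), colour(colour) {}

    void haveBirthday() { age = age + 1; }
    void turnTowards(const std::string& newHeading) { heading = newHeading; }

    void describe() const
    {
        std::cout << "A " << colour << " dog, age " << age
                  << ", heading " << heading << std::endl;
    }

private:
    int age;
    std::string heading;
    std::string colour;
};

int main()
{
    Dog rex(15, "north", "brown");   // age 15, heading north, colour brown
    rex.describe();
    rex.haveBirthday();              // alter the state...
    rex.turnTowards("south");
    rex.describe();                  // ...and see that it changed
    return 0;
}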

This is a highly simplified history of programming languages, and it leaves out many important ones, too. Generally, languages are broken down into families, especially the higher level ones. I've already mentioned object oriented languages, but there are also declarative languages like Prolog and functional languages like Lisp.

Here's a short example of some code (C++) that prints "Hello World!" to the screen:

Code:
#include <iostream>
using namespace std;

int main(int argc, char** argv)
{
    cout << "Hello World!" << endl;
    return 0;
}
HTH
Reply With Quote
  #5  
Old 06-10-2004, 05:30 AM
Capt. Ridley's Shooting Party Capt. Ridley's Shooting Party is offline
Guest
 
Join Date: Jul 2003
Quote:
Yes, we use computer languages to boss them around. There was a group of BASIC (that's an old language) programmers once who thought it'd be funny to use only BASIC to order each other around at a convention. I think they mostly succeeded.
Unfortunately, this sort of geeky nonsense ends up giving outsiders the impression that everyone studying CS is a social misfit. But that's a Pit thread.
Reply With Quote
  #6  
Old 06-10-2004, 05:50 AM
Small Clanger Small Clanger is offline
Guest
 
Join Date: Oct 2003
Ultimately, digital computers speak in binary: this sort of thing, 0101010111010.
Since telling the machine what to do in binary gets old fast (it really did used to be done that way in the early days), we write programs in what are called higher level languages, which are converted into machine code/binary by other programs.

The lowest level high level language is called assembly, and it looks like this:

LD 01 A
MOV FF B
ADD A, B

What's going on here is a set of very specific instructions to the computer: yes, we are ordering it around.

LD 01 A -- load value 01 into register A
MOV FF B -- move a value from location FF to register B
ADD A, B -- add the values in registers A and B (result ends up in A)

(forgive me assembly people if this is rubbish I'm just showing what it looks like).

Writing a flight simulator (or a payroll program, for some of us) would take a very long time in pigeon steps like this. A higher level language (C/C++/Java/C#) lets you write something a bit more human-readable, like:
x = y + z;

To a non-programmer, code in any of the four curly-bracket languages I listed will probably look like the same type of gibberish. Some of the 'grammar' is very similar, and a programmer who knows C++ (poor bastard) will have no trouble reading C, Java or C#. COBOL is a different story.

Another type of language is less like bossing the computer around and more like asking questions. SQL, Structured Query Language (pronounced "sequel", sometimes), is the primary database language and consists mostly of requests for information along the lines of "Tell me everyone who is working on the Death Ray project", which would look like:

SELECT INITS, SNAME
FROM EMPLOYEE_TABLE
WHERE PROJECT = 'DEATH RAY';


I'm tired after all that. Someone want to take over with HTML? Unix shells? OOP!

Oh look people have beaten me to it, but after typing that lot I'm jolly well gonna post.
Reply With Quote
  #7  
Old 06-10-2004, 05:51 AM
PookahMacPhellimey PookahMacPhellimey is offline
Guest
 
Join Date: Jul 2001
Thanks all, so far. Again, forgive me my strange line of questioning; I'm just trying to get my head around this without having the background, or possibly the brain type, for it.

Would it be right to say that what stops you from composing poetry in a computer language is not so much the language as the "audience"? This is because the audience is a computer and it won't appreciate it. I guess certain computer languages would be better suited to it than others, but such a language could surely be written. However, to write poetry in it would only make sense between two programmers, which would indeed be somewhat nerdy.

But have I got it right, there?

I think what stopped me from grasping the concept before is that I had the concept of "language" tied up with human-type understanding. Of course, a computer understands your language perfectly (provided you do it correctly), as it will act according to your command, but that's not the same as me understanding the colour "blue" or similar (although I heard people argue it IS the same when I was studying philosophy).

Sorry to wax all philosophical there. I think I will broaden my definition of "language" to everything that "acts language-like", and then computer languages won't be problematic. Or am I completely barking up the wrong tree?
Reply With Quote
  #8  
Old 06-10-2004, 05:54 AM
counsel wolf counsel wolf is offline
Guest
 
Join Date: Mar 2004
Quote:
Originally Posted by PookahMacPhellimey
Whenever someone mentions they can "speak" this computer language or that, I've always scanned over it and filed it away under "clever computery things". I have absolutely no idea at all what they are or what they do.

I mean, in what way exactly are they languages? Are there words? Grammar? Poems? Or is "language" just jargon here?

To whom or what do they speak? I mean, if I speak Dutch I can speak to the Dutch (and Belgians and maybe South Africans, but let's keep it simple). Computer language allows you to speak to, er... computers? How? What are they saying? Or are you just ordering them around?

Also, why are there so many different computer languages? Do computers have different nationalities? Or is one language newer, better and improved than others, or better suited to certain topics?

Okay, I know I'm a bit silly here and I wouldn't expect anyone to address these "questions" one by one. However, I would really like to find out what this is all about. So could anyone explain the basic concept in layman's/idiot's terms?
There are some good questions in here; the answers will take a bit of background setup, so bear with me. Note also that I'm not the definitive source for this info; take what I say as illustrative rather than exact.

To start with, computers produce their results by following a set of instructions which effectively boil down to putting and pulling numbers into various storage areas and performing operations on those numbers. The instructions on how to perform these operations are hard-wired into the computer (i.e. they are not part of the operating system but part of the chip; Intel, AMD, IBM, Motorola and so on are responsible for this portion of the work). These instructions are basically numbers themselves: you ask the chip to perform operation 12 on area 5 and area 6 and put the result in area 7, for example. Given that a computer's native number format is binary, this instruction would end up as a sequence of 1s and 0s. This is what the computer understands natively, and it is known as 'machine code'. As an aside, not all machine codes are the same; Mac and PC hardware differ at this level, for example.

Now machine code is a rather difficult thing for humans to read and write, especially in a context like computer programming where people will often have to re-write portions of their code as bugs are ironed out. The creation of computer languages is an effort to reach a compromise between the rigidity of expression required by the computer and the more intuitive understanding and working of humans.

The first generation of languages was assembly, which replaced the numbers of the instructions and areas with (vaguely) human readable abbreviations. This aids readability but there still exists a one-to-one mapping between the machine code and assembly instructions (I'm not 100% on this, but fairly close). Thus even though humans could now read the code more easily it didn't allow them to abstract program flow to a level where complex ideas could be presented simply.

The next step is to introduce a compiler. What a compiler does is take statements that look nothing like machine code and 'compile' them into the appropriate machine code. Thus instead of telling the machine what I want it to do by writing machine or assembly code by hand, I can instead write something that uses human-level concepts like strings ("Hello World", for example) rather than a series of 1s and 0s that just happens to represent "Hello World" (which I'd probably have to learn). This breakthrough allows several things:

1: More people can use computers. The barrier to learning machine language is quite a lot higher than a compiled language.
2: Programs become shorter and more readable. With the introduction of compilers, higher level concepts (loops, for example) can be presented in a very concise fashion (see the small sketch after this list).
3: You can use the same program on different hardware by changing the compiler rather than changing the code. Thus today it is perfectly possible to run the same code on a Mac as on a PC for simple examples (complexities arise from a number of sources, but simple code will work).
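
As a small illustration of point 2 (a hypothetical sketch, with C++ chosen arbitrarily): printing the numbers 1 to 100 is one short loop, and the compiler supplies all the underlying load/add/compare/jump machine instructions for you.

Code:
#include <iostream>

int main()
{
    // One concise high-level loop; in assembly or machine code you would
    // have to spell out the counting, comparing and jumping yourself.
    for (int i = 1; i <= 100; i++)
        std::cout << i << std::endl;
    return 0;
}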

These advantages vastly improved the ease with which humans and computers can interact. There was another upshot, though: the number of languages in which you could write a program multiplied at a huge rate. These languages differ in how you structure the code that will eventually become machine code (and in some cases, the route it takes to get there). The differences arise for several reasons, including advances in our understanding of how best to represent the real world in computer code, proprietary ownership, and simple differences of opinion between academics or commercial enterprises.

I realise I haven't covered all of your questions here, but I hope this serves at least as a starting point for your understanding.
Reply With Quote
  #9  
Old 06-10-2004, 06:00 AM
Small Clanger Small Clanger is offline
Guest
 
Join Date: Oct 2003
Quote:
Originally Posted by PookahMacPhellimey
However, to write poetry in it would only make sense between two programmers, which would indeed be somewhat nerdy. . .
#define QUESTION ( 2*b || !2*b )







what?
Reply With Quote
  #10  
Old 06-10-2004, 06:02 AM
Capt. Ridley's Shooting Party Capt. Ridley's Shooting Party is offline
Guest
 
Join Date: Jul 2003
Quote:
Originally Posted by PookahMacPhellimey
Thanks all, so far. Again, forgive me my strange line of questioning; I'm just trying to get my head around this without having the background, or possibly the brain type, for it.

Would it be right to say that what stops you from composing poetry in a computer language is not so much the language as the "audience"? This is because the audience is a computer and it won't appreciate it. I guess certain computer languages would be better suited to it than others, but such a language could surely be written. However, to write poetry in it would only make sense between two programmers, which would indeed be somewhat nerdy.

But have I got it right, there?

I think what stopped me from grasping the concept before is that I had the concept of "language" tied up with human-type understanding. Of course, a computer understands your language perfectly (provided you do it correctly), as it will act according to your command, but that's not the same as me understanding the colour "blue" or similar (although I heard people argue it IS the same when I was studying philosophy).

Sorry to wax all philosophical there. I think I will broaden my definition of "language" to everything that "acts language-like", and then computer languages won't be problematic. Or am I completely barking up the wrong tree?
Poetry in a programming language is coming up with an exceedingly elegant method of performing some task.
Reply With Quote
  #11  
Old 06-10-2004, 06:11 AM
Skeptico Skeptico is offline
Guest
 
Join Date: Jul 2002
Quote:
Would it be right to say that what stops you from composing poetry in a computer language is not so much the language as the "audience"? This is because the audience is a computer and it won't appreciate it. I guess certain computer languages would be better suited to it than others, but such a language could surely be written. However, to write poetry in it would only make sense between two programmers, which would indeed be somewhat nerdy.
"Computer poetry" is perfectly legit, albeit the audience is a bit limited.
2nd PERL Poetry competition
International Obfuscated C Code Contest

There was one competition which even had a Haiku sub-category, and you could only use so many words or lines or somesuch. You're basically right in your intuition.
Reply With Quote
  #12  
Old 06-10-2004, 06:13 AM
NutMagnet NutMagnet is offline
Guest
 
Join Date: Sep 2001
Pookah
Quote:
but that's not the same as me understanding the colour "blue" or similar
In some languages, you are able to define what "blue" is, then you're allowed to use it. For instance, you might want to store a value. You would define the characteristics (size, numeric or character, etc.) of the storage item (or field). Only then would you be able to reference it.

I'm not sure if this is exactly what you meant in your quote, but it's a primitive example. A concept like "blue" is difficult to define even in human language.
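
For example, a hypothetical sketch in C++ (names invented): you define the characteristics of the storage items first, and only then may you refer to them.

Code:
#include <iostream>
#include <string>

int main()
{
    // Define the storage items and their characteristics first...
    std::string blue = "the colour of a clear sky";   // a piece of text
    int blueness = 75;                                // a whole number

    // ...and only then reference them.
    std::cout << blue << " (" << blueness << "% blue)" << std::endl;
    return 0;
}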
Reply With Quote
  #13  
Old 06-10-2004, 06:31 AM
Mangetout Mangetout is online now
Charter Member
 
Join Date: May 2001
Location: England (where it rains)
Posts: 51,289
In the majority of cases, what we call computer languages aren't actually the language spoken by the hardware of the computer itself; warning - vast oversimplifications ahead:

At the most basic level, the language instructions that the actual hardware of a computer understands consist of things equivalent to:
-Memorise the number following this instruction
-Add one memorised number to another
-Compare one memorised number against another, if they are the same, set a 'flag'
-Jump to another place in the series of instructions
-Jump to another place in the series of instructions if the 'flag' is set
-Move the value stored in a specified location to another specified location

And so on - these instructions aren't 'understood' in words by the computer, but as numbers - instruction number 01 might mean "Memorise the number following this instruction", so passing the two numbers 01,99 to the processor makes it memorise the number 99.

That's Machine Code and the actual instruction sets vary from one processor type to another.

To make Machine Code a little easier to use, the numeric instructions often have short names (this is called Assembly Language), so rather than having to remember that 01 means 'memorise', the programmer can (in a hypothetical assembly language) write 'mem,99'. When he wants to run the program, the names are converted into numbers by an Assembler program first (this is a simple, literal one-to-one translation).

Writing large programs in assembly language is painful, so there are easier ways and this is where we get to compiled languages:
Compiled languages consist of commands that are more like human language, such as:
-Repeat the following set of instructions for a specified number of times
-Repeat the following set of instructions until a specified condition is encountered
-Remember a numeric or other value by assigning it a name
-Display something on the screen
-Get some input from the user

For each of these (much more general purpose) instructions, the compiler (which is a program itself, consisting of machine code) has a pre-made machine code translation, so the instruction 'get some input from the user' might actually translate into quite a long and complex series of machine code instructions, saving a lot of effort on the part of the programmer.
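
As a rough sketch (in C++, not any particular compiler's output), each of these one-line statements stands in for a long series of machine code instructions that the compiler supplies on your behalf:

Code:
#include <iostream>

int main()
{
    int number = 0;

    std::cout << "Type a number: ";   // "display something on the screen"
    std::cin >> number;               // "get some input from the user"

    // Behind each line above sits a pre-made machine code translation
    // far longer than anything we had to write ourselves.
    std::cout << "You typed " << number << std::endl;
    return 0;
}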

There are also Interpreted languages, which work in a similar way, but instead of doing all the translation in advance, they do it as the program runs (in fact many 'compiled' languages still do interpretation at runtime and many 'interpreted' languages do some prior compilation).

But they aren't conversational languages, just means of instructing the machinery to do certain things, perhaps analogous to a set of travel directions; Proceed along route A, when you come to point B, turn left etc.
Reply With Quote
  #14  
Old 06-10-2004, 07:06 AM
ccwaterback ccwaterback is offline
Guest
 
Join Date: Jul 2003
You can probably figure out what these sentences written in SQL computer language are doing.

Connect to Big-Telephone-Directory-Database;
Select Telephone-Number from Telephone-Directory-Table where Name="PookahMacPhellimey";
Print Telephone-Number;

After these sentences are run through compilers, interpreters, linkers, loaders and other such friendly utilities, they are transformed (eventually) into a series of 1's and 0's that the computer will understand as instructions to execute your request. The three sentences you wrote will eventually be transformed into a string of millions, perhaps billions of 1's and 0's. So you can see, if you want to construct the series of 1's and 0's that instruct a computer to do more than just make an LED flicker on and off, it's nice to have friends called computer languages to do your work for you.
Reply With Quote
  #15  
Old 06-10-2004, 07:51 AM
PookahMacPhellimey PookahMacPhellimey is offline
Guest
 
Join Date: Jul 2001
Okay, I think you are collectively getting through to my compu-ignorant brain. What I'm getting now is that compiled languages are a bit like a shorthand that instructs the computer to do all the basic stuff itself. A small bit similar to the use of "etc.". As in, me saying to you "1, 2, 3, 4, etc." and you then knowing you are to proceed in a similar fashion, rather than me having to stand there and recite numbers for hours on end.

I must sound like a right idiot to all you programmer people. Thanks for bearing with me on this.

NutMagnet, what I was trying to get at there was the difference between a human's understanding of a concept and a computer's. I think that's a great debate in itself, but I find it really interesting. The reason I brought it up was that it was what initially blocked me from understanding the concept of a computer language: the fact that a computer does not "understand" a word the way humans do.

I'm beginning to sound a bit like Data.
Reply With Quote
  #16  
Old 06-10-2004, 07:57 AM
Capt. Ridley's Shooting Party Capt. Ridley's Shooting Party is offline
Guest
 
Join Date: Jul 2003
A compiled language is just a high level language which gets reduced to another lower level language.

For example, Java code gets converted into Java bytecode when compiled, which, on modern Java platforms, gets compiled yet again at run time into the specific instructions needed for individual processors (a process known as Just-In-Time compilation).

Suppose I define a method like so, in Java, to add two numbers:

Code:
public static int add(int a, int b)
{
    return a + b;
}
Then, after compilation, the Java bytecode emitted is something along these lines (roughly; I don't know JBC by heart, so treat it as an approximation):

Code:
add:
  iload_0    // push the first int argument (a) onto the stack
  iload_1    // push the second int argument (b)
  iadd       // pop both, add them, push the result
  ireturn    // return the int on top of the stack
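(If you have a JDK installed, running javap -c on the compiled class will show the real bytecode for comparison.)
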
Ultimately, the aim is to get a high level language, which is easily readable by a human, into a form that a computer can understand. Whether this takes one sweep of a compiler, or multiple ones in the case of Java (this was an intentional decision, though, not a design flaw) depends upon the language and platform you are using.
Reply With Quote
  #17  
Old 06-10-2004, 08:25 AM
counsel wolf counsel wolf is offline
Guest
 
Join Date: Mar 2004
Quote:
Originally Posted by PookahMacPhellimey
I must sound like a right idiot to all you programmer people. Thanks for bearing with me on this.
If you're willing to learn, I for one am more than willing to explain. Don't put yourself down just because you've not spent any time learning what are quite abstruse concepts.


Quote:
Originally Posted by PookahMacPhellimey
NutMagnet, what I was trying to get at there was the difference between a human's understanding of a concept and a computer's. I think that's a great debate in itself, but I find it really interesting. The reason I brought it up was that it was what initially blocked me from understanding the concept of a computer language: the fact that a computer does not "understand" a word the way humans do.

I'm beginning to sound a bit like Data.
'Understanding' is a difficult enough concept without introducing computers into it. Again, this is drifting a bit from General Questions, but you might find John Searle's Chinese Room thought experiment interesting to think about.
Reply With Quote
  #18  
Old 06-10-2004, 08:30 AM
PookahMacPhellimey PookahMacPhellimey is offline
Guest
 
Join Date: Jul 2001
Counsel wolf: yes, that's exactly the kind of thing I meant! I actually know the thought experiment. I studied philosophy and this came up in philosophy of mind and/or philosophy of language. It's been years since I've really considered it, though, but I will refresh my knowledge on the website you gave.
Reply With Quote
  #19  
Old 06-10-2004, 08:42 AM
cantara cantara is offline
Guest
 
Join Date: Nov 1999
Excellent discussion, history and descriptions everyone! It sounds like PookahMacPhellimey has got it! One thing to keep in mind is that the computer only takes orders and executes instructions; it does not 'think'. You can program it to monitor a number of conditions and execute instructions based on those values, but that is about it. With a very complex system it can appear to be thinking, and I suppose you could argue that the decisions it is making amount to thinking...

Also, programs try to be as succinct as possible. You want to be as terse and specific as you can; it makes it easier for the computer to figure out what you want to do and lets you fit more instructions into a given program space.

The distinction I want to make is that you could write out a set of instructions to make a peanut butter and jam sandwich to tell someone how to make it, or you could write a poem to describe the sandwich and various aspects of its appearance, texture, flavour, etc. The poem is not particularly helpful in getting from ingredients to final product, but it sure may be more elegant. A lot more interpretation is needed to figure out what the poem is trying to say, rather than 'open the peanut butter jar'.
Reply With Quote
  #20  
Old 06-10-2004, 08:46 AM
II Gyan II II Gyan II is offline
Guest
 
Join Date: Dec 2002
If you have the time and inclination, read this book: Code: The Hidden Language of Computer Hardware and Software.
Reply With Quote
  #21  
Old 06-10-2004, 08:54 AM
Small Clanger Small Clanger is offline
Guest
 
Join Date: Oct 2003
Quote:
Originally Posted by PookahMacPhellimey
...why are there so many different computer languages?
Users have different needs: Ada was created for the military, SQL is for database access, COBOL is 'business oriented', C was used to write Unix (and most other things since 1970), Java is designed to be machine independent, and Brainfuck (really, I'm not making this up, go Google it) is just someone being [Graham Chapman Colonel Voice]very silly[/GCCV]. Just leave a computer scientist some free time and they'll write a new one.


Quote:
Do computers have different nationalities?
Er? No. The nearest analogy I can make is that you can "talk" to a Unix or DOS prompt using various incantations and have the machine tell you something back. Probably the commonest type of command/request is "tell me all the JPEG files in this directory", which in DOS-speak is DIR *.jpg and in Unix-speak is ls -l *.jpg. It's no good shouting DIR! DIR! DIR! at a Unix box; it doesn't know what you mean.


Quote:
Or is one language newer, better and improved than others
Depends who you ask. Most people would admit that Java and C# are distinct improvements on C (and some would say C++ too). But old languages don't get totally superseded, because of all the old code out there.


Quote:
or better suited to certain topics?
Yup. C is pretty bad at string manipulation, where BASIC and Java are good. However, you can write an operating system in C; you can't in BASIC or COBOL or Java.
Reply With Quote
  #22  
Old 06-10-2004, 09:11 AM
Bytegeist Bytegeist is offline
Guest
 
Join Date: Jul 2003
For an alternative language that does permit you to write poetry, or any kind of free verse, take a look at Whitespace.

It's evil. Truly, admirably evil. And it would suit Microsoft far better than C#.
Reply With Quote
  #23  
Old 06-10-2004, 09:24 AM
NoCoolUserName NoCoolUserName is offline
Guest
 
Join Date: Feb 2003
Quote:
Originally Posted by PookahMacPhellimey
However, to write poetry in it would only make sense between two programmers, which would indeed be somewhat nerdy.

But have I got it right, there?
You have it EXACTLY right, there. Altho' calling it "somewhat nerdy" could be an understatement.
Quote:
Originally Posted by PookahMacPhellimey
I think what stopped me from grasping the concept before is that I had the concept of "language" tied up with human-type understanding. Of course, a computer understands your language perfectly (provided you do it correctly), as it will act according to your command, but that's not the same as me understanding the colour "blue" or similar (although I heard people argue it IS the same when I was studying philosophy).
One of the great thinkers (someone will be along momentarily to remind me who) said:
Quote:
The question of whether a computer can think is neither more nor less interesting than the question of whether a submarine can swim.
Which is to say that what a computer does when it "understands" language is not at all what you do. The computer will never understand "blue" like you do (nor will a person blind from birth, or a brain-enhanced dog who sees in black & white).
Reply With Quote
  #24  
Old 06-10-2004, 09:25 AM
Shodan Shodan is online now
Charter Member
 
Join Date: Jul 2000
Location: Milky Way Galaxy
Posts: 25,517
Quote:
Originally Posted by PookahMacPhellimey
NutMagnet, what I was trying to get at there was the difference between a human's understanding of a concept and a computer's. I think that's a great debate in itself, but I find it really interesting. The reason I brought it up was that it was what initially blocked me from understanding the concept of a computer language: the fact that a computer does not "understand" a word the way humans do.
Actually, it is not completely understood how humans understand anything. The question may even be unanswerable.

It may clear things up - or it may confuse everything - but computers "think" in binary. Everything to them is expressed in long strings of 1s and 0s.

These 1s and 0s represent the answers to long strings of yes/no questions. These yes/no questions are strung together with what is called "Boolean logic", which is expressing concepts by connecting yes/no questions with AND, OR or NOT. Which questions are being answered by each of the 1s and 0s is a convention, chosen just because we need a standard way of expressing things to the computer that it will expect and can decode.

Computers are electronic. Thus current flows thru them, or not, based on the Boolean logic of each bit of code being a 1 or 0. Current flowing means a "true" condition; current not flowing means "false".

Say the computer encounters three bits of data. The Boolean logic of this in "machine code" might say:

IF bit1 = 1 AND bit2 = 1 AND bit3 = 1 THEN consider condition1 = true.

Then condition1 (whatever it is) would be true IF all three bits were = 1. The question being controlled by condition1 could be anything. Maybe the question controlled by condition1 is "Should I send this file to the printer?" The file would only be printed if all three bits were set to 1.

Each bit could control a different question as well. Bit1 controls "Is there a printer connected to this computer?" Bit2 controls "Is the printer turned on?" Bit3 controls "Is there paper in the printer?" Therefore all three subconditions need to be true to be able to print.

A crude over-simplification of a "computer language" is a set of conventions where you can refer to the conditions instead of the bits.

Thus the "machine code" above would be expressed in a "computer language" like Java or COBOL as:

IF ThereIsAPrinterConnected not equal true
move true to ThereIsAPrinterConnected.

IF ThePrinterIsTurnedOn not equal true
move true to ThePrinterIsTurnedOn.

IF ThereIsPaperInThePrinter not equal true
move true to ThereIsPaperInThePrinter.

IF ThereIsAPrinterConnected AND ThePrinterIsTurnedOn AND ThereIsPaperInThePrinter
perform PrintRoutine.
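
The same idea sketched in a C-family language (hypothetical names, not anybody's real printer code) looks like this:

Code:
#include <iostream>

int main()
{
    // Each condition stands in for one of the bits above.
    bool printerConnected = true;
    bool printerTurnedOn  = true;
    bool paperInPrinter   = true;

    // All three sub-conditions must be true before we "print".
    if (printerConnected && printerTurnedOn && paperInPrinter)
        std::cout << "Sending the file to the printer" << std::endl;
    else
        std::cout << "Can't print" << std::endl;

    return 0;
}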

Computer languages vary widely in how much like English they are. Currently, there are many GUIs, or Graphical User Interfaces, which attempt to make it easier to program in various languages by allowing the programmer to click on icons and thus assemble pre-defined functions into usable programs. To an old mainframe dinosaur like myself, this is similar to using COBOL to say "ADD VARIABLE-A TO VARIABLE-B" instead of a string of "shift left logical, add the contents of register 3 to the contents of register B and store the result in register D" as I used to do in BAL. You are simply making big blocks of code available so you don't have to rewrite everything from scratch, and to make it easier to remember what the computer expects to be told.

Computer geeks such as myself pat themselves on the back for being smart, but learning computer languages is not as hard as learning a foreign language. It is easier, because the syntax is much more regular than for human language. Computers aren't smart enough to ignore very small changes, or understand from context, or do things that humans do almost without thinking. For computers, the distinction between "variable A" and 'variable A' is too great to understand without being explicitly instructed as to which is which.

Computers are very stupid, and very fast. Humans are very smart, and very slow. Computer languages are an attempt to meet in the middle.

Regards,
Shodan
Reply With Quote
  #25  
Old 06-10-2004, 09:28 AM
Small Clanger Small Clanger is offline
Guest
 
Join Date: Oct 2003
Quote:
Originally Posted by Me
Just leave a computer scientist some free time and they'll write a new one
::twiddles thumbs::

ping!
Quote:
Originally Posted by Bytegeist
. . .take a look at Whitespace.
Told you!

And I thought Brainfuck was mental. Was Caml also thought up by drunk CS students?
Reply With Quote
  #26  
Old 06-10-2004, 09:30 AM
Mangetout Mangetout is online now
Charter Member
 
Join Date: May 2001
Location: England (where it rains)
Posts: 51,289
Quote:
Originally Posted by Bytegeist
For an alternative language that does permit you to write poetry, or any kind of free verse, take a look at Whitespace.

It's evil. Truly, admirably evil. And it would suit Microsoft far better than C#.
Check out Befunge
Reply With Quote
  #27  
Old 06-10-2004, 10:03 AM
Bytegeist Bytegeist is offline
Guest
 
Join Date: Jul 2003
Quote:
Originally Posted by PookahMacPhellimey
Also, why are there so many different computer languages? Do computers have different nationalities? Or is one language newer, better and improved than others, or better suited to certain topics?
Bingo on the last one. As others have already said, the diversity of programming languages comes from the diversity of needs people have had for them. Also, the steady advance of computer performance and resources has encouraged new kinds of languages. (For example, although Java could theoretically have been implemented on an old VAX, it would have been intolerably slow to run, and it might have consumed the machine's entire memory and disk space just to execute a simple program.)

For some examples of the canonical "Hello World" program written in various programming languages, take a look at these. Some of the languages mentioned there are the popular ones used today, and some are quite obscure or archaic.

To address the question of "nationalities": nearly all programming languages for professional use have been in English; that is to say, they have taken their keywords from English. There have been some exceptions, however. I know the Russians used to use a teaching language that was essentially Pascal but with the keywords translated into Russian, and with Cyrillic as the character set of course. A computer scientist, however, would not be fooled and would still call that "Pascal" in all its essence.

Conceivably we might see more foreign-language based programming languages now that Unicode is becoming widely supported. More likely though the dominance of English will continue for quite a while. (The Japanese invented Ruby a few years ago, for example, but they chose traditional English keywords and punctuation for its syntax.)
Reply With Quote
  #28  
Old 06-10-2004, 10:16 AM
Bytegeist Bytegeist is offline
Guest
 
Join Date: Jul 2003
Quote:
Originally Posted by Small Clanger
Was Caml also thought up by drunk CS students?
Almost. It was thought up by French people.

Putting aside the snarkiness (though it is bushels of fun), I happen to like Caml and OCaml, and think they're excellently designed languages. I'm sure I couldn't persuade anyone at work to take a look at it, but I'm hoping to use OCaml for some of my own hobby projects, after I've learned some more.

Vive la France.
Reply With Quote
  #29  
Old 06-10-2004, 10:31 AM
PookahMacPhellimey PookahMacPhellimey is offline
Guest
 
Join Date: Jul 2001
Quote:
Originally Posted by cantara
Excellent discussion, history and descriptions everyone!
I second that!

I think I've more or less got my head round this now, although I might have to let it sink in and peruse some of the links in more detail. I'm actually really liking the idea of this, especially the part about creating languages to suit particular needs and/or make things easier on the programmer. That just seems like it would be fun (if not easy) to play with, and I really understand how you could get quite into all of that. The philosophical side of the matter, which I am slightly less new to, is an interesting avenue as well, IMO.

Very interesting stuff all round.
Reply With Quote
  #30  
Old 06-10-2004, 12:16 PM
RandomLetters RandomLetters is offline
Guest
 
Join Date: Feb 2004
Quote:
Originally Posted by Bytegeist
For an alternative language that does permit you to write poetry, or any kind of free verse, take a look at Whitespace.

It's evil. Truly, admirably evil. And it would suit Microsoft far better than C#.

For a truly demonic language, you should look at Malbolge. It is so difficult to program in that the first program created for it was written two years after the language was created. Here is the code for "HEllO WORld":


Code:
 (=<`$9]7<5YXz7wT.3,+O/o'K%$H"'~D|#z@b=`{^Lx8%$Xmrkpohm-kNi;
 gsedcba`_^]\\[ZYXWVUTSRQPONMLKJIHGFEDCBA@?>=<;:9876543s+O
Reply With Quote
  #31  
Old 06-10-2004, 01:28 PM
Hampshire Hampshire is offline
Member
 
Join Date: Jan 2003
Location: Minneapolis
Posts: 9,434
I'd probably compare computer language to something like chefs writing recipes in their shorthand. It's more or less a set of directions.
If you don't know what tsp, Tbsp, or 1/4 cp. means, or the shorthand for sift, stir, sauté, bake, or 350 degrees for 20 min., you wouldn't understand the language.
Can you write poetry with it?
You'd have to ask a chef.
Reply With Quote
  #32  
Old 06-10-2004, 11:46 PM
Shalmanese Shalmanese is offline
Charter Member
 
Join Date: Feb 2001
Location: San Francisco
Posts: 5,977
Quote:
Originally Posted by Bytegeist
Conceivably we might see more foreign-language based programming languages now that Unicode is becoming widely supported. More likely though the dominance of English will continue for quite a while. (The Japanese invented Ruby a few years ago, for example, but they chose traditional English keywords and punctuation for its syntax.)
I doubt it. Like it or not, there needs to be a lingua franca in programming, and English looks to be it. There would be no point in creating separate languages for each nation, as it would just splinter the programming community. However, there have been notable instances of foreign words being integrated into CS.
Reply With Quote
  #33  
Old 06-11-2004, 12:56 AM
TJdude825 TJdude825 is offline
Guest
 
Join Date: Feb 2000
Someone mentioned HTML a while ago. It's pretty simple, but doesn't do much more than create webpages.

Code:
For example, this <i>text</i> right <u><font color="red">here would</font> show</u> up <br> like this:
Quote:
For example, this text right here would show up
like this:
(BTW, I used vB code to make this post, which is similar to HTML, but much more limited. Click the reply button at the bottom right to see how that works.)

Anyway, HTML, like any language, tells the computer what to do. <i> means to start making text italicized, and </i> means to go back to normal. I think HTML isn't usually considered a computer language, though technically I think it is.
Reply With Quote
  #34  
Old 06-11-2004, 01:03 AM
galt galt is offline
Guest
 
Join Date: Apr 1999
Quote:
Originally Posted by PookahMacPhellimey
Would it be right to say that what stops you from composing poetry in a computer language is not so much the language as the "audience"? This is because the audience is a computer and it won't appreciate it. I guess certain computer languages would be better suited to it than others, but such a language could surely be written. However, to write poetry in it would only make sense between two programmers, which would indeed be somewhat nerdy.
Well, now you've got to get into a discussion of what constitutes poetry. In one sense, poetry is the use of language to convey a concept in a particularly artistic way. This has analogues in the programming world, too. I can write:
Code:
if( (p==0 && q!=0) || (q==0 && p!=0))
{
    /* meaning: either p is zero and q is not, or q is zero and p is not */
}
which has a very specific meaning in C, but I have a clever (some might say artistic ) way of writing the same thing (also in C):
Code:
if( !p != !q )
{
    /* meaning: p's zero-ness is different than q's zero-ness */
}
Poetry, if I do say so myself. Note, of course, that this doesn't mean it's good code to write. Just as you wouldn't write a business proposal in obscure language and use elements such as foreshadowing or difficult metaphor, you don't necessarily want to write your computer programs in a way that another programmer has difficulty understanding them, even if they are particularly clever. But a programmer may be inclined to appreciate the construction for its cleverness, much like a literary scholar appreciates a well-turned phrase, even if it takes a minute to decipher its meaning.

It's also interesting to consider what's different about programming languages from natural languages like English. The big difference that sticks out in my mind is that natural languages are very loose, because they're used by humans, who are intelligent enough to infer meaning when it's not clear, and to generally deal with the inherent imprecisions. If I say my girl has lips of honey, you know that's not literally true. If I say that the integer x=1, you can be pretty sure I'm not using a metaphor. Good programming languages try not to allow much ambiguity, because it makes programming harder, and there really isn't much use in ambiguity to a computer. Of course, the ambiguity in natural languages makes learning and understanding them harder, but we deal with it because we're pretty smart (whereas computers are not).

And this difference is precisely why I can appreciate a clever piece of code, even if I have a difficult time figuring it out, whereas poetry annoys me. With code, there is generally a precise meaning which can be figured out. With poetry, you never know if you got the author's point, because of all the ambiguity. Besides, he's probably dead, so you can't ask him (if you get stumped on the meaning of some code, run it in a debugger ).
Reply With Quote
  #35  
Old 06-11-2004, 01:07 AM
TJdude825 TJdude825 is offline
Guest
 
Join Date: Feb 2000
And by the way, see http://www.brunching.com/adcode.html

Quote:
For example,
Code:
$substance = "NaCl";

while ($raining) {
    &pour($substance);
}
is pseudocode for
SPOILER:
"When it rains, it pours," so the answer would be Morton Salt.
Reply With Quote
  #36  
Old 06-11-2004, 03:28 AM
rjung rjung is offline
Guest
 
Join Date: Sep 2000
In the end, computers are very stupid machines. Their only value comes from doing what they're told, and allowing humans to tell them to do a lot of things.

You may not be able to write poetry with most computer languages, but a well-written computer program is a work of art.
__________________
--R.J.
Electric Escape -- Information superhighway rest area #10,186
Reply With Quote
  #37  
Old 06-11-2004, 06:02 AM
ticker ticker is offline
Guest
 
Join Date: Apr 2000
In COBOL the following 'poem' is a perfectly legal statement:

MOVE CLOSER TO ME

Unfortunately, COBOL is only a little better for writing good programs in than it is for great poetry.
Reply With Quote
  #38  
Old 06-11-2004, 06:57 AM
Shalmanese Shalmanese is offline
Charter Member
 
Join Date: Feb 2001
Location: San Francisco
Posts: 5,977
Quote:
Originally Posted by Hampshire
I'd probably compare computer language to something like chefs writing recipes in their shorthand. It's more or less a set of directions.
If you don't know what tsp, Tbsp, or 1/4 cp. means, or the shorthand for sift, stir, sauté, bake, or 350 degrees for 20 min., you wouldn't understand the language.
Can you write poetry with it?
You'd have to ask a chef.
You mean like this?
Reply With Quote
  #39  
Old 06-11-2004, 07:25 AM
LSLGuy LSLGuy is online now
Charter Member
 
Join Date: Sep 2003
Location: Miami, Florida USA
Posts: 6,320
A small point. A "programming language" and a "computer language" are two different concepts. Often the latter is used as a sloppy term meaning the former. And both are very different concepts from human language.

The philosophical-level distinction between programming language & human language is that programming languages are all prescriptive, not descriptive. They're an agreed-upon symbol set for issuing orders to be executed within a pre-defined realm of facts and environment, not for idly discussing or describing anything, much less everything.

Beyond defining terms and then directing manipulations of the terms just defined, programming languages have no reach. They can speak of the meaning of "blue" about as well as your big toe can see. There is no vocabulary, no semantics, nothing that reaches that far.

Compared to the human / colloquial definition of language, that's a very different, and impoverished, domain.

So what's a "computer language"? I'd suggest it's a framework for communication and comprehension between computers, or between parts of an individual computer. Your browser and the SDMB speak particular languages (HTTP, etc.) to each other to get the one machine to do the other's bidding.

They have mutual comprehension, if you define "comprehension" at the very functional level of "Machine2 did what Machine1 wanted, so apparently Machine2 was able to parse the pseudo-nouns and pseudo-verbs of Machine1's request, act on it, and formulate a reply that itself was parsable by Machine1 in return. In addition, the reply was (somehow) appropriate, semantically as well as syntactically meaningful."


Jumping from "comprehend" to "understand" is a whole 'nother kettle of fish.

The term "understand" is very fraught when applied to computers. Much research is ongoing, and I personally side with the folks who hold that fancy enough computers can think in the same way that you can, and that although they don't yet exist, we may well get there in my lifetime.

Whatever philosophical position one takes on machine thinking, it remains that programming languages are a different class of thing from human languages.

Many a would-be philosopher has gotten derailed at the beginning by various people using the same word for vastly different concepts. Locking down your terminology and its definitions is key to any clear thinking on complex issues.
Reply With Quote
  #40  
Old 06-11-2004, 10:56 AM
ultrafilter ultrafilter is offline
Guest
 
Join Date: May 2001
It's worth mentioning that there are two different views as to the purpose of a programming language. One holds that a programming language exists so that a human may tell a computer what it should do. The other holds that a programming language is for one human to tell other humans what he wants the computer to do.
Reply With Quote
  #41  
Old 06-12-2004, 10:16 AM
LSLGuy LSLGuy is online now
Charter Member
 
Join Date: Sep 2003
Location: Miami, Florida USA
Posts: 6,320
Lemme try that again ...

Quote:
Originally Posted by LSLGuy
The philosophical-level distinction between programming language & human language is that programming languages are all prescriptive, not descriptive.
Instead of "prescriptive", insert "imperitive". I thought of the sentence that way and by the time my fingers got to that part, the word had completely escaped me and prescriptive was the best susbsitute I could come up with. Human thinking is a strange and wonderful, but not terribly reliable, phenomenon.
Reply With Quote
  #42  
Old 06-12-2004, 01:00 PM
yabob yabob is offline
Charter Member
 
Join Date: Mar 2000
Posts: 7,221
Quote:
Originally Posted by LSLGuy
Lemme try that again ...



Instead of "prescriptive", insert "imperitive". I thought of the sentence that way and by the time my fingers got to that part, the word had completely escaped me and prescriptive was the best susbsitute I could come up with. Human thinking is a strange and wonderful, but not terribly reliable, phenomenon.
Be careful with those terms. Many people speaking exclusively of computer languages will use "imperative" in a different sense than you intend here to distinguish between languages like C, java, etc, and "declarative" languages like SQL, HTML, etc. In the sense you intend to use it, the "declarative" languages are still recipes to tell some application running on the computer to DO something, but using the word "imperative" for them could cause confusion when someone reads something about SQL NOT being "imperative" (I think it skates dangerously close to it, given the way people think about transactions, and the sort of procedure-like extensions that most vendors have added, but most descriptions will insist on calling SQL "declarative").

In terms of the OP's original question, one important distinction is that virtually all computer languages strive to be context-free and parsable by an LR grammar (in practice, most actual languages have some contextual "warts" when you get down to writing a compiler for them, but the intent is that they be context-free). Human languages are contextual as hell (interesting invented things like Loglan aside). Part of the reason computer languages are this way is the purpose they serve, and part of it is to make it practical to process them.

Now that I dragged that in, I'm going to have to translate it. Most of the points have been touched upon already. The "LR parsable" buzzword has to do with how you write a program to walk through the source, and extract the directions for the computer. It essentially means that your program is going to read "left to right" through the input source, and with a finite amount of "lookahead" make sense out of what it is looking at at every point strictly by using its grammar rules and the statements you've already made. For instance, let's take a few simple statements in some garden variety procedural programming language (for our purposes, it doesn't matter what it actually is, and in fact, it isn't anything that I know of):

integer x;
integer y;
x = input();
y = x + 7;

When I design a compiler (or interpreter or whatever) that is going to understand that, it can walk through it, and make sense out of it strictly from grammatical rules - "integer" is probably a keyword, so it can apply some sort of declaration rule and "know" that "x" and "y" are names I have now chosen to use for integer values. "input" may be a keyword or known as some library function - the rules the compiler has let it realize that it makes sense to assign its result to one of my integer variables. "7" is something it can recognize as an explicit integer value (we call such things "literals"). The grammar rules it has built in concerning expressions allow it to determine what to do with "x + 7" and verify that I may indeed assign it to the variable I called "y". All unambiguous and not dependent on some unstated nature of things named "x", "y" or "7".
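
For comparison, here is the same little program in a real language (C++ rather than my invented one); a C++ compiler makes sense of it using exactly those sorts of grammar rules.

Code:
#include <iostream>

int main()
{
    int x;            // declaration rule: "int" is a keyword, so x is now an integer name
    int y;
    std::cin >> x;    // standing in for x = input()
    y = x + 7;        // expression rule: integer variable plus integer literal, assigned to y
    std::cout << y << std::endl;
    return 0;
}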

Contrast that with a couple statements in English:

"One should not drive a car with faulty brakes"
"One should not ride a motorcycle without a helmet"

You don't even think about it, but those two sentences are horribly ambiguous - to determine that "with faulty brakes" modifies "car" in the first example, and "without a helmet" modifies "one" in the second sentence, you HAVE to rely on things other than the nearly identical grammatical structure, namely your knowledge that people don't have brakes and motorcycles don't wear helmets. Most people get too "cute" in constructing examples to illustrate the ambiguity of natural language, and provide something that hinges on a play on words. That obscures the point, I think. The issue is more fundamental than that. Go through a newspaper article sometime being deliberately obtuse about the attachment of modifiers, and you'll see what I mean.
Reply With Quote
  #43  
Old 06-12-2004, 04:48 PM
Fish Fish is offline
Guest
 
Join Date: Nov 2000
I'm not a professional programmer, Pookah, but layperson to layperson, I figure I'd tackle the explanation of how some computer languages might be better at certain tasks than others. I'm sure the coders here will correct me where I'm wrong, or provide concrete examples where I'm just giving vague generalities.

When you are running a webserver that helps thousands of visitors per second, you want a language capable of handling multiple simultaneous users discretely, one which uses a minimum of run-time and memory space, and one which effectively keeps users restricted only to the parts of the information they're entitled to. To keep the server's resource consumption to a minimum, you'd use a language lightweight and nimble enough to process its script quickly.

When you are running a stand-alone program on a computer (a game, for instance, or a word processor) you have the entire computer to yourself, so it needn't be small or nimble, necessarily. What you want here is that the program be robust: powerful enough to handle what users want, and secure enough that the users can't foul it up with a few unthinking mouse clicks.

Some languages enable you to define certain terms or objects or functions or processes that you plan to repeat a lot. You might say: "Computer, I want to build a blueprint for all the buttons on this program. They'll all be round and gray and when you click 'em, they'll play click.noise. They'll have text in 'em that's in Palatino Bold 12 point and they'll be 40 pixels tall." Later, you tell the computer, "Okay, I want four Buttons here, just like I said before. One will say OKAY, one will say CANCEL, one will say BACK, one will say HELP." The computer can then apply the blueprint Button to each instance of that blueprint that you create. You might then want to create a class of items called RedButton, and you say, "Computer, this will be exactly like Button, except it'll be red." You can then create some instances of RedButton objects which have the same traits as your blueprint. This kind of computer language is useful for specific kinds of data handling, but not all kinds.
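
A rough sketch of that blueprint idea in C++ (all names invented for illustration): Button is the blueprint, RedButton is "exactly like Button, except red", and each button you ask for is an instance built from one blueprint or the other.

Code:
#include <iostream>
#include <string>

// The blueprint: every Button is grey, 40 pixels tall, and clicks.
class Button
{
public:
    Button(std::string label) : label(label) {}
    virtual ~Button() {}

    virtual std::string colour() const { return "grey"; }
    int heightInPixels() const { return 40; }

    void click() const
    {
        std::cout << "*click* " << label << " (" << colour() << ")" << std::endl;
    }

protected:
    std::string label;
};

// Exactly like Button, except it is red.
class RedButton : public Button
{
public:
    RedButton(std::string label) : Button(label) {}
    std::string colour() const { return "red"; }
};

int main()
{
    Button ok("OKAY");
    Button cancel("CANCEL");
    RedButton help("HELP");

    ok.click();      // grey
    cancel.click();  // grey
    help.click();    // red; everything else is inherited from Button
    return 0;
}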

Another language might be particularly good at dissecting strings of text. Your job might be to typeset someone's manuscript, and you want to go through it and find every "..." that you can and replace it with ". . ." instead, except that when you find "........................" you want to leave it completely alone. A text-parsing language would be easiest here, because it has commands written that do exactly this. In another language you might have to program a routine from scratch that will perform this same function.

Another language might excel at handling multiple users. Such a language might give every user a particular level of security, and built into that language is some kind of permission scheme: you can't get into /secrets/stuff/passwords directory because you don't have access! Before access is granted, the computer checks who is doing the asking and if it's okay to tell them. Other languages may not have this feature built in and the programmer must build the security permissions manually.
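
As a toy sketch of building it yourself (the paths and roles here are invented for the example; real systems usually lean on the operating system or a framework for this):
Code:
# A hand-rolled permission check, which is what you'd have to write when
# the language or environment has no security scheme built in.
PERMISSIONS = {
    "/secrets/stuff/passwords": {"admin"},
    "/public/faq":              {"admin", "guest"},
}

def can_read(role, path):
    return role in PERMISSIONS.get(path, set())

print(can_read("guest", "/secrets/stuff/passwords"))   # False: no access!
print(can_read("admin", "/secrets/stuff/passwords"))   # True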

Last, there are various ways that languages can execute. One is line-by-line, in order, starting at the top. For instance, written in pseudocode:
Code:
check recipe
if recipe.needs-sugar then import sugar
if recipe.needs-eggs then import eggs
if recipe.needs-flour then import flour
if recipe.needs-salt then import salt
if recipe.stove-temperature > 0 then
     openburner = getopenburner(stove)
     set openburner.temperature = recipe.stove-temperature
In this program, the computer executes all the lines of code even if you're making cold cereal. Some languages are slightly more efficient: once they figure out that the recipe does not need sugar, they skip straight to the next line (because the rest of that line is irrelevant).
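
If it helps to see that first style in a real language, here is a rough Python translation (the field names and helper functions are invented for the example); notice that every test still runs, top to bottom, even for cold cereal:
Code:
def import_ingredient(name):
    print("fetching", name)

def set_burner(temperature):
    print("burner set to", temperature)

def prepare(recipe):
    # Every line is checked in order, whether or not it applies.
    if recipe["needs_sugar"]: import_ingredient("sugar")
    if recipe["needs_eggs"]:  import_ingredient("eggs")
    if recipe["needs_flour"]: import_ingredient("flour")
    if recipe["needs_salt"]:  import_ingredient("salt")
    if recipe["stove_temperature"] > 0:
        set_burner(recipe["stove_temperature"])

cold_cereal = {"needs_sugar": False, "needs_eggs": False, "needs_flour": False,
               "needs_salt": False, "stove_temperature": 0}
prepare(cold_cereal)   # nothing gets fetched, but every line was still checked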

A different language may handle it this way, again in pseudocode:
Code:
blueprint for Recipe:
All recipes will have a list of ingredients called list-of-ingredients.
All recipes will say true/false if they need to be cooked, a
     variable called is-stove-on.
All recipes will have cooking-temp and cooking-time variables.
end Recipe

cereal = a Recipe
list-of-ingredients = [captain_crunch, milk]
is-stove-on = false
cooking-temp = 0
cooking-time = 0
end cereal

friedeggs = a Recipe
list-of-ingredients = [eggs, salt]
is-stove-on = true
cooking-temp = Medium
cooking-time = 360
end friedeggs

Main:
check recipe
what-we-need = get list-of-ingredients(cereal)
import what-we-need
if is-stove-on(cereal) then setstove(cereal.cooking-temp)
cook-for(cereal.cooking-time)
eat!
end Main
In this style of writing, the program can be slightly more efficient because the "fried eggs" part won't actually run unless it is called upon to do so. It has been defined, but the computer need never actually execute that code until it's needed.
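
Again purely as a sketch, here is the same made-up example translated into Python; the fried-eggs recipe is defined, but nothing about it ever runs unless some code asks for it:
Code:
class Recipe:
    def __init__(self, ingredients, stove_on=False, cooking_temp=0, cooking_time=0):
        self.ingredients = ingredients
        self.stove_on = stove_on
        self.cooking_temp = cooking_temp
        self.cooking_time = cooking_time

cereal     = Recipe(["captain_crunch", "milk"])
fried_eggs = Recipe(["eggs", "salt"], stove_on=True,
                    cooking_temp="Medium", cooking_time=360)

def main():
    # Only the cereal recipe is ever touched here.
    for item in cereal.ingredients:
        print("fetching", item)
    if cereal.stove_on:
        print("setting stove to", cereal.cooking_temp)
    print("eat!")

main()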
Reply With Quote
  #44  
Old 06-13-2004, 05:45 AM
Small Clanger Small Clanger is offline
Guest
 
Join Date: Oct 2003
Quote:
Originally Posted by Fish
When you are running a webserver that helps thousands of visitors per second, you want a language capable of handling multiple simultaneous users discretely
What you're describing here is really the operating system rather than anything to do with computer languages. All modern OSs will cope with multiple processes, even those from Microsoft. The programs running on a webserver can be written in just about anything, most likely perl, C or php.


Quote:
Some languages enable you to define certain terms or objects or functions or processes that you plan to repeat a lot.
Every language I can think of allows (if not demands) this. What you go on to describe is the object-oriented approach, where everything in a program is an object. A string of text is an object, a button is an object, a database connection is an object. If you want a red button, what you do is use an existing basic button object (class actually, but this is technical enough already) and extend it by adding the red-coloured attribute. This saves you from writing all the code to make a button work, because you can just pick the existing button object off the shelf and bolt extra bits on to it.
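
In Python-ish terms (hypothetical names again, not any real GUI library), "bolting extra bits on" is just a couple of lines:
Code:
class Button:
    def __init__(self, text):
        self.text = text
        self.color = "gray"
        # ...plus all the other code that makes a button work...

class RedButton(Button):        # pick the existing Button off the shelf...
    def __init__(self, text):
        super().__init__(text)
        self.color = "red"      # ...and bolt on the one attribute that differs

print(RedButton("STOP").color)  # red; everything else is inherited from Button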
Reply With Quote
  #45  
Old 06-13-2004, 08:11 AM
RandomLetters RandomLetters is offline
Guest
 
Join Date: Feb 2004
Quote:
Originally Posted by Small Clanger
What you're describing here is really the operating system rather than anything to do with computer languages. All modern OSs will cope with multiple processes, even those from Microsoft. The programs running on a webserver can be written in just about anything, most likely perl, C or php.
And if you really wanted to, you could even write the TCP/IP stack & webserver in php.
Reply With Quote
  #46  
Old 06-13-2004, 09:42 AM
LSLGuy LSLGuy is online now
Charter Member
 
Join Date: Sep 2003
Location: Miami, Florida USA
Posts: 6,320
Quote:
Originally Posted by yabob
Be careful with those terms. Many people speaking exclusively of computer languages will use "imperative" in a different sense than you intend here to distinguish between languages like C, java, etc, and "declarative" languages like SQL, HTML, etc. In the sense you intend to use it, the "declarative" languages are still recipes to tell some application running on the computer to DO something, but using the word "imperative" for them could cause confusion when someone reads something about SQL NOT being "imperative" (I think it skates dangerously close to it, given the way people think about transactions, and the sort of procedure-like extensions that most vendors have added, but most descriptions will insist on calling SQL "declarative").
Yabob,

Yup, I understand your point. I was trying to keep the discussion at the OP's level, using terms in their plain-English sense.

HTML, at least before the advent of DHTML, could legitimately be described as almost entirely declarative. DHTML, advanced CSS and of course the embedded script functionality all muddy the waters in the direction of imperativity (new word!).

For SQL, I have a problem with calling it declarative. This
Code:
INSERT INTO TableName (Field1, Field2) VALUES ( Value1, Value2)
is 100% imperative, no less than x:=y+z is imperative. Certainly the DDL portions of SQL (i.e. CREATE ...) are declarative, but every imperative language still needs a declarative component to define the "nouns" that the imperative "verbs" of the imperative part will act on.
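
As a toy illustration of those two ingredients side by side (the table and data are invented, using Python's built-in sqlite3 module): the CREATE declares the "noun", the INSERTs are orders to do something right now, and the query can either be left to the database or spelled out step by step yourself.
Code:
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total INTEGER)")  # DDL: declare the noun
conn.execute("INSERT INTO orders VALUES ('Pookah', 120)")           # do something, now
conn.execute("INSERT INTO orders VALUES ('Fish', 35)")

# State the result you want and let the database decide how to get it...
big = conn.execute("SELECT customer FROM orders WHERE total > 100").fetchall()

# ...or spell out the steps yourself, row by row.
big_too = []
for customer, total in conn.execute("SELECT customer, total FROM orders"):
    if total > 100:
        big_too.append(customer)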
Reply With Quote
  #47  
Old 06-13-2004, 10:00 AM
yabob yabob is offline
Charter Member
 
Join Date: Mar 2000
Posts: 7,221
Quote:
Originally Posted by LSLGuy
...
For SQL, I have a problem with calling it declarative. This
Code:
INSERT INTO TableName (Field1, Field2) VALUES ( Value1, Value2)
is 100% imperative, no less than x:=y+z is imperative. Certainly the DDL portions of SQL (i.e. CREATE ...) are declarative, but every imperative language still needs a declarative component to define the "nouns" that the imperative "verbs" of the imperative part will act on.
We're on the same page. Nonetheless, many people will insist on typifying SQL as "declarative", including the people who write primers on SQL, and I was addressing what people would find if they poke around the literature. And I edited out something about not considering such things as embedded scripts for HTML.
Reply With Quote
  #48  
Old 06-13-2004, 10:15 AM
Musicat Musicat is offline
Charter Member
 
Join Date: Oct 1999
Location: Sturgeon Bay, WI USA
Posts: 17,527
For the truly geeky at heart...

I don't believe anyone has yet linked directly to this Wikipedia page on Esoteric Programming Languages, so I will do the honors.
Quote:
Esoteric programming languages are programming languages which are designed as a proof of concept, or as jokes, and not with the intention of being adopted for real-world programming. Consequently, usability is rarely a high priority for such languages. The usual aim is to remove or replace conventional language features while still maintaining a language that is Turing-complete.
Reply With Quote
  #49  
Old 06-13-2004, 11:21 AM
Fish Fish is offline
Guest
 
Join Date: Nov 2000
Quote:
Originally Posted by Small Clanger
Every language I can think of allows (if not demands) this. What you go on to describe is the object oriented approach where everything in a program is an object.
Thank you, SmallClanger. I knew I was specifically describing an object-oriented approach but I wanted to keep the discussion in layman's terms and didn't want to get into too-technical terms like subroutines or defined functions that exist in other languages. I also didn't want to have to get into the concept of the pre-designed top-level classes that come standard in a particular language.
Reply With Quote
  #50  
Old 06-14-2004, 05:43 AM
PookahMacPhellimey PookahMacPhellimey is offline
Guest
 
Join Date: Jul 2001
Quote:
Originally Posted by Fish
Thank you, SmallClanger. I knew I was specifically describing an object-oriented approach but I wanted to keep the discussion in layman's terms
Which did actually help a lot.

Yabob, sorry I mixed up terms. Before this discussion I really knew nothing about the whole topic and I thought they were the same thing. I know exactly what you mean from other types of discussion, though. I've seen many discussions derail because people are talking about two different things using the same word.

To all: I'm still reading. It's a lot to get my head around and I'm a bit confused about some of the finer points. I do think, however, that I understand at least the basic idea now, which was the point of the thread. Funny, but a thread actually ending up doing what you started it for is a rarer event than you'd expect. Well, that's my experience anyway.
Reply With Quote