Explain the concept of "computer language" to me.

Whenever someone mentions they can “speak” this computer language or that, I’ve always scanned over it and filed it away under “clever computery things”. I have absolutely no idea at all what they are or what they do.

I mean, in what way exactly are they languages? Are there words? Grammar? Poems? Or is “language” just jargon here?

To whom or what do they speak? I mean, if I speak Dutch I can speak to the Dutch (and Belgians and maybe South Africans, but let’s keep it simple). Computer language allows you to speak to er… computers? How? What are they saying? Or are you just ordering them around?

Also, why are there so many different computer languages? Do computers have different nationalities? Or is one language just a newer, better and improved version of the others, or better suited to certain topics?

Okay, I know I’m a bit silly here and I wouldn’t expect anyone to address these “questions” one by one. However, I would really like to find out what this is all about. So could anyone explain the basic concept in layman’s/idiot’s terms?

Yes, yes, yes. They are considered “languages” in just the way you’re thinking. Computer languages are classified as “context-free” by linguists, whereas human languages are in an even more complex category, I believe.

Yes, we use computer languages to boss them around :). There was a group of BASIC (that’s an old language) programmers once who thought it’d be funny to use only BASIC to order each other around at a convention. I think they mostly succeeded.

Languages are like cars, as the metaphor goes. Sometimes you need a fast car, like a Ferrari. But sometimes you need a dump truck, which is slower but can carry things in bulk. Different tools for different jobs.

I do not “speak” any computer languages, but I think I understand what they are and so I’ll jump in before the programmers come in, as I might be able to explain it better, layman to layman.

Yes, they’re for ordering computers around. A computer on its own just sits there - to make it do useful stuff you have to tell it what to do. And, computers being dumb, you can’t tell them what to do in English. Instead you have to use a defined set of instructions. It’s no good saying “When someone clicks on the ‘Submit Reply’ button I want you to take all the text they’ve entered in the box and submit it to the server (or eat it, as the case may be)”, you need to use computer code, with lots of mystical brackets and suchlike :)

Yes, there are words, and grammar - in fact the grammar is explicitly set to allow no room for ambiguity, and from my early tinkering with Sinclair BASIC I know it will result in plenty of error messages if you don’t get it right.

I’m sure someone will be able to tell you more about them…

There are all sorts of computer languages. Basically, a computer language is a way for a programmer to instruct the computer to perform some task. Some are highly specialised, like SQL, while some are general purpose, like C++.

In the beginning, programs were written by manually flicking switches on the front of the computers. This is a pain, so simple assembly languages were invented. These take the form of simple three or four letter mnemonics with some operands. This was better, but still a bit of a pain to write in (although some people do: Chris Sawyer, author of RollerCoaster Tycoon among others, writes all his games in assembly language).

Next came even higher level languages. These were starting to look a bit like natural language. You got languages like Ada, Fortran, C etc. These are still widely used (especially C), but many see them as unsuitable for creating huge programs. Object-oriented languages were then developed, Simula being the first (I think?). Examples include Java, C++, Smalltalk etc. These attempt to map computer code onto objects in the real world. For example, you can define a dog class with a state - say, age 15, heading north, colour brown - and provide means of altering this state.
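
To make that concrete, here is a rough sketch of what such a dog class might look like in C++ (the names, the values and the “birthday” method are just made up from the description above, not taken from any real program):

#include <iostream>
#include <string>
using namespace std;

// A highly simplified "dog" class: it bundles together some state
// (age, heading, colour) and the means of altering that state.
class Dog
{
public:
    Dog(int startAge, string startHeading, string startColour)
        : age(startAge), heading(startHeading), colour(startColour) {}

    void haveBirthday() { age = age + 1; }                  // alter the state
    void turnTo(const string& newHeading) { heading = newHeading; }
    int getAge() const { return age; }

private:
    int age;         // e.g. 15
    string heading;  // e.g. "north"
    string colour;   // e.g. "brown"
};

int main()
{
    Dog rex(15, "north", "brown");   // one particular dog built from the class
    rex.haveBirthday();              // change its state
    cout << "Rex is now " << rex.getAge() << endl;
    return 0;
}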

This is a highly simplified history of programming languages, and leaves out many important ones, too. Generally, languages are broken down into families, especially the higher level ones. I’ve already mentioned object-oriented languages, but there are also declarative languages like Prolog and functional languages like Lisp.

Here’s a short example of some code (C++) that prints “Hello World!” to the screen:



#include <iostream>   // gives us access to the input/output facilities
using namespace std;

int main()            // every C++ program starts running at main()
{
    cout << "Hello World!" << endl;   // print the text, then end the line
    return 0;                         // tell the operating system all went well
}


HTH

Unfortunately, this sort of geeky nonsense ends up giving outsiders the impression that everyone studying CS is a social misfit. But that’s a Pit thread.

Ultimately, digital computers speak in binary - this sort of thing: 0101010111010.
Since telling the machine what to do using binary gets old fast (it used to be done that way in the early days), we write programs in what are called higher-level languages, which are converted into machine code/binary by other programs.

The lowest-level high-level language is called assembly and looks like this:

LD 01, A
MOV FF, B
ADD A, B

What is going on here is a set of very specific instructions to the computer: yes, we are ordering it around :)

LD 01, A – load the value 01 into register A
MOV FF, B – move the value at location FF into register B
ADD A, B – add the values in registers A and B (the result ends up in A)

(forgive me assembly people if this is rubbish I’m just showing what it looks like).

Writing a flight simulator (or payroll program for some of us :( ) would take a long time using pigeon steps like this. A higher-level language (C/C++/Java/C#) lets you write something a bit more human-readable like:
x = y + z;

To a non-programmer, code in any of the four curly-bracket languages I listed will probably look like the same type of gibberish. Some of the ‘grammar’ is very similar, and a programmer who knows C++ (poor bastard) will have no trouble reading C, Java or C#. COBOL is a different story.

Another type of language is less like bossing the computer around and more like asking questions. SQL, Structured Query Language (pronounced “sequel”, sometimes), is the primary database language and consists mostly of requests for information along the lines of “Tell me everyone who is working on the Death Ray project”, which would look like:

SELECT INITS, SNAME
FROM EMPLOYEE_TABLE
WHERE PROJECT = 'DEATH RAY';

I’m tired after all that - someone want to take over with HTML? Unix shells? OOP!

Oh look people have beaten me to it, but after typing that lot I’m jolly well gonna post.

Thanks all, so far. Again, forgive me my strange line of questioning, I’m just trying to get my head around it without having the background or possibly braintype for it.

Would it be right to say that what stops you from composing poetry in computer language is not so much the language as the “audience”? This is because the audience is a computer and it won’t appreciate it. I guess certain computer languages would be better suited to it than others, but such a language could surely be written. However, to write poetry in it would only make sense between two programmers, which would indeed be somewhat nerdy. :)

But have I got it right, there?

I think what stopped me from grasping the concept before is that I had the concept of “language” tied up with human-type understanding. Of course, a computer understands your language perfectly (provided you do it correctly), as it will act according to your command, but that’s not the same as me understanding the colour “blue” or similar (although I have heard people argue it IS the same, back when I was studying philosophy).

Sorry to wax all philosophical there. I think I will broaden my definition of what “language” is to everything that “acts language-like”, and then computer languages won’t be problematic. Or am I completely barking up the wrong tree?

There are some good questions in here; the answers will take a bit of background setup, so bear with me. Note also that I’m not the definitive source for this info - take what I say as illustrative rather than exact.

To start with, computers produce their results by following a set of instructions, which effectively boil down to putting numbers into various storage areas, pulling them out again and performing operations on these numbers. The instructions on how to perform these operations are hard-wired into the computer (i.e. they are not part of the operating system but part of the chip; Intel, AMD, IBM, Motorola and so on are responsible for this portion of the work). These instructions are basically numbers themselves: you ask the chip to perform operation 12 on area 5 and area 6 and put the result in area 7, for example. Given that a computer’s native number format is binary, this instruction would end up as a sequence of 1s and 0s. This is what the computer understands natively and is known as ‘machine code’. As an aside here, not all machine codes are the same; Mac and PC hardware differ at this level, for example.

Now machine code is a rather difficult thing for humans to read and write, especially in a context like computer programming where people will often have to re-write portions of their code as bugs are ironed out. The creation of computer languages is an effort to reach a compromise between the rigidity of expression required by the computer and the more intuitive understanding and working of humans.

The first generation of languages was assembly, which replaced the numbers of the instructions and areas with (vaguely) human readable abbreviations. This aids readability but there still exists a one-to-one mapping between the machine code and assembly instructions (I’m not 100% on this, but fairly close). Thus even though humans could now read the code more easily it didn’t allow them to abstract program flow to a level where complex ideas could be presented simply.

The next step is to introduce a compiler. What a compiler does is take statements that look nothing like machine code and ‘compile’ them into the appropriate machine code. Thus instead of telling the machine what I want by writing machine or assembly code by hand, I can instead write something that uses human-level concepts like strings - “Hello World”, for example - rather than a series of 1s and 0s that just happen to represent “Hello World” (which I’d probably have to learn). This breakthrough allows several things:

1: More people can use computers. The barrier to learning machine language is quite a lot higher than for a compiled language.
2: Programs become shorter and more readable. With the introduction of compilers, higher-level concepts (loops, for example) can be presented in a very concise fashion (there’s a small sketch of this just after the list).
3: You can use the same program on different hardware by changing the compiler rather than changing the code. Thus today it is perfectly possible to run the same code on a Mac as on a PC for simple examples (complexities arise from a number of sources, but simple code will work).
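
To give a flavour of point 2, here is a minimal C++ sketch (purely illustrative, not from any real program): one short loop that prints the numbers 1 to 100. Written directly in machine code or assembly, the counting, the comparison and the jump back to the start would all have to be spelled out by hand.

#include <iostream>
using namespace std;

int main()
{
    // One concise loop stands in for a hundred separate "print" instructions.
    for (int i = 1; i <= 100; i = i + 1)
    {
        cout << i << endl;
    }
    return 0;
}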

These advantages vastly improved the ease with which humans and computers can interact. There was another upshot though: the number of languages in which you could write a program multiplied at a huge rate. These languages differ in how you structure the code that will eventually become machine code (and in some cases, the route it takes to get there). The differences arise for several reasons, including advances in understanding of how we can best represent the real world in computer code, proprietary ownership, or simple differences of opinion between academics or commercial enterprises.

I realise I haven’t covered all of your questions here, but I hope this serves at least as a starting point for your understanding.

#define ( 2b || !2b) QUESTION

what?

Poetry in a programming language is coming up with an exceedingly elegant method of performing some task.

“Computer poetry” is perfectly legit, albeit the audience is a bit limited.
2nd PERL Poetry competition
International Obfuscated C Code Contest

There was one competition which even had a Haiku sub-category, and you could only use so many words or lines or somesuch. You’re basically right in your intuition.

Pookah

In some languages, you are able to define what “blue” is, then you’re allowed to use it. For instance, you might want to store a value. You would define the characteristics (size, numeric or character, etc.) of the storage item (or field). Only then would you be able to reference it.
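
A trivial sketch of that idea in C++ (the variable name and value here are made up, just for illustration):

#include <iostream>
#include <string>
using namespace std;

int main()
{
    // First define the storage item and its characteristics (here, a piece of text).
    // Only after this line can "favouriteColour" be referred to at all.
    string favouriteColour = "blue";

    cout << "My favourite colour is " << favouriteColour << endl;
    return 0;
}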

I’m not sure if this is exactly what you meant in your quote, but it’s a primitive example. A concept like “blue” is difficult to define even in human language.

In the majority of cases, what we call computer languages aren’t actually the language spoken by the hardware of the computer itself; warning - vast oversimplifications ahead:

At the most basic level, the language instructions that the actual hardware of a computer understands consist of things equivalent to:
-Memorise the number following this instruction
-Add one memorised number to another
-Compare one memorised number against another, if they are the same, set a ‘flag’
-Jump to another place in the series of instructions
-Jump to another place in the series of instructions if the ‘flag’ is set
-Move the value stored in a specified location to another specified location

And so on - these instructions aren’t ‘understood’ in words by the computer, but as numbers - instruction number 01 might mean “Memorise the number following this instruction”, so passing the two numbers 01,99 to the processor makes it memorise the number 99.

That’s Machine Code and the actual instruction sets vary from one processor type to another.

To make Machine Code a little easier to use, the numeric instructions often have short names (this is called Assembly Language), so rather than having to remember that 01 means ‘memorise’, the programmer can (in a hypothetical assembly language) write ‘mem,99’. When he wants to run the program, the names are converted into numbers by an Assembler program first (this is a simple, literal one-to-one translation).

Writing large programs in assembly language is painful, so there are easier ways and this is where we get to compiled languages:
Compiled languages consist of commands that are more like human language, such as:
-Repeat the following set of instructions for a specified number of times
-Repeat the following set of instructions until a specified condition is encountered
-Remember a numeric or other value by assigning it a name
-Display something on the screen
-Get some input from the user

For each of these (much more general purpose) instructions, the compiler (which is a program itself, consisting of machine code) has a pre-made machine code translation, so the instruction ‘get some input from the user’ might actually translate into quite a long and complex series of machine code instructions, saving a lot of effort on the part of the programmer.
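
As a rough illustration (this is C++, and the little program is entirely made up), the whole “get some input from the user” idea shrinks to a single short statement, even though the compiler expands it into a long stretch of machine code behind the scenes:

#include <iostream>
#include <string>
using namespace std;

int main()
{
    string name;
    cout << "What is your name? ";
    cin >> name;                          // "get some input from the user" in one line
    cout << "Hello, " << name << endl;    // "display something on the screen"
    return 0;
}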

There are also Interpreted languages, which work in a similar way, but instead of doing all the translation in advance, they do it as the program runs (in fact many ‘compiled’ languages still do interpretation at runtime and many ‘interpreted’ languages do some prior compilation).

But they aren’t conversational languages, just means of instructing the machinery to do certain things, perhaps analogous to a set of travel directions; Proceed along route A, when you come to point B, turn left etc.

You can probably figure out what these sentences written in SQL computer language are doing.

Connect to Big-Telephone-Directory-Database;
Select Telephone-Number from Telephone-Directory-Table where Name="PookahMacPhellimey";
Print Telephone-Number;

After these sentences are run through compilers, interpreters, linkers, loaders and other such friendly utilities, they are transformed (eventually) into a series of 1’s and 0’s that the computer will understand as instructions to execute your request. The three sentences you wrote will eventually be transformed into a string of millions, perhaps billions of 1’s and 0’s. So you can see, if you want to construct the series of 1’s and 0’s that instruct a computer to do more than just make an LED flicker on and off, it’s nice to have friends called computer languages to do your work for you.

Okay, I think you are collectively getting through to my compu-ignorant brain. What I’m getting now is that compiled languages are a bit like a shorthand that will instruct the computer to do all the basic stuff itself. A small bit similar to the use of “etc”. As in, me saying to you: “1, 2, 3, 4, etc.” and you then knowing you are to proceed in a similar fashion, rather than me having to stand there and recite numbers for hours on end.

I must sound like a right idiot to all you programmer people. Thanks for bearing with me on this.

NutMagnet, what I was trying to get at there was the difference between a human’s understanding of a concept and a computer’s. I think that’s a great debate, but I find it really interesting. The reason I brought it up is that this was what initially blocked me from understanding the concept of a computer language: the fact that a computer does not “understand” a word the way humans do.

I’m beginning to sound a bit like Data. :)

A compiled language is just a high level language which gets reduced to another lower level language.

For example, Java code gets converted into Java Byte Code when compiled, which, on modern Java platforms, gets compiled yet again at run time into the specific instructions needed for individual processors (a process known as Just In Time compilation).

Suppose I define a method like so, in Java, to add two numbers:



public static int add(int a, int b)   // take two whole numbers and hand back their sum
{
    return a + b;
}


Then, after compilation, the Java Byte Code emitted is somewhere along these lines (I don’t specifically know JBC, so it’s an approximation):



add:
  iload_0
  iload_1
  iadd
  ireturn


Ultimately, the aim is to get a high level language, which is easily readable by a human, into a form that a computer can understand. Whether this takes one sweep of a compiler, or multiple ones in the case of Java (this was an intentional decision, though, not a design flaw) depends upon the language and platform you are using.

If you’re willing to learn, I for one am more than willing to explain. Don’t denigrate yourself just because you’ve not spent any time learning what are quite obtuse concepts.

‘Understanding’ is a difficult enough concept without introducing computers into it. Again, drifting a bit from general Qs, but you might find John Searle’s Chinese Room thought experiment interesting to think about.

Counsel wolf: yes, that’s exactly the kind of thing I meant! I actually know the thought experiment. I studied philosophy and this came up in either philosophy of mind and/or philosophy of language. It’s been years since I’ve really considered it, though, but will refresh my knowledge on the website you gave.

Excellent discussion, history and descriptions, everyone! It sounds like PookahMacPhellimey has got it! One thing to keep in mind is that the computer only takes orders and executes instructions; it does not ‘think’. You can program a number of conditions for it to monitor and have it execute instructions based on those values, but that is about it. With a very complex system it can appear to be thinking, though I suppose you could argue that the decisions it is making amount to thinking…
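
For instance, that kind of condition-following might look like this in C++ (the temperature value and the fan are entirely made up, just to show the shape of it):

#include <iostream>
using namespace std;

int main()
{
    int temperature = 30;   // a value the program has been told to monitor (made up)

    // The computer isn't thinking; it is mechanically applying the rule it was given.
    if (temperature > 25)
    {
        cout << "Turn the fan on" << endl;
    }
    else
    {
        cout << "Leave the fan off" << endl;
    }
    return 0;
}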

Also, programs try to be as succinct as possible. You want to be as terse and specific as you can. It makes it easier for the computer to figure out what you want to do and allows you to fit more instructions into a given program space.

The distinction I want to make is that you could write out a set of instructions telling someone how to make a peanut butter and jam sandwich, or you could write a poem describing the sandwich and various aspects of its appearance, texture, flavour, etc. The poem is not particularly helpful in getting from ingredients to final product, but it sure may be more elegant. A lot more interpretation is needed to figure out what the poem is trying to say, compared with ‘open the peanut butter jar’.

If you have the time and inclination, read this book: Code: The Hidden Language of Computer Hardware and Software.