What is the greatest (most influential, whatever) computer program ever written?

The importance of TCP/IP today cannot be overstated, for sure. And this is maybe a minor nitpick, but IMHO the ideal nomination would be something so uniquely innovative that we can say, “if this had not happened, we don’t know how things would have evolved”. But TCP/IP evolved more or less concurrently with DECnet, which actually had some technical advantages. And if one argues that TCP/IP was the foundation of the Arpanet, which was the precursor of the Internet, one can argue that at the same time DECnet was the foundation of DEC’s internal Easynet network, which at its peak contained around 55,000 nodes worldwide and was, if not larger than Arpanet at some points, certainly the largest private network in the world. It was also used, in preference to TCP/IP, for some Arpanet-like scientific networks, like the US and European space physics networks US-SPAN and E-SPAN.

And before long, OSI began rolling out, especially in Europe. OSI had overwhelming technical advantages but was a lumbering heavyweight in terms of complexity, and soon DECnet hitched its wagon to OSI compliance and became DECnet/OSI. Had it not been for the explosive growth of the Internet, we’d probably all be interconnected with OSI protocols today. And frankly, an OSI-based Internet would probably be far superior in terms of more capable routing, intrinsic security, and practically unlimited addressability right from the start. The problems that surfaced again and again as the Internet grew were along the lines of “TCP/IP was never really meant to do this”, whether in terms of scale, base functionality, or security. Yes, TCP/IP was designed specifically for the Arpanet, but the Arpanet was meant to be a closed network of limited size with trusted participants.

That TCP/IP successfully became the foundation of today’s global public Internet is in fact rather amazing. The merging of DECnet with OSI was essentially a huge business bet that this was not going to happen.

Truly l33t ten-year-olds added a semicolon at the end of the PRINT statement :wink:

Loads of great examples of “greatest” candidates in the thread. I’ll add two that I know nothing much about, so specifics welcome:

  1. the radio software that facilitates seamless hand-off between mobile towers, and

  2. whatever the hell Google, Amazon and Microsoft use to manage their humungous number of servers

I’ll vote for VisiCalc as well, if only because it was such a groundbreaking concept.

Word processors and photo editors are basically digital versions of what people already did. VisiCalc also let people do things they already did on paper, but it combined aspects of several different tasks in one tool. And I’ll bet there are things done in software spreadsheets that people hadn’t even thought of before, or maybe didn’t bother with before they could be done on a computer.

I would do the same thing, but it would say "Joey ", and whatever I did suppressed the LF/CR, so it would say Joey Joey Joey… It worked perfectly, so that once the screen filled up it appeared to scroll sideways instead of up. At some point I also added a counter and had the color change each time it wrote my name.
Like you, I would quickly type it in on a handful of computers at the computer store, and it became my quick benchmark for how fast a computer was. There was typically a noticeable difference in how fast my name scrolled across the screen from one computer to the next.
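
From memory, the whole thing took only a few lines. Here’s a rough sketch in Commodore 64 BASIC (my assumption as to the machine; the color trick varied, but on the C64, POKE 646 sets the current text color):

10 C = 0
20 PRINT "JOEY ";: REM TRAILING SEMICOLON SUPPRESSES THE CR/LF
30 C = C + 1: REM COUNT EACH PRINTING OF THE NAME
40 POKE 646, C AND 15: REM CYCLE THROUGH THE C64'S 16 TEXT COLORS
50 GOTO 20

Since line 20 never emits a newline, the names pack together, and once the screen fills, the loop speed becomes a crude but very visible benchmark from machine to machine.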

IMHO, I’d say the original C compiler from Dennis Ritchie. Once the Unix kernel was rewritten in C, porting the OS to a new machine architecture was at least a couple of orders of magnitude easier than recoding it in assembler every time. Use a cross compiler on a running system to generate a working C compiler for the new machine, compile the kernel, and voilà. This was the insight/breakthrough that made *nix the winner of the OS wars everywhere but the desktop.

Appreciate the responses, everyone.

My nomination is spreadsheets - I can go with “Excel” or “VisiCalc-Lotus-Excel” or whatever, but in terms of productivity, usefulness, variety of output, and more, it’s hard to beat Excel.

However, TCP/IP and the *nix kernel are extremely strong contenders, as well as the C/COBOL/FORTRAN compilers mentioned.

Does HTML count? Wait! I’m the OP - of course it counts! :wink:

Baron Greenback, apparently I wasn’t |337 enough (or I forgot) - what did adding the semicolon do?

By golly, I was going to contradict you because I had long been under the impression that the venerable COBOL came first, but you’re right. COBOL arose from the work of the CODASYL conference that began in the spring of 1959. Wikipedia tells us that the first FORTRAN compiler came out of IBM for the IBM 704 in April 1957. Ignorance fought!

Placing the exact date drove me to scour my drives for a marvelous old publication that I knew I had somewhere, and I eventually found it (thank you, Windows search!). It’s called “Preliminary Report, Programming Research Group, Applied Science Division, IBM, Specifications for the IBM Mathematical FORmula TRANslating System - FORTRAN” and is dated November 10, 1954. It’s fascinating reading from the standpoint of technical history. The first paragraph is almost poignant in its simple innocence:
> The IBM Mathematical Formula Translating System or briefly, FORTRAN, will comprise a large set of programs to enable the IBM 704 to accept a concise formulation of a problem in terms of a mathematical notation and to produce automatically a high speed 704 program for the solution of the problem. The logic of the 704 is such that, for the first time, programming techniques have been devised which can be applied by an automatic coding system in such a way that an automatically coded problem, which has been concisely stated in a language which does not resemble a machine language, will be executed in about the same time that would be required had the problem been laboriously hand coded. Heretofore, systems which have sought to reduce the job of coding and debugging problems have offered the choice of easy coding and slow execution or laborious coding and fast execution.

Concatenation (at least in the BASIC flavours on the machines found in UK shops back then)

I remember Fareed Zakaria reporting that Angry Birds accounted for some **vast** number of wasted man-hours of productivity.

That’s gotta count for something.

What the Excel Gods have given us, the Angry Birds have taken away…

Some great nominations! For me it was Lotus and WP. I still use the shortcuts in Word and Excel today!

But Windows was great for consumers.

At the end of the day, Google and its algorithms are the mightiest of them all!

Another vote for VisiCalc.

The first great category killer of the PC age.
People paid thousands for an Apple II just to run a $100 bit of software.
And the fact that they didn’t (couldn’t, really) patent it allowed an array of clones, copies and competitors to develop (there were dozens, and it largely outsold them all) before it got squished by the second great category killer of the PC age.
VisiCalc’s publisher went bankrupt a bit more than a year after 1-2-3 was released.

I would have thought that Microsoft’s first foray into spreadsheet apps (Multiplan), which pre-dated Lotus, would refute the notion that “Microsoft ripped it off and called it Excel”.

The leading spreadsheet has always been determined by the operating system.

Windows is certainly the software that most influenced today’s computer landscape.
But Windows wouldn’t exist if the Apple Macintosh hadn’t been created earlier.
But the Macintosh wouldn’t exist if the Xerox Alto hadn’t been created earlier.
But the Alto wouldn’t exist if NLS hadn’t been created earlier.

NLS was the true origin of much of what we now take for granted in computer interaction: windows, mouse input, WYSIWYG word processing, hypertext, screen sharing, and much more. It’s probably hard for younger computer users today to imagine the computer landscape in the early 1960s, when most computer interaction was via punch cards and line printers. NLS was an almost unbelievably huge leap forward.

For us old people, that interval made quite a difference! Especially us old programming-language designers.
John Backus led the FORTRAN team, and he is famous for BNF (Backus-Naur Form), which is used to describe the syntax of a programming language, and which we fed more or less directly into lex and yacc to write parsers.
I never had the misfortune to code in COBOL, though I knew it, and I even quoted Jean Sammet in the presentation I used when I went around looking for jobs after I got my PhD.

But he didn’t pioneer that. Multics, which Ritchie worked on and gave up on in disgust, was written in PL/1, which was a controversial thing to do at the time. I know because my professors at MIT were defending the decision in the early days of Multics, which I used in late 1969. In fact, C has roots in BCPL, a systems implementation language that ran on Multics.

Although I have previously nominated VisiCalc and still stand by it, the different versions of Windows are interesting: versions 1 and 2 were curiosities and didn’t sell at all. Version 3.0 (quickly superseded by 3.1) was a monster that changed personal computing. After that, every second version was an impressive advance. For example, with Win95 they tried a big step and fell on their face, but Win98 fixed it and was great. Same for

Missed the edit window (#$&^%^&).

Same for ME/XP, Vista/7, 8/10

In my Computer Concepts class, I highlight and talk about many of these that have been mentioned and that I agree with: Xerox Star, Unix/Linux, VisiCalc, Netscape/Mosaic, and even OS/2 (it just didn’t crash, unusual for its time).

However, I want to nominate Bill Gates’s version of BASIC for introducing programming to a whole generation of future entrepreneurs. You turned on your S-100 box or TRS-80 or Commodore or Apple or IBM PC and there was something readily accessible and usable by novices. Yes, it’s quite “brain damaged” in its lack of structure, but being able to draw simple graphics and play tunes was liberating, especially considering the alternative was finding and paying for a compiler (Pascal? ugh) or assembler.
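
To give a sense of how low the bar was, something like this (a generic Microsoft-style BASIC sketch of my own, not any particular machine’s dialect) was within a novice’s reach on day one:

10 REM DRAW A DIAGONAL LINE OF STARS
20 FOR I = 1 TO 20
30 PRINT TAB(I); "*"
40 NEXT I

A few seconds of typing, RUN, and you had something visibly happening on the screen, with no compiler or assembler in sight.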

And it was extremely influential, since IBM contracted Gates for its ROM-based version of BASIC and off-handedly asked him if he knew of a simple disk operating system. By sheer luck and good negotiating (he kept the right to resell DOS to anyone), he outmaneuvered IBM, allowed clones to proliferate, and generated enough revenue to develop and grow Windows.

You’re right, of course, about Multics and PL/1, but where that was a dead end, the C/*nix combination and its successors took over the world.

Yeah, I think that was true for all flavors of BASIC that I remember. Without the semicolon, the next PRINT or INPUT statement would print on a new line. With the semicolon, it would just continue at the last screen position. So:

10 PRINT "HELLO"
20 PRINT "GOODBYE"

would output:

HELLO
GOODBYE

but

10 PRINT "HELLO";
20 PRINT "GOODBYE"

would output

HELLOGOODBYE

(Of course, and I’m not sure how many flavors of BASIC allowed it, but I seem to recall most dialects did, including Microsoft BASIC, Commodore BASIC, and Applesoft BASIC: we wouldn’t even use the PRINT statement, but rather a ?, which was a shortcut alias for PRINT.)
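
For example, on a Commodore machine you could enter

10 ? "HELLO"

and the interpreter tokenized the ? into the same internal PRINT token, so LIST would display the line back as

10 PRINT "HELLO"

(That’s from memory of Commodore BASIC; I believe Applesoft did the same.)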