When was the Internet invented? And could it ever be destroyed?

I knew people would jump on this post (and mostly rightly), but there’s an important kernel of accuracy to it. [Although Berners-Lee actually invented the web in 1989, put the first site out in 1991, and it was in 1993 that the project was released as open-source and royalty-free so anyone could copy it and set up their own web server.]

I have a long computing history, with some very early and, in my opinion, fantastic (for its time) exposure to computers. I went to the United States Military Academy (USMA, “West Point”), and, almost astoundingly considering how backwards the Army and its Academy can be about so many things, it was probably 20 years ahead of its time in terms of computing. Ten years before I set foot on campus, in the early 60s, the Academy laid out the goal that every cadet would be exposed to computing, an idea that grew out of the importance of early computers in WWII.

By the time I arrived in the 70s things were in full swing. There was a dedicated mainframe solely for cadets to learn computing on, and as part of your ordinary education every student at some point had to write some programs. One guaranteed exposure: the required math courses had a certain number of problems you had to solve on the computer. (We wrote our programs in FORTRAN on cards.) I’ve heard that people majoring in specific technical fields at some other universities were doing similar things, but this was every cadet doing some amount of programming. Further, unlike some people I know who were actual computer science majors at other schools and wrote out their code solely on punch cards and paper, submitting it all for compilation to the team that actually ran the system, we had dozens of terminals and teletype machines all throughout campus, giving a good amount of access to the system.

It’s never been what I wanted to do with my life full time, but ever since then I’ve always maintained a strong interest in computing. After I left West Point my job in the military didn’t really expose me much to the emerging internet technology, and I didn’t even have a PC to use there until many, many years later, when I was permanently riding a desk and most office workers had moved to using PCs.

So afterward, to sustain my interest in computers, I had to purchase and run my own machines (quite expensive in the 1980s). You began to hear about the Internet, but to a plebeian computer enthusiast like myself it was, at that time, a fancy academic/government system that only egghead researchers or privileged researchers at big corporations and research institutes had access to. It was known to have some very high-speed hardware, to transfer tons of files, and to support all kinds of simultaneous users. But it was a golden highway with no on-ramp for computing peasants.

People like me were relegated to the world of dial-up computing. Now, not dial-up internet, where you dial into an ISP and they hook you up to the wider internet. No, I’m talking about dialing a number with my computer to call some other guy’s computer sitting in a room in his house. On that computer he has set up a Bulletin Board System (BBS). Typically you found a few BBSes you really liked in your local area (because this was the 80s, long distance was expensive, and of course dialing a computer across the country was a long-distance phone call) and stuck to them. While a very small number of enthusiast-run BBSes had fancy setups that allowed multiple concurrent users, the norm by far was one person at a time calling in and using the BBS. Time was parceled out by limiting each visitor to x amount of time per day before they would be kicked off. You typically would dial in, use it for a little while, then sign off.

But while connected you could do cool things. A BBS was kind of like a message board with some additional features. You could find files to download, exchange private messages with other BBS users, chat with the sysop if he was around (since he’d be at the physical machine hosting the BBS he could be on at the same time as you, but you typically couldn’t chat with other users, since they couldn’t dial in while you were connected), post public messages in a forum/wall format, and even play rudimentary multiplayer “games” in which different users would take turns when they called in.

A few companies hosted BBSes too. For me this primarily meant companies like Sierra (makers of King’s Quest), because this was how, even in the 1980s, you could actually find software patches to fix bugs. Before that you really had no option with a buggy game other than to live with it. These corporate BBSes typically also had fancier setups, which meant more than one person could call in at once.

But these were all separate from the internet. The internet was something fancy not for me.

Later, I signed up for CompuServe. CompuServe was somewhere in between the BBSes run out of someone’s house and the true, full internet. You accessed it by calling a local CompuServe number (they had numbers over most of the country) and then connecting. CompuServe could support thousands and thousands of concurrent users, making it far larger than the biggest and most sophisticated BBS. It was not directly connected to the “true” internet, but it formed its own kind of interconnected network. Everyone had a CompuServe number, including companies. You could go to a specific corporate CompuServe number and actually order certain services entirely online; for example, I believe one of the earliest things you could do was book flights with a few airlines on CompuServe. I’m not sure how many people did it, but this was pretty freaking cool in an age when many people still called up an actual human travel agent to book a flight.

CompuServe also had a veritable mountain of message boards, text-only versions of a few major newspapers, chat functionality, email-like messaging to other CompuServe users, and of course online games. It was also as expensive as sin; I can’t remember the rates, but I think it was easily $0.15-0.20 a minute when I started using it. This meant that while you could do far more than on a traditional BBS and didn’t have to deal with the frustration of busy signals as often (for many popular BBSes you would have to autodial for hours to get in), you were still quite limited unless you had very deep pockets. I ran up a few very big CompuServe bills when I started playing the online text roleplaying game “British Legends” there (I didn’t know it right away, but it was actually just a branded version of the even older “Multi-User Dungeon” game that started in British universities and spawned the generic term MUD for online text games).

But the thing about CompuServe, as amazing as it was to me at the time, is that it was not the internet. There were other services out there at the time (GEnie, Prodigy, and I think even AOL may have existed as a startup, though I never heard of it until it got big in the 90s), but they didn’t work together. They were their own worlds, and they really didn’t connect to the real internet, either.

Arguably CompuServe broke out of the walled garden a little bit in the late 80s, when it added the ability to email anyone, and that was a big deal. Before then I could only message other CompuServe users; after, I could email more or less anyone in the world with an email address.
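If memory serves, the gateway worked by address translation: my numeric CompuServe ID, something like 71234,567 (a made-up example), was reachable from the outside as 71234.567@compuserve.com, and to send mail outward you addressed it to >INTERNET:someone@example.com. Clunky, but it punched the first real hole in the wall.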

While the internet predated the WWW, it was the web and Berners-Lee that really changed this model. By creating publicly accessible web sites, which providers like CompuServe were very quick to open up to their users, a lot of people who had been running around in their own segregated gardens were now hitting web servers hosted on the honest-to-god internet of legend, previously only spoken of in tech magazines. It wasn’t until a few years later, when they removed a few rules that were a holdover from the internet as a purely government-sponsored and later academic network, that what I consider the “internet as we know it” became truly mainstream. Because as a computer user before then, I may have been connecting to other computers, but I wasn’t connecting to the internet until the web created a real possibility for that to happen. [Technically CompuServe was actually using the IP protocol on the back end to connect its many networks together and let all of its users connect and interact, but it was still closed off from the “real” internet of universities and research institutes.]

So in a sense there’s a very good reason most people confuse the creation of the WWW with the creation of the internet: the WWW becoming popular is exactly what let most ordinary people actually access resources and content that lived on the internet.

Great post, Martin Hyde!

That’s what I thought, too – I didn’t realize it was an actual set of protocols rather than just a vague design philosophy.

But Derleth’s link and posts summarize the situation well. There is no need to keep spreading that not-quite-accurate model when explaining “layers of protocols”. It would just be more dry technical/historical reading that won’t really help people understand the actual Internet, since that model is no longer in use.

Fascinating read, and a great explanation of how the WWW was/is different from the Internet. Thanks for sharing!

And just to co-reminisce/stealth-brag… :smiley:

I vaguely remember being in 5th grade or so, hacking our school’s computers by force-quitting their security software using Word macros and trying to figure out what the hell “dim” meant in VBasic. A year or two after that I became the sysop of my own BBS (which nobody used, because I was a 10-year-old with more modems than friends, but Wildcat was fun to play with), and I thought ZModem was the most amazing protocol ever: resumable downloads meant the world on 14.4k modems!

Then when I was 11 years old, my mother (who worked at IBM at the time) showed me something called Netscape Navigator at an office. It had an actual GUI and even supported the mouse (which before then was mainly useful for Minesweeper and Microsoft Word), and you could click things and text and images would magically pop up out of nowhere. It was actual information from the other side of the world, not just stupid messages left on a BBS message board!

She gave me a copy of OS/2 Warp to take home and I installed it on our 386, then she signed me up for IBM’s early employee ISP and let me loose on the early Web. I chose the email address “username@ibm.net” and it confused quite a few of her coworkers who insisted that could not possibly be a real email address, until they actually emailed me. I guess that was the era before trolling became an official phenomenon.

After a few days playing around with the Netscape Fishcam and other moderately more useful sites, I distinctly remember asking my mom, “Mommy, how come every store and every person doesn’t have their own web page? This is such a useful thing… don’t you think this would change the world if more people knew about it?”

And before long came AltaVista, and in a few short months the scope of human existence exploded far beyond what we imagined possible with Yahoo! (which at the time was manually curated by humans). And not long after that GeoCities rose to prominence, carrying with it a plague of animated GIFs but also an endless sea of knowledge theretofore limited to corporations and formal academics with institutional hosting.

I don’t think even she, immersed in the IT culture as she was, could’ve predicted how much, and how quickly, human history would come to be shaped by those few years. From then on the progress of our species would be seen in hertz and bytes, and the Earth would come to be measured not in miles but in milliseconds. Yes, a new Age had already dawned by this point and we all knew it, but what held us breathless was watching it mature into adolescence and take its first running steps into a brave new world. There was a collective sense that the peoples of the world were bearing live witness to a part of history perhaps more significant than anything before it. Would it mean world peace at last? Would we soon shed our bodies? Know everything?

In retrospect it might seem silly now to have had such high hopes for a global advertising platform ruled by a fruit and a big number, but however you spin it, at least a framework had been laid, distances have forever shrunk, and our species is more connected (if not united) than ever before. Someday the history books might look back at the early 90s and wonder “Why did they ever separate time into BC and AD, instead of before and after the WWW?”

Funny, but I also grew up through these times. I have a different perception. Possibly based upon seeing things happen from a slightly different viewpoint.

Back when I was an undergraduate, networking was something we did between specific machines. We had a row of Vaxen, and they ran DECnet. There were various institutions that ran IP-based networks between cities (CSIRO did). As technology advanced we got personal workstations; I remember my first, a Sun-2/50. That was pretty spiffy. Then in about October 1986 Australia connected permanently to the Internet, at that time over a satellite link of pretty woeful bandwidth, but instead of intermittent connections to transfer mail, we had IP. That first day the attacks on the Suns started, probing the well-known exploits. But it was great: we had ftp access, nntp brought with it a huge time sink, and email was close to instantaneous. The first spam was not too long in arriving, and such was the novelty that it was greeted with some amusement.

When the WWW came along, I will admit to being, well, happy that there was a neat GUI layer over what I was doing, but in the end, not much changed. You can point and click, but for anyone who grew up with a CLI, the gestalt is that you are still triggering basically the same protocols. I’ll still use ftp, curl, and wget, quite cheerfully when I need to.

What has transformed things is scale. Back then, there were essentially only universities and major companies (those with serious technical research interests) on the Internet. It was a bit like being the only person with a telephone: it was only useful when you needed to interact with other researchers. Then came a cascade of critical masses, as new groups got onto the Internet and new areas opened up. With scale came issues of managing scale, and the search engines bridged the gap. AltaVista was a huge step.

The next big thing is interconnectivity of things. The grey goo is not far behind.

Yeah, for me I was a bit too old to have that experience in school. We had a mainframe with awesome access for the 70s. I know many people my age who went to larger state and even private schools where mostly only people in highly technical majors had any mainframe access, and even then it might be limited to being allowed to submit punch cards for the mainframe techs to compile. At USMA we actually had terminals all over campus, and for a school with only a few thousand enrolled it was pretty amazing that we had a mainframe primarily for the use of cadets. [It was also used for grading papers and a few other administrative tasks, which led to a minor scandal when a few cadets found out how to modify their grades on the system.] Although now that I think about it, I don’t think we used “real” FORTRAN. I think (like many schools have done) the academy actually created its own programming language derived from FORTRAN, a bit “watered down” so that students trying to submit a program to, say, calculate the answer to a math assignment didn’t have to learn full FORTRAN.

I thought that was the height of awesome at the time, but it predated ARPANET / NSFNET being widely available to universities and students, so I never really had any access to the internet until the web opened it up. The internet as it was then was like a fancy road, and it was deliberately kept closed. There weren’t any commercial options for individuals to get internet access until the very late 80s, when CompuServe and its competitors first started to get limited connectivity with the wider internet.

When and why did the military start to de-emphasize that aspect? New recruits these days don’t do any programming anymore, right?

It might be worthwhile to distinguish the various types of computing that you found in the 1970s, since it all gets muddled today when everything is connected.

First, we had standalone computers. I learned to program on an LGP-21 in high school, which was single-user and not connected to anything else.
Then there were mainframes, which often did one job at a time, but which were so expensive that you didn’t talk to them directly; you sent jobs in on card decks. Actually, those with the right equipment could submit jobs from files to the 370 we had at Illinois, very handy for my compiler class.
Then there were timesharing systems. One of my CS classes during my first term at MIT in 1969 was on Multics, which was still being developed. It was not the first timesharing system by any means.
The Arpanet was the next stage, and was around by 1973. One of my professors defined it as a way for MIT students to send “foo” to Stanford and for them to send “bar” back. In 1975 or so we used the Arpanet from Illinois to play the Stanford paranoid person simulator Parry.
By the late 1970s you could use modems to log into lots of computers - I sometimes used a terminal from home to log into our Multics system.
By the early 1980s the internet was working, though you needed explicit paths to send mail through it, there being no DNS yet. Usenet was there by the late 1980s also. The walled-off domains of CompuServe and AOL were not the net, or not very much of it. While a lot of people owned PCs, there was the question of what they did with them besides writing and printing out letters, filing recipes, and playing some games.
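(To give a flavor of “explicit paths”: if I remember right, hosts lived in a flat HOSTS.TXT table rather than in DNS, and mail outside your own network often had to be routed by hand, either with a UUCP-style bang path such as ihnp4!somehost!user or with the “%” hack, user%target@relay. The host names there are just examples.)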

The explosion came from the combination of lowering the price point of home PCs to under $1K and more ISPs to give you access. These often started small. But soon Metcalfe’s Law really took over, the value became greater than the cost of a PC and connectivity, and the rest is history.
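For anyone who hasn’t run into Metcalfe’s Law: the rough idea is that a network’s value grows with the number of possible connections between its users, about n(n-1)/2 for n users, while the cost of joining grows only linearly. A purely back-of-the-envelope sketch (Python, with made-up user counts, just to show the shape of the curve):

    # possible pairwise links among n connected users
    def links(n):
        return n * (n - 1) // 2

    print(links(1_000))    # 499,500
    print(links(100_000))  # 4,999,950,000

Every tenfold growth in users meant roughly a hundredfold growth in potential connections, which is why the value so quickly swamped the under-$1K cost of a PC and a dial-up account.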

I don’t know about the military, but my daughter’s CS 101 class basically didn’t have any programming. It had how to use ftp, a little bit of HTML, and fill-in-the-blank JavaScript, which hardly counts. CS 101 in my day was Fortran; then we who taught more advanced classes had to break them of bad habits.

The death of true computing came when Sun stopped shipping a C compiler with workstations.

The USMA is designed to train the top officers of the “next generation” of the Army and, as part of that, to give a college education to Army officer candidates (cadets). When you graduate you are given a B.S. and a commission as a 2LT in the U.S. Army, and you are on a relatively fast track to making Captain. It’s very different from enlisting through a recruiter or even going in as an officer through ROTC or OCS.

The Army wanted its top leaders to have programming exposure, and historically almost all of the top Army leadership came from West Point. I don’t know if there is a comprehensive push today to make sure every cadet gets exposed to programming as there was in my time, though. Probably not, but there is a long history of it being a technically oriented school (it has focused on engineering for its academic side since it started in the early 19th century, and it even developed common curricula for engineering schools that were later copied across the country).

And then SKYNET became self-aware.
:frowning:

If so, it was revived when Linux and the open-source BSDs came along in the 1990s and gave average people access to systems much better than the old closed-source Unix clones ever were.

Anyway, a great place to get access to pre-Internet and non-Internet home computer network information is textfiles.com, which is a massive archive of mostly text files but also pictures, sounds, software, and other stuff.

The thing about Sun killing the C compiler in a standard release was that it was “no longer needed” and thus they could unbundle it and charge money for it.

The logic for it being previously needed was that, back then, a kernel reconfiguration required you to recompile some components. Hard to believe now, but that was the way it worked. Once they got the kernel to the point where this wasn’t necessary anymore, there was no need for the compiler in day to day operations or management of the computer. So some bright sparc decided to make some more money by unbundling it. Universities still got it as part of their licence.

What the policy did represent was an attitude change. The computer was now a useful device without the tools needed to write your own code. That was for many a sea change, and, as Voyager alludes to, a point where attitudes toward computers and users changed for the worse.

The open-sourcedness of stuff was not the issue. There was open source long before the term was invented. Not only UNIX: my PhD work involved rewriting the Zurich Pascal compiler, which was freely available.

Sun’s stopping of shipping the compiler indicated that most people who bought Sun workstations did not know how to program - for instance people who used them for CAD or EDA work, or Wall Street people. That was the big change.

Man, you sound old and crotchety!
Get off my SPARC, you damn kid!
I’m confident that there are WAY more programmers today than there were 20 years ago. Maybe not people writing Fortran or C number-crunchers to run on a mainframe, but programmers nonetheless. Just look at how many apps are available for the iPhone (over 1,000,000). Writing such an app requires knowledge of Objective-C, object-oriented program design, and UI design. That is a lot more involved than writing a simple C utility or a shell script!

Heh. My first Sun workstation was a Sun-3, which still used the Motorola processor. I’m not complaining. I’m sliding into a well-paid senility by writing 100-line Perl scripts which do things that seem miraculous to the non-programming engineers in my group.

I’m sure you’re right about absolute quantities. But only a tiny percentage of computer users these days know how to program - or have a clue about what goes on inside.
The fact that being “computer literate” today means you know how to use Word, and maybe Excel if you are really advanced, is the reason so few people have any clue at all about what is going on inside their machines. I’m far from being an expert on Windows, but I can debug quite nasty problems with my basic knowledge of architecture, programming, and operating systems. Maybe some day PCs will get bulletproof enough that this doesn’t matter, but it hasn’t happened yet. I’m sure just about anyone with decent real computer skills has heard “the internet has crashed, do something about it!”

True but irrelevant: Open Source was and is important not just because source code is available but because the source is available online, to a large community of programmers and users. That speeds development cycles, smokes out bugs, and turns problems which could lead to a project being killed into problems which lead to, at worst, a fork.

This is inevitable when a product becomes useful for multiple groups of people. It only kills the development world if the product is closed and people have a tough time getting development tools to run on it again. The majority of people will never be developers, or they won’t be as the culture and tools stand now. Most people can’t fix cars, either. Therefore, getting a larger fraction of everybody into your system means most of them are going to be different from the people who built it to begin with.

Jason Scott has an interesting take on this from the perspective of the Internet and who uses it now. The frame story is about him goatse’ing MySpace, but that image appears nowhere in the linked article.

I don’t entirely disagree with you, but I’ll just go on record here as saying that your position seems entirely TCP-centric and overly hostile to OSI. Yes, clearly TCP/IP won and OSI lost, but to some extent it was the Betamax syndrome, where the better technology lost for bureaucratic/political/marketing reasons. TCP/IP had the advantage of being driven by DARPA and enjoying very early deployment in the US, while OSI had no such focused support in Europe.

The seven-layer OSI model was supremely elegant. The network programmer should care because every important network function was isolated as a unique level of abstraction independent of the others. OSI had far more robust routing protocols, more comprehensive name services, much stronger security, and an absolutely massive address space even at a time when Class A IP addresses were being handed out like candy because it was thought that a 32-bit address ought to be enough to last until the end of time. When a typical IP host might cost a couple of million bucks and an IMP alone a few hundred thousand, that seemed like a reasonable assumption. So did the almost complete absence of security. After all, if MIT couldn’t trust Stanford not to spam them with offers for penis enlargement aids, who could you trust? Chinese hackers and mailbots weren’t in the picture.
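(To put numbers on the address-space point: 32 bits gives only about 4.3 billion addresses in total, and every Class A handed out was a 2^24 block, nearly 16.8 million addresses gone in a single allocation.)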

You can argue that OSI was “overly complex”, and I’ll counter with the argument that TCP/IP was insufficiently architected, more a “seat-of-the-pants” implementation that was patched and kludged as it went along. Both arguments are somewhat hyperbolic, but that’s more or less the flavor of how it went down.

As I said, you make some reasonable points, but let’s keep history in perspective. OSI was not some lumbering behemoth like the Spruce Goose. It just never had a chance in the marketplace once the ARPAnet took off.

Any successful Internet protocol by its nature must be a seat-of-the-pants implementation. The real trick was in designing something robust enough that it could survive that seat-of-the-pantsing.

Betamax wasn’t actually better. It had much reduced recording time compared to VHS and the quality was no better on the TVs people actually had back then. I know this is a tangent, but it’s an important one: We must judge technologies in the context of the times, not in the context of later on.

And the context of the time was that all ‘real’ networking projects had Large Serious Groups behind them. OSI was being championed by the likes of GM and IBM, in addition to being standardized by CCITT (now ITU-T) and ISO. The small players, the individuals and small companies, were the microcomputer users with BBSes linked together via Fidonet and a number of even more obscure protocols that routed data across the phone lines. The small players’ networks sank without much of a trace.

The Application and Presentation layers being separate is a design decision made by people who are rarely network programmers. One of the cleanest separations between Application and Presentation is seen in creating web pages for desktop vs mobile, and that only works as well as the people writing the documents (not the networking code) allow it to. Persisting in acting like it’s up to the network programmers is bad engineering, and persisting in teaching it as if it were is bad pedagogy.

Fair enough. IPv6 solves this problem, however, and it’s gained enough traction that Comcast is trumpeting about how fast they’re rolling it out.
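(IPv6 addresses are 128 bits, roughly 3.4 x 10^38 of them, so running short of address space should not be a worry again for a very long time.)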

The problem with this thinking is that security is a social problem. What gets secured (are addresses kept secure, or is it like the postal service where the police have a right to look at addresses without a warrant?), what trust looks like (do I need to explicitly trust everyone I share a document with?), and even how secure things are allowed to be (remember ITAR?) are all questions well outside of the bailiwick of networking programmers, and thus out of scope for standardizing the bare task of moving bits around.

So this isn’t relevant to OSI vs TCP/IP. It’s relevant to the social structure that you apparently want to tie to OSI, and I’d argue that that’s orthogonal. You can implement either in terms of any social structure, and the actual way the network is used would reflect the pre-existing social structure regardless of the developers’ or standards officials’ original intent.

In short: Even if the email address has a field which apparently indicates ‘surname’, what’s actually put there (if anything) is up to the people involved.
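(The OSI mail standard, X.400, really did spell addresses out as labelled fields, something like G=Jane;S=Doe;O=SomeOrg;C=US, with made-up values here, and yet what actually went into those fields was still entirely up to the people involved.)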

“Rough consensus and running code” got us a stable system then, and it’s allowed us to modify the system ‘in-flight’ to keep it working and improve its features. Change doesn’t happen quickly and it doesn’t happen evenly, but that’s true of any large, distributed system lacking a top-down dictatorship.

OSI protocols are certainly practically useful and are used in limited contexts. However, being the first mover is a technical feature too, and it speaks to how simple and practical the protocols are to implement.

That is the current model, but the open source model predates online distribution and collaboration. What is BSD but a fork? The first copy I got of Usenet software came on a big magnetic tape, and there were tons of variants of emacs and shells out there before the Web.
The real differentiator of open source is its openness, not its development. Besides software there is also an open source hardware movement (at opencores.org) which has a lot of the same benefits. I even convinced some people to open up some obsolete chip designs, though unfortunately they were not very good chip designs.

Since I’ve worked in computer design for the past 20 years I owe my paycheck to lots of people who are not developers buying computers. But that was not always true, and the C compiler was an indicator of the change.
And get off my disk! :smiley: