How long do you think the WWW software will be in use?

Could you see the World Wide Web protocol, or HTTP/HTML, becoming extinct anytime soon? So far, it’s had a 24-year run and works largely the same way it did in the early 1990s, though of course there have been improvements like JavaScript and much higher speeds.

Any time soon? No. There’s just too much out there; even if new protocols and languages become popular, the old ones will need to be supported for quite a while. Communication protocols will go first as the technology improves, but HTML and related content will survive even if it’s being converted to some other form at the source. Eventually all that content may be converted to something else, but we’re not talking soon. It will all still be in use 10 years from now; maybe communication will be rapidly evolving into something new at that point, and 20 years out the form of the content may be radically different.

JavaScript is an improvement?

It will be in use as long as there is an Internet.
Of course, even in a few years, the current standards may be almost completely incompatible with today’s browsers. But it will be an evolutionary change, not some overnight switch.
Just try to browse a modern website with a 1990s-vintage IE or Mozilla.

Moderator Action

This is more speculation and opinion than factual.

Moving thread from General Questions to In My Humble Opinion.

I also see it slowly evolving. I can’t imagine anything replacing it in the foreseeable future.

It works so well that Chrome has decided to completely dump it. :stuck_out_tongue:

I’m still waiting for people to stop using FTP.

Hey, it could happen. Telnet finally died.

But nah, HTTP probably isn’t going anywhere fast. The package managers for the major Linux distros all use it as the default communication protocol, and I don’t see a good reason for that to change in the near future.

W3C originally intended HTML 4 to be the last version. (Or 4.01 as it turned out.)

The standards system was jiggered so that development on HTML 5 could proceed. It was finally issued as a recommendation a mere 17 years after version 4. By Internet standards, that’s forever.

Other, more CSS/XML-oriented things (XHTML, for one) were supposed to replace it entirely. Didn’t happen.

Certain things have finally been going away lately. Browsers will soon stop supporting the old Netscape plugin interface. Things like that.

If your web browser isn’t WebKit based, God help your soul.

(And Telnet isn’t dead. I’ve used it in the past couple months. More so than FTP.)

One thing we might be heading towards is a more “locked down” world. E.g., browsers only accepting signed extensions from their own app stores. The next generation of CPUs will have serious built-in support for DRM, so expect a lot of web content to start requiring that hardware.

Too much of the current Web is “open” to suit content providers, and many of them want a significantly changed way of doing things. Restrictions on which websites you can access, which software can be used to access them, and so on may be the future. E.g., say goodbye to ad blockers.

Keep in mind that the Web isn’t some monolithic thing that can be swapped out in one fell swoop. The “web protocol” isn’t just HTML or even HTTP, but rather a humongous “stack” of interdependent technologies. Most of these technologies are constantly evolving, separately but simultaneously. As any one piece receives incremental upgrades, it still has to maintain compatibility with the other pieces in use, even as the other pieces follow their own distinct upgrade schedules.

Consider all the different things in use in just a typical browsing session: On the user side, the browser will have its own bleeding-edge interpretation of HTML/JavaScript/CSS that only partially overlaps with the current draft standards. This partial overlap is because these three technologies (in particular) see very rapid improvements to meet the needs of evolving web apps, and Web standards bodies and various competitors don’t always keep up with one another.

On the transport side, between the user and the server, there are improvements like Google’s SPDY and HTTP/2, the invisible migrations from SSL 2/3 to TLS, various improvements to the domain lookup system (which translates “www.google.com” into a machine-readable address), the migration to IPv6 (so that individual computers or servers can be more easily reached without needing the machine equivalent of a very complex PO box system), the rise of content delivery networks that intelligently figure out where you are and serve content to you from the closest edge node, etc.
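
To make that layering concrete, here’s a minimal sketch (Python standard library only; the hostname is just an example) of the separate transport-side pieces a single page load touches - DNS, TLS, and HTTP - each of which can be upgraded independently of the others:

```python
import http.client
import socket
import ssl

host = "www.google.com"  # example hostname only

# 1. Domain lookup: resolve the name to machine-readable addresses (IPv4 and/or IPv6).
for family, _, _, _, sockaddr in socket.getaddrinfo(host, 443):
    print(family.name, sockaddr[0])

# 2. Secure transport: negotiate whatever TLS version both ends support.
context = ssl.create_default_context()

# 3. Application protocol: plain old HTTP riding on top of that secure connection.
conn = http.client.HTTPSConnection(host, context=context)
conn.request("GET", "/")
response = conn.getresponse()
print(response.status, response.version)  # version 11 means HTTP/1.1
conn.close()
```

Swap out any one layer (a new DNS record type, a newer TLS version, HTTP/2) and the other two keep working unchanged.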

And finally, on the server side, the actual web servers have changed quite a bit (Netscape Server, Apache, IIS, nginx), single machines are increasingly being replaced by clusters of virtual machines in the cloud, and the very languages used to write webpages keep being changed and iterated upon (HTML, Perl, PHP, Ruby, Java, Python, .NET, everything in JavaScript, Google’s and Apple’s own languages, etc.).

Taken together, this means that the Web IS constantly being improved. It may seem like nothing is really changing, but that’s only because the various players take great care to make sure their upgrades still work with other preexisting pieces of the puzzle and that end-users are minimally affected beyond needing to upgrade their browser and operating system every few years. Behind the scenes, a LOT is actually going on all the time. In fact it is very difficult, as a web developer, to keep up with all the changes.

As an analogy, it’s like asking if the highway system will ever be replaced. As long as land-based traffic is relevant, why would you do any such thing? If particular cars are broken, replace just the cars. If the street addresses and directions are unclear, improve the routing system. If the pavement is unreliable, make that better one patch at a time. If the traffic is terrible, create additional routes or change driving habits. The same philosophies largely hold true for the WWW: when you want to make it better, you don’t just scrap the whole thing and start from scratch; you modularly improve the broken pieces.

The WWW is so modular, and so flexible, that completely different universes can be built on it. iOS and Android apps largely use HTTP, with some use of HTML and JavaScript, even though the end-user sees it all through a proprietary app instead of a generic web browser. Video games use the same internet routing as the Web, though they typically go lower-level than HTTP. HTTP isn’t the best protocol for streaming video, but somehow the streamers made it work. And through JavaScript, real-time two-way communication was hacked on top of HTTP; in practice this means you can have complex apps like Gmail or Google Docs or even entire video games that are really just webpages underneath, but act like mostly-functional desktop apps. This would not have been possible in the early 90s without an abstraction layer on top like Java or Flash, which were themselves iterations upon the other pieces of the WWW stack.
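
To give a flavor of the “hacked on top of HTTP” part: one common trick is long polling, where the client keeps asking the server over ordinary HTTP and the server holds each request open until it has something new to report. In a browser this is done from JavaScript; the same idea sketched in Python, against a purely hypothetical endpoint, looks roughly like this:

```python
import json
import urllib.request

ENDPOINT = "https://example.com/updates"  # hypothetical long-poll endpoint
last_seen = 0

while True:
    # The server is assumed to hold the request open (up to ~30 s) until an event
    # newer than `last_seen` exists; an empty list means the wait simply timed out.
    url = f"{ENDPOINT}?since={last_seen}"
    with urllib.request.urlopen(url, timeout=35) as resp:
        events = json.load(resp)
    for event in events:
        last_seen = max(last_seen, event["id"])
        print("new event:", event)
```

WebSockets later made this kind of thing a first-class feature, but the long-polling version still runs over nothing more exotic than plain HTTP requests.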

Soo… barring some catastrophic extinction event that wipes out most of the world’s sysadmins and programmers at once, this whole stack isn’t going away anytime soon. But pieces of it have been and will continue to be replaced, mostly invisibly to the end-users, but quickly and dramatically nonetheless.

Wait … and I can hear the whooshing sound already … but, what?

I have nothing to contribute, but I want to recognize the high quality of your post. Very good examples and clear analogies. :cool:

Most big sites use https because they believe it gives their site mystical powers.

But the other way around - browsing an extremely minimal website with a modern browser - works well enough. And that’s the important thing, because of the Internet of Things. At some point - which we may already be past - there will be more embedded systems than there are desktops and traditional servers, and they can’t all be updated to use the latest trendy technology.
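
For a sense of scale, the web interface on an embedded gadget can be a handful of lines serving static, 1990s-style HTML over HTTP/1.0, and any modern browser will still render it happily. A purely hypothetical sketch in Python:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Made-up status page for an imaginary device; real firmware would fill in live values.
PAGE = b"<html><body><h1>Pump status</h1><p>Pressure: 42 psi</p></body></html>"

class StatusHandler(BaseHTTPRequestHandler):
    # BaseHTTPRequestHandler speaks HTTP/1.0 by default - ancient, but still understood.
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```

Nothing in that needs to change for the device to keep working with whatever browsers look like a decade from now.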

Pile on: Nice post, Reply.

Thank you both :slight_smile:

Do you mean it was effectively replaced with ssh? In that case I suppose it’s technically pretty accurate. I’d certainly be dubious if I had to telnet into a computer instead of ssh. But I ssh daily. On work days, anyway. So it’s far from dead.

I still use Telnet and FTP with some frequency because they’re available and they work, but the opportunities to use unsecured communications are diminishing rapidly. SFTP and FTPS will remain in use for a while. Telnet is fading because it’s mostly useful for legacy interfaces that are disappearing, and even where those interfaces remain, Telnet is rarely enabled on the servers anymore for security reasons. But the secure offshoots of these methods are in heavy use.
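
To put the contrast in concrete terms: plain FTP sends the password and the data in the clear, while SFTP does the same job inside an SSH session. A rough sketch with placeholder hosts and credentials (ftplib is in the Python standard library; paramiko is a common third-party SSH library):

```python
from ftplib import FTP

import paramiko  # third-party: pip install paramiko

# Old-school, unencrypted FTP - everything, including the password, crosses
# the wire as plain text.
ftp = FTP("ftp.example.com")        # placeholder host
ftp.login("user", "password")       # placeholder credentials
with open("report.txt", "wb") as f:
    ftp.retrbinary("RETR report.txt", f.write)
ftp.quit()

# The secure offshoot: SFTP, i.e. the same kind of file transfer tunneled through SSH.
client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect("sftp.example.com", username="user", password="password")
sftp = client.open_sftp()
sftp.get("report.txt", "report.txt")
sftp.close()
client.close()
```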

For what it’s worth, Adobe has essentially made two bids at replacing HTML: once with Flash and again with PDF. Both support the features you expect from HTML, like hyperlinks between documents, so in theory you could build a whole website out of Flash. Back in the day, some people actually did this, and then they discovered that building a non-standard website had a host of disadvantages: search engines couldn’t index you, and a big chunk of users weren’t running the right software.

So… it’s not impossible that HTML might be replaced by something entirely different.

On the other hand, HTML is just a markup language. I’ve heard it described as a sibling of XML - strictly speaking, both descend from SGML, and XHTML is HTML recast as XML. If you think of it that way, then it’s actually very likely that (?)ML takes over everywhere - the new Microsoft Office docs now use an XML format, for example. Markup languages are very powerful tools for sharing information across computers/platforms because they are ultimately just text files with instructions for how to handle the data. As long as everyone interprets the instructions in the same way, they all end up with the same result.
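
A tiny illustration of that last point - the document is nothing but text plus instructions, so any program that follows the instructions recovers the same structure. Parsing a made-up XML snippet with Python’s standard-library parser:

```python
import xml.etree.ElementTree as ET

# A made-up document: plain text, with tags describing how to interpret it.
document = """
<invoice id="42">
  <customer>Acme Corp</customer>
  <line item="widget" qty="3" price="9.99"/>
  <line item="gadget" qty="1" price="24.50"/>
</invoice>
"""

root = ET.fromstring(document)
print(root.tag, root.attrib["id"])           # invoice 42
for line in root.findall("line"):
    qty = int(line.attrib["qty"])
    price = float(line.attrib["price"])
    print(line.attrib["item"], qty * price)  # widget 29.97, then gadget 24.5
```

Any other XML parser, in any language, reading those same bytes ends up with the same invoice.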

As another example: more and more desktop applications are actually using HTML for their own interfaces. QuickBooks, for example, is dependent on Internet Explorer for many features because they’re just embedded web pages. If you want to see this, try turning IE security settings to their absolute maximum and see how much of QuickBooks you’ve now broken.

Do you think people will still be using Tim Berners-Lee’s code even 100 years from now?

The Web will be in use for the foreseeable future. It has a huge installed base (pre-existing users of the software and documents created for it), and that installed base is an obstacle the Web itself never had to overcome on its way to becoming dominant.

The Web wasn’t the first hypertext system, depending on how you define the term. It wasn’t even the first hypertext system on the Internet; Gopher has a better claim to that. Gopher was invented by people at the University of Minnesota and was designed to be menu-based and text-file-centric, although of course Gopher can (and does) serve HTML just fine. The problem with Gopher was that the University of Minnesota threatened to charge people for the privilege of implementing Gopher servers, which slowed its growth right as the first Web software coming out of CERN was beginning to hit the big time. And since the NCSA (National Center for Supercomputing Applications) had no intention of charging people to make Web servers, the Web had a huge advantage at the critical period when people outside the usual large-corporation/large-university/large-government sphere were beginning to come online.

My point is, the Web expanded into virgin territory, and anything coming up now will have to compete in a world where everyone knows what the Web is.