I’ve been seeing this idea more and more often, and I just don’t get it. “The Internet”, at its heart, is nothing more than a set of protocols specifying how data is transferred among computers. A short list of things that are not “scrapping the Internet”:
[ol]
[li]Implementing quality of service (QoS) as part of the core routing algorithms[/li]
[li]Setting up mandatory filtering (censorship)[/li]
[li]Creating the hardware infrastructure / technology to supply the “last mile”[/li]
[li]Making better use of data caches and P2P protocols[/li]
[/ol]
What the hell are they talking about? I ask because articles like this always cite people like Vint Cerf and Larry Roberts (from the linked article: “who, in 1969, managed the Pentagon’s ARPAnet, the precursor to the Internet”). Surely, if anyone would understand what “scrapping the Internet” meant, it would be them. Is it reporting sensationalism? Corporate whoring? Or am I missing something?
An even better question: what might actually qualify as “scrapping the Internet”?
I wonder about certain ancient deep structures in how the Internet works that can limit how efficiently new content moves (à la all the DOS/Windows baggage we still have to deal with on PCs). Someone more tech-savvy than I am can chime in on this thought anytime.
This would be an example of what I’m asking about, as it was mentioned in another article a couple of weeks ago. Here are my thoughts:
Spam is one particular type of email message. Controlling spam has nothing that I can see to do with basic Internet operation – it’s an application-level problem, nothing more. A solution would have to take one of three forms: a whitelist (only allow email from a specified set of senders), a blacklist (do not allow email from any of a specified set of senders), or content filtering (block individual messages based on what they say). Each approach already has working implementations; evidently, all are bad enough that there are calls to “scrap the Internet”.
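For concreteness, here’s a minimal Python sketch of those three approaches. The addresses and keywords are made up for illustration; a real filter like SpamAssassin scores many signals rather than applying hard rules like these:

[code]
# Toy versions of the three filtering strategies described above.
WHITELIST = {"alice@example.com", "bob@example.org"}
BLACKLIST = {"spammer@junkmail.example"}
SPAM_KEYWORDS = ("free money", "act now", "v1agra")

def allowed_by_whitelist(sender):
    # Only accept mail from a specified set of senders.
    return sender in WHITELIST

def allowed_by_blacklist(sender):
    # Accept mail from anyone except a specified set of senders.
    return sender not in BLACKLIST

def allowed_by_content(body):
    # Block individual messages based on what they say.
    lowered = body.lower()
    return not any(kw in lowered for kw in SPAM_KEYWORDS)

print(allowed_by_whitelist("alice@example.com"))         # True
print(allowed_by_blacklist("spammer@junkmail.example"))  # False
print(allowed_by_content("FREE MONEY inside!!!"))        # False
[/code]

And that’s the point: whitelists reject legitimate strangers, blacklists lag behind new senders, content filters get gamed (hence “v1agra”), and none of that changes no matter what the network underneath looks like.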
Now, I readily admit that there might well be a modification or revamping that would solve the issue. If I knew what it was, my guess is that I could make a mint (or at least a name and immediate position for myself in academia). But I believe things like this are being cited as problems that no amount of tinkering with the foundations will solve.
I’m not sure, but it sounds like you’re thinking of the upper levels of an OS’s network (protocol) stack. That is, the efficiency of getting data from the transport layer to the application layer on an end system. I recall covering the x-kernel in an OS class; I had to look up the name, but what I took away from the topic was the importance of proper buffering and the cost of copying data within a system. I’m sure I’m not doing it justice, but it’s the best I’ve got.
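To make the copy-cost point concrete, here’s a toy Python comparison. It has nothing to do with the x-kernel itself; memoryview just stands in for the zero-copy buffering trick a good stack plays when handing data between layers:

[code]
import time

payload = bytes(64 * 1024 * 1024)   # pretend this is a 64 MB receive buffer

start = time.perf_counter()
for _ in range(100):
    chunk = payload[1024:]          # slicing bytes copies ~64 MB each pass
copy_time = time.perf_counter() - start

view = memoryview(payload)
start = time.perf_counter()
for _ in range(100):
    chunk = view[1024:]             # a view just adjusts offsets; no copy
view_time = time.perf_counter() - start

print(f"with copies: {copy_time:.3f}s   with views: {view_time:.3f}s")
[/code]

The view version should come out a couple of orders of magnitude faster, and a stack that hands every packet between layers faces the same arithmetic.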
As an aside, AFAIK the most common network stack derives from BSD 4.3. It would seem that although MS used to use it, they’ve included a “new” stack with Vista (scare quotes for a reason). An interesting tidbit I picked up just now…
God, imagine the productivity! I don’t know what the hell I’d do when I don’t want to get anything done - start smoking, I guess, or start caring about sports. We’d have to get a water cooler.
Throwing out all the IPv4 routers and replacing them with IPv6 equipment might be considered a start.
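To be fair, the application-visible part of that change is tiny; what breaks is the equipment in between. A quick Python illustration (this assumes your host and resolver are dual-stack):

[code]
import socket

# getaddrinfo returns answers from both address families on a dual-stack host.
for family, _, _, _, sockaddr in socket.getaddrinfo("example.com", 80,
                                                    proto=socket.IPPROTO_TCP):
    label = "IPv4" if family == socket.AF_INET else "IPv6"
    print(label, sockaddr[0])

# An IPv6 socket is one constant away; the hard part is that every router
# and middlebox along the path has to speak it too.
s = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
s.close()
[/code]

The scrapping happens in the middle of the network, not at the edges.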
Replacing SMTP with something new and authenticated to prevent spam is another.
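What “authenticated” might look like, very roughly: the sending domain signs the envelope and the receiver verifies it before accepting. This is purely a sketch; real proposals along these lines (DKIM, for instance) use public-key signatures published in DNS, so a made-up shared secret stands in for that here:

[code]
import hashlib
import hmac

DOMAIN_KEY = b"example.com-signing-key"   # made-up shared secret

def sign_envelope(sender, body):
    msg = (sender + "\n" + body).encode()
    return hmac.new(DOMAIN_KEY, msg, hashlib.sha256).hexdigest()

def verify_envelope(sender, body, signature):
    return hmac.compare_digest(sign_envelope(sender, body), signature)

sig = sign_envelope("alice@example.com", "Hello!")
print(verify_envelope("alice@example.com", "Hello!", sig))   # True
print(verify_envelope("forged@example.org", "Hello!", sig))  # False: forged
[/code]

The hard part isn’t the crypto; it’s getting every mail server on the planet to agree on key distribution, which is why this counts as scrapping rather than tinkering.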
Replacing DNS with a more robust and secure name system would also help.
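Same idea for names: a resolver that refuses any answer it can’t verify, which is roughly what DNSSEC does with chains of public-key signatures. Again a toy, with a single made-up zone key:

[code]
import hashlib
import hmac

ZONE_KEY = b"example-zone-key"   # made-up; DNSSEC uses a public-key chain

def publish(name, ip):
    # What the zone hands out: the record plus a signature over it.
    sig = hmac.new(ZONE_KEY, (name + "=" + ip).encode(),
                   hashlib.sha256).hexdigest()
    return (name, ip, sig)

def resolve(record):
    # The resolver only trusts answers whose signature checks out.
    name, ip, sig = record
    expected = hmac.new(ZONE_KEY, (name + "=" + ip).encode(),
                        hashlib.sha256).hexdigest()
    return ip if hmac.compare_digest(expected, sig) else None

good = publish("example.com", "93.184.216.34")
print(resolve(good))                              # the signed answer
spoofed = ("example.com", "10.0.0.1", good[2])    # cache-poisoning attempt
print(resolve(spoofed))                           # None: rejected
[/code]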
These are the sorts of changes that would break the current infrastructure, and require massive rebuilding work by ISPs. However, they would give us a more manageable and reliable internet.
I think si_blakely’s examples are exactly what is meant. You clearly don’t have to scrap the servers, the routers, or the interconnect; you just have to change the protocols from the bottom up, especially for authentication. Before the web, before domain addressing, you knew the machine all email came from, and no one I read ever thought about the consequences of letting the great unwashed in.
Scrapping it would require rewriting a hell of a lot of software. And it wouldn’t fix anything, since zombie bots are not an Internet issue but an issue of poor security on the part of a common OS (and, to some degree, of the cluelessness you’d expect when anyone and his mother gets on).