Overhauling the Internet

Well now, the problem is in the assumption that everything about the systems is incompatible.

Yes, they have different kernels and cannot run identical compiles of the same software, but that is not really what is important to the user. What keeps “minority” operating systems in the minority, IMO, is file incompatibility.

The “better mousetrap” here would be, for example, a non-MS-Word word-processing program for a different operating system that would correctly open “.doc” files, and so forth. Widespread availability of such programs would give users more leeway to switch operating systems when a better one came along, and you would not have so many identical machines waiting to succumb to the same virus, thereby “curing” the internet virus problem indirectly.

There has been some work done on this in the past, but nowhere near enough, and no one is to blame but the software developers for missing this opportunity.

A greater variety of machines would still make attractive targets, but the potential impact of any one virus would be smaller, which would still reduce the number of viruses out there. There are millions of Macs waiting to be scuttled, but there have been very few devastating Mac viruses nonetheless.

Gods, where to start on this one without getting overly technical. First off, you have mutually exclusive parameters. You want an open, free-flowing system, easily usable by the masses of people out there. Then you want to eliminate ‘abuses’ such as spam, viruses, pop-ups, etc.

Under the current system it can’t be done. Period. You COULD address some of them, and I certainly think the infrastructure side of the internet is long overdue for a serious overhaul (as well as the transport…IPv6, perhaps system-wide). This would allow for greater speed and (more importantly) more fault tolerance (the infrastructure overhaul, I mean), but it won’t really address the OP’s ‘problems’.

Generally, because of the very nature of the Internet, and the organizations that ‘control’ and ‘monitor’ it (lol), you will never be able to eliminate such things. To do so you would need to go to a vastly more restrictive (and centralized) system…and then it wouldn’t be free and open anymore. It wouldn’t be user-friendly anymore either. It would be hard then. People wouldn’t like it, because it wouldn’t be transparent to them as it is now.

So, what you will have instead is a constant game of catch-up. Anti-virus vendors will continually update their definitions (for a fee), software companies will write anti-spyware software (for a fee), spam blockers and filters will be continually updated (for a fee…I think you get the idea). More sophisticated users will take advantage of all this to protect themselves (I don’t have a problem with spam, pop-ups or viruses at all), less sophisticated users will have to slog through them. And the hackers out there (and virus writers, and spammers, and JavaScript gurus) will continue to push the envelope.

And you know what? It’s all good in the end, because the web is a free and open forum, easily usable and accessible by the common man all over the world, not like it used to be when it was only a few of us propeller-heads using Unix systems to play about. For all the bad, there is a hell of a lot more good IMO. Hell, look at this forum alone. :slight_smile:

-XT

Apart from the spam / abuse / security issues, if we were “redesigning” things from the ground up, there might be a larger issue to consider, as long as we’re talking grand overhauls.

What the average person thinks of as “the internet”[sup]1[/sup] has grown from a document-sharing and messaging system into a de facto application platform. It was never originally intended to serve in this capacity, or to this wide an audience, and things would have been designed much differently at the outset if we had known we were going to use it for commercial order handling, online HR applications and so on. Instead, we bolted on forms capabilities, cookies for session management, certificates, mail enablement, etc., as the need for them arose. It’s like a house that started out as a cute little cottage and has grown to mansion size by haphazard room additions. It functions, but it’s architecturally ugly, fragile, and hard to manage.

Given the chance to redo the web as a proper framework for applications running remotely through a lightweight client, including security / permission considerations, we would probably find that we would have better mechanisms to deal with the inevitable abuses. We ain’t going to get the chance.

In more practical terms, and in terms more specific to the OP, I more or less agree with micco. The crux of it is a legal / social issue, not primarily a technical one.

[sup]1[/sup] - more properly, the internet refers to a network using TCP/IP as a transport mechanism. What we are really discussing here are some applications built on top of that transport.

As I said, it’s easy to make exceptions for legitimate mass mailers and friends, simply by making a whitelist that contains their email addresses or public keys.
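To make that concrete, here’s a minimal sketch of such a whitelist check in Python. The data layout, the example entries, and the `is_whitelisted()` helper are all hypothetical; a real mail filter would hook into the mail client or server rather than stand alone like this.

```python
# Hypothetical whitelist for the scheme described above: accept mail from
# known addresses, or from messages signed with a known public key.

WHITELIST = {
    # Plain email addresses of trusted correspondents (made-up examples)
    "addresses": {"friend@example.com", "newsletter@example.org"},
    # PGP key fingerprints of trusted senders (made-up values)
    "fingerprints": {"A1B2C3D4E5F60718293A4B5C6D7E8F9012345678"},
}

def is_whitelisted(sender_address: str, key_fingerprint: str | None) -> bool:
    """Pass mail from a known address, or signed by a known key."""
    if sender_address.lower() in WHITELIST["addresses"]:
        return True
    if key_fingerprint is not None and key_fingerprint in WHITELIST["fingerprints"]:
        return True
    return False

# Mail that fails this check would fall through to whatever challenge or
# filtering scheme handles unknown senders.
```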

That’s true of basically every security technology in use today - they all depend on the limitations of computing power. Real quantum computing in a form that’d be accessible to spammers and crackers is still decades away; I wouldn’t abandon this system (or PGP, SSL, AES, etc.) just because of the theoretical threat that quantum computing poses.
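For a rough sense of the scale involved (my numbers, not anything from the thread): even granting a very generous classical guessing rate, brute-forcing a 128-bit keyspace is hopeless.

```python
# Back-of-the-envelope arithmetic for the "limitations of computing power"
# point. The guess rate is an assumed figure for illustration; the
# conclusion holds even if it is off by several orders of magnitude.

keyspace = 2 ** 128                  # number of possible AES-128 keys
guesses_per_second = 10 ** 12        # assumed classical brute-force rate
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = keyspace / (guesses_per_second * seconds_per_year)
print(f"{years_to_exhaust:.2e} years")   # ~1.08e+19 years
```

That figure dwarfs the age of the universe, which is why the practical worry is new kinds of computation (like quantum attacks), not faster conventional hardware.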

One other problem is that of “privacy.” You can have good security, or you can have good privacy, but the only way to get both is by unplugging your computer.

I haven’t been paying much attention to this for a while, but most private attempts at one or the other have been met with outcry from the opposition (who frequently support both, it gets confusing). For instance, when Microsoft started talking about Palladium, which was a system to give each processor a unique “tag” that would identify it whenever it did anything (sent an e-mail, made an online purchase, etc.) to prevent fraud, piracy, mislabeling, and so on, people went absolutely nuts (75% of which could be attributed to the fact that it is Microsoft we’re talking about). Hell, people went nuts when MS started having XP report unique system configs to prevent piracy.

You have a situation where people demand to be able to remain anonymous, yet also demand that they be protected and that laws be enforced. Unfortunately for them, it is pretty hard to enforce laws when everyone is anonymous. Kinda like why no one trusts Microsoft’s Passport (or any competitor’s) system. They (it seems) don’t want anyone tracking their activity and purchases. Again, unfortunately, all we’ll see is more and more passwords and usernames and registration and crap to prove who we really are.

In essence - laws are a lot easier to enforce when the enforcers know everything about everyone. On the flip side, knowing everything about everyone can be horribly mishandled and abused, and we won’t let it happen.

To be fair, Palladium was a lot more than a unique tag for each computer - perhaps you’re thinking of the processor ID on some Pentium chips.

A lot of the outcry over Palladium/TCPA was that the secret key inside each computer was to be kept secret from the computer’s owner, not just the outside world. It might keep your email protected from unauthorized viewers, but you yourself could become an “unauthorized viewer” at the email software company’s whim, and your PC would be on their side, not yours.

A few basic changes (let the owner access his key, let the owner choose whether to trust an unsigned program, make it more compatible with open source software) would have made Palladium more palatable as a system to protect PC users’ security and privacy… but that was never the intent of the project. The security it provided was for software and media companies, not users.
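For what it’s worth, the “let the owner choose” variant is easy to sketch. Everything below is hypothetical: it is not the Palladium/TCPA API, and real attestation would use public-key signatures rather than the HMAC stand-in used here to keep the example self-contained.

```python
# Sketch of owner-controlled trust: signed programs run, and the owner --
# not the vendor -- decides what happens to unsigned ones.

import hashlib
import hmac

OWNER_POLICY_ALLOW_UNSIGNED = True   # the owner's setting, not the vendor's

def verify_signature(program_bytes: bytes, signature: bytes, key: bytes) -> bool:
    # Stand-in for real public-key verification, so the sketch runs
    # without external libraries.
    expected = hmac.new(key, program_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

def may_run(program_bytes: bytes, signature: bytes | None, vendor_key: bytes) -> bool:
    """Run signed programs; for unsigned ones, defer to the owner's policy."""
    if signature is not None and verify_signature(program_bytes, signature, vendor_key):
        return True
    return OWNER_POLICY_ALLOW_UNSIGNED
```

The whole difference is in that last line: the trust decision lives in a setting the owner controls, instead of being burned into the platform on behalf of software and media companies.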