Well, despite the assholery of it all, this particular virus is quite well done from a technical standpoint. It takes a lot of ingenuity and skill to create such an efficient exploit. Think of it as admiring the skill of a burglar who could break into a big fancy museum. No one would condone his actions, but people would be impressed by his skill.
I would never say that the virus programmers are better than Microsoft’s. But I can see the point of the OP. Every virus or worm exhibits exactly the design qualities that people find lacking in Microsoft’s software: they are small, compact, fast, stable, and effective. I suppose that comes easily to a few lines of single-purpose code that exist only to exploit one aspect of an all-encompassing system.
On some level, software should be written like viruses. It should be modular, small, and efficient. It should be automatic except when you specifically want to customize. One of the biggest complaints that I have about Microsoft is that each release of software has more features, most of which I never use, which contributes to increased size, more frequent crashes, and decreased speed.
The virus writers seem to embrace an ethic of efficient programming that may be increasingly rare in the commercial sector. If someone could figure out how to coordinate 30,000 programmers well enough that they could write OSes in this old paradigm, I’m sure they could really make a bundle.
Well, properly and safely imploding a building takes a lot of skill, too. Are they better than the people that built it?
I’d love to see this, but don’t hold your breath. Egos, politics, communication issues, differences in skill levels, and the simple difficulty of coordinating the efforts of a large group of people make something like that more or less impossible. And that’s even ignoring the complexity of a modern OS.
I wonder, what physical construction can we compare the architecture of an OS to?
Nothing, really.
Software design is not analogous to physical design, due to the lack of physical laws in the first case. Physics may be a pain in the ass to a freshman engineering student, but it’s a godsend to a senior engineer, as it cuts down on the number of possible ways of doing things. We don’t have that; we have to make up the rules as we go along, and it’s difficult to tell right off the bat which ones are good.
That said, there are known good rules for writing secure software, and MS is bad about following them. They need to seriously rethink that, or it could really hurt them.
The problem lies in the fact that MS is what, 90%+ of the desktop market, and a goodly chunk of the appserver market.
That many installations guarantees that you’re getting a lot of technically clueless folks, who don’t take the most basic measures to install patches, limit access, etc. The XP series is a step in the right direction: auto-updating is enabled by default, you get a mini-firewall, and other security gizmos.
Also, that scope of installation is what gets the news. It’s not as if most people would even notice if a Linux worm made the rounds…
Do you really think default auto-updating will close some holes without opening new ones? It has been argued here that security flaws are a natural consequence of software complexity. If that is the case, adding a new, complex feature is not the answer.
No, I think the fact that MS has a huge market share is also indicative of the issue. I, for one, am certainly not trying to let MS off the hook.
Oh so true. But any admin worth his salt will be locking down every unneeded service, port, etc. If some of that added complexity goes toward helping the other 90% of drooling admins out there, I can deal with it.
Of course, I personally think that a default install should have most services set to ‘Disabled’, but I can easily see many dorks burning out their brains trying to get them started. Downtime is downtime, whether it comes from a virus or from lack of knowledge. Tough call, sorta…
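To make that concrete, here’s a rough Python sketch of the sort of lockdown pass I mean on a Windows box. The service names are purely illustrative, not a vetted list; check what each machine actually needs before you disable anything.

```python
import subprocess

# Hypothetical examples of services this particular box doesn't need.
# Your list will differ -- audit before you disable.
UNNEEDED_SERVICES = ["Messenger", "RemoteRegistry", "Telnet"]

for svc in UNNEEDED_SERVICES:
    # 'sc config <name> start= disabled' keeps the service from starting
    # at boot (the space after 'start=' really is required by sc's odd
    # syntax); 'sc stop <name>' shuts it down right now.
    subprocess.run(["sc", "config", svc, "start=", "disabled"], check=False)
    subprocess.run(["sc", "stop", svc], check=False)
```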
Frankly, I’m surprised that no nefarious virus/worm writer has, AFAIK, figured out a way to compromise the auto-update feature. We all know not to run programs from unknown sources, not to open certain types of suspicious email attachments, and so on, yet by enabling auto-updating, we’re opening our OSes to unmonitored modification.
I keep my auto-updating turned off, and just visit the MS update websites periodically. I’ll keep some control of the process myself, thank you!
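For what it’s worth, the bare minimum I’d want any updater to do before touching the system is check what it downloaded against something published out-of-band. A toy Python sketch, with a made-up URL and a placeholder digest (a real updater would verify the publisher’s signature, not a hand-pasted hash):

```python
import hashlib
import urllib.request

UPDATE_URL = "https://updates.example.com/patch-123.exe"  # hypothetical
# Placeholder: in practice this digest would come from the vendor,
# over a separate trusted channel.
EXPECTED_SHA256 = "<vendor-published sha256 goes here>"

data = urllib.request.urlopen(UPDATE_URL).read()
digest = hashlib.sha256(data).hexdigest()

if digest != EXPECTED_SHA256:
    # Refuse to install anything that doesn't match -- this is exactly
    # the "monitoring" step that blind auto-updating skips.
    raise RuntimeError("update does not match the published digest")

with open("patch-123.exe", "wb") as out:
    out.write(data)  # verified, at least as far as the hash goes
```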
Why? It hasn’t yet.
The software market has taught Microsoft nothing so well as the fact that consumers want features, not security. If security were paramount, there’d be a lot more Unix systems out there. But people buy features, and bitch about security only when a worm gets them.
For all the slagging of MS on security issues, the voters (i.e., those who spend money on software) have proven time and again that security is just not that important to them. MS isn’t in the business of protecting users from themselves.
If Linux or MacOS were as popular as Windows, you can bet they would have just as many viruses. What’s the point of writing a virus if only 10 users are affected?
I’m up to CS 262, and we’ve yet to touch on security of any kind. I believe that those issues come up in a later course, whereas the first few years are meant to teach us all the development process, basic syntax, and so on. Of course, we aren’t writing server OSes, either.
I do actually know quite a bit about security, but it’s certainly not because of my college courses (yet).
But you are not the target market of auto-update. Users too stupid or lazy to visit the website are. The same users that think ‘p@ssword’ is an uber-l337 password.
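(As a quick illustration of why ‘p@ssword’ fools nobody: any dictionary cracker worth its salt normalizes the usual l337 substitutions before comparing. A minimal Python sketch, with a deliberately tiny word list:)

```python
# Undo the common character substitutions, then check the dictionary.
LEET_MAP = str.maketrans("@431!0$5", "aaelioss")

COMMON_PASSWORDS = {"password", "letmein", "qwerty", "monkey"}  # tiny sample

def is_weak(password: str) -> bool:
    normalized = password.lower().translate(LEET_MAP)
    return normalized in COMMON_PASSWORDS

print(is_weak("p@ssword"))   # True: '@' normalizes straight back to 'a'
print(is_weak("kX9!mQ2v"))   # False, at least against this tiny list
```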
They already have, sort of.
(Once the earlier version is installed, the patch/fix is effectively removed, and the system is as vulnerable as ever. Only this time the user doesn’t know it.) Sorry I don’t have a link to Brian’s article handy – this quote was taken from an automated email I had archived. But the official MS doc is here.
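The missing defense in that scenario is a rollback check: an installer should simply refuse to go backwards. A sketch of the idea in Python (the version strings and dotted format are illustrative assumptions, not how Windows Update actually represents them):

```python
def parse_version(v: str) -> tuple:
    """Turn '5.1.2600' into (5, 1, 2600) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def allow_install(installed: str, candidate: str) -> bool:
    # Installing an *older* build silently reopens every hole the
    # intervening patches closed, so reject anything below current.
    return parse_version(candidate) >= parse_version(installed)

print(allow_install("5.1.2600", "5.1.2700"))  # True: genuine upgrade
print(allow_install("5.1.2700", "5.1.2600"))  # False: the downgrade trick
```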
Interesting. It seems that security, like the writing of the operating manual, is something “tacked on” to the project instead of built-in from the start. Maybe this approach explains why both are frequently lacking from otherwise useful software.
Excellent point, Robot Arm. And let’s contrast how a flaw is handled in the auto world vs. the computer world. If a potentially dangerous defect is discovered in a car, the manufacturer makes serious attempts to contact ALL owners of that model and get the cars brought back in to be fixed. Of course, this is now mandated by law.
But with computer hardware or software, no such effort on the part of the manufacturer is expected. It is up to the user to stay on top of the latest security flaws and patches.
Is it good practice for a user to be constantly aware? Of course. Can we expect all users to be so attentive? Of course not.
Can you imagine what would happen in the courts if Ford found a serious safety defect in their Explorer (good choice of car for an example, R.A.) and a driver was killed as a result? “Oh, that fix has been out for 6 months now, posted all over Ford’s web site, and he just forgot to install it, it’s all his fault, he got what was coming to him.”
Why is it the responsibility of the user, not the manufacturer?
Of course not. My question was intended to be about as serious as 53 Reasons Why Beer is Better than Women, and that’s why I used the word “comedic” in my post. Or perhaps I led you astray with an insufficient number of smilies.
I thought it might be fun to start with an upside-down premise, “What would happen if virus writers and system coders switched places?”
Amazing! I got the same combination on my luggage!!!