More BS from Micro$oft?

http://www.microsoft.com/ntserver/nts/news/msnw/LinuxMyths.asp

Is any of it true?

Well, Microsoft obviously has a vested interest in denigrating Linux, but I just read the latest issue of PC Mag, and they say yes. You can probably read the article online at http://www.pcmag.com

And now, for the other side of the story… Microsoft is obviously running scared. Windows 2000 is running WAY late and is still not ready for prime time (I tried Win2000 RC2. What a bugfest!). The ‘Myths’ page is a combination of half-truths, factoids taken out of context, and outright lies. The Linux community has mostly been laughing at this bit of tripe. For a reasonably good rebuttal, check out this Linux Today article.


Yes, I AM an agent of Satan… but my duties are largely ceremonial.

Yes and no. MS is not giving the full story, but of course they really shouldn’t be expected to. In marketing, everyone wants to make themselves look best.

The tests referred to were geared towards a scenario that specifically illustrates where NT performs better – large-scale single systems with many CPUs. Linux does not yet scale well across multiple CPUs. This can be a bit misleading, though: those large systems are very expensive and often not needed. I run a Linux webserver on a plain old PII-233. It gets about 750,000 requests a day and doesn’t break a sweat. I’d love to see a comparison of NT vs. Linux on a system like mine. :)
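To put that load in perspective, here is a back-of-the-envelope calculation. Note the averaging assumption flatters the server a bit, since real traffic is bursty; the 5x peak multiplier is a guess, not a measurement:

```python
# Rough load estimate for the PII-233 webserver mentioned above.
requests_per_day = 750_000
seconds_per_day = 24 * 60 * 60  # 86,400

avg_rps = requests_per_day / seconds_per_day
print(f"average: {avg_rps:.1f} requests/second")  # about 8.7 req/s

# Traffic is bursty, so assume peak hours see roughly 5x the average.
print(f"estimated peak: {avg_rps * 5:.0f} requests/second")
```

Under 10 requests a second on average is well within reach of modest late-90s hardware, which is the point: most sites never need the giant multi-CPU boxes the benchmark was built around.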

Linus Torvalds said he had no doubt that NT would win this test because it was a very “NT friendly” test. He also said it would effectively serve as a reminder to Linux developers of where they need to focus next.

MS’s attack on the claim of Linux’s greater reliability is not substantiated. Their argument is basically that no entity guarantees the reliability of Linux. The issue of file journaling is valid, but journaling filesystems are a fairly new idea that will undoubtedly get added to Linux in the near future. The claim that there is no proven high-availability solution completely ignores the fact that there are proven network-based high-availability options that are independent of the operating system.
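For readers unfamiliar with the journaling point: the core idea is write-ahead logging, where the filesystem records an intended change durably before applying it, so a crash mid-write can be recovered by replaying the log. The toy sketch below illustrates the concept only; the journal file name, record format, and helper functions are all invented for illustration and bear no relation to how ext3, ReiserFS, or NTFS actually lay out their journals:

```python
import json
import os

# Toy write-ahead journaling sketch: log the intent, apply the change,
# then mark it committed. After a crash, replay any uncommitted entries.
JOURNAL = "journal.log"

def journaled_write(path, data):
    # 1. Durably log the intent before touching the real file.
    with open(JOURNAL, "a") as j:
        j.write(json.dumps({"op": "write", "path": path, "data": data}) + "\n")
        j.flush()
        os.fsync(j.fileno())
    # 2. Apply the change to the real file.
    with open(path, "w") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())
    # 3. Record that the operation completed.
    with open(JOURNAL, "a") as j:
        j.write(json.dumps({"op": "commit", "path": path}) + "\n")

def recover():
    """After a crash, redo any logged writes that were never committed."""
    pending = {}
    with open(JOURNAL) as j:
        for line in j:
            rec = json.loads(line)
            if rec["op"] == "write":
                pending[rec["path"]] = rec["data"]
            elif rec["op"] == "commit":
                pending.pop(rec["path"], None)
    for path, data in pending.items():
        with open(path, "w") as f:
            f.write(data)
```

The payoff is that `recover()` runs in time proportional to the journal length instead of requiring a full fsck-style scan of the disk, which is exactly why journaling matters for the reliability argument.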

The argument that Linux is weaker in security is not only a bad argument, it is also pretty funny. They argue this because UNIX systems were originally designed with a fairly naive assumption of trust in users. They ignore the fact that UNIX security has had over 20 years of challenges to help it improve. NT is relatively immature in security, which is reflected in some of the gross holes found in it over the last several years. Most remaining holes in recent UNIX-style systems are very hard to exploit.

Their claim that administration cannot be delegated is just plain untrue.

The cost issue is largely speculation. One argument is that NT administration is easier, and thus you can employ cheaper admins. The problem with this argument is that NT has poor remote access and poor options for automation, so it is likely that NT will require more admins to manage. In reality, the comparative cost-effectiveness will undoubtedly vary by situation.
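The automation point is worth making concrete: on a UNIX fleet, one admin can script a task across every machine from a single terminal. A minimal sketch, assuming key-based ssh is already set up to each box; the host names and the `uptime` command are purely illustrative:

```python
import subprocess

# Hypothetical host list -- purely illustrative. Assumes passwordless
# (key-based) ssh is already configured to each machine.
HOSTS = ["web1", "web2", "db1"]

def run_everywhere(command):
    """Run one shell command on every host and collect its output."""
    results = {}
    for host in HOSTS:
        proc = subprocess.run(
            ["ssh", host, command],
            capture_output=True, text=True,
        )
        results[host] = proc.stdout.strip()
    return results

# e.g. check uptime on the whole fleet in one shot:
# for host, out in run_everywhere("uptime").items():
#     print(host, out)
```

That one-admin-many-machines pattern is what NT of this era largely lacked, which is why the "cheaper admins" argument cuts both ways.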

NT is still superior on the desktop for the most part. Few would debate that.
What MS should be worried about is that (at least IMO) Linux is improving faster than NT. I think their recent increased focus on Linux may reflect that.

As usual, I need to add a comment that I should have added at the end of my last post.

Another thing that I think rightly scares MS is the psychology of this fight. Many look on it much as they would the Star Wars struggle between the Empire and the Rebellion.

I’ve found an interesting article from TechWeb on NT vs. UNIX for chip designers.

Some of my favorite quotes from the article:

Analyst Ron Collett, president of Collett International (Santa Clara, Calif.):

Mike Murray, CAE manager at Acuson Corp. (Mountain View, Calif.):

One more that actually mentions Linux by name:

Microsoft wrote:

I’m not sure I agree with this, but let’s ignore that bit… They seem to have forgotten that NT fundamentally relies on DOS and the 8080 architecture. For the history buffs, MS-DOS was cloned from CP/M, which Gary Kildall developed around 1974… which means the NT OS is based on operating system technology and architecture that is over a quarter of a century old…

I’ve not used Linux yet, but I’m a long-time user of various unix boxes, as well as NT and the Mac OS. I flip-flop on which is superior, Mac OS or unix… but I don’t even put NT in the same class!

As for reliability, unix wins hands down. I have only to look at our tech support staff as evidence. It takes twice as many techs to support the same number of NT boxes as unix boxes. By the way, when our desktops were predominantly Macintoshes, our desktop support staff was about a quarter of that required to support the NTs.

This is not the fault of Linux - it’s the fault of Microsoft. Trust me, I had to retrain myself when I moved from Mac and unix boxes at work to NT. The Windows GUI is so counterintuitive it’s laughable. It’s true, if you move from Windows NT to Linux, you’ll have to retrain yourself… but I’d consider it relearning to do things right!

Not that I have an opinion on any of this…