Aside from commercial greed, why NOT universal computers?

You can’t say ‘bloatware’ without saying ‘Microsoft’. Want to know why Microsoft Excel is so huge? There’s a Flight Simulator easter egg in it! If you perform a series of commands (put value x in cell y, etc.) you are taken to a Flight Simulator, where you can fly around and see the names of the coders and testers on a big, black pyramid. Lord knows what else the other Office apps have.
I have to agree with the previous posts: do I want to embed video movies in Word? Who comes up with ideas for these ‘features’ anyway?

-dd


From Hell’s heart, I stab at thee-

      • I am informed that much of the “bloatware” aspect of the Windows op systems and Internet Explorer is that they contain significant amounts of programming to account for, and work around, many types of errors. I usually use Netscape for surfing, but if a site won’t load with Netscape I try visiting the same site with Internet Explorer; often it will show a “this page contains script errors” notice and then display the page with some small aspect nonfunctional or missing. Netscape will load the HTML but won’t show you any image; if I had to choose between the two I’d take Internet Explorer any day. I find this kind of interesting: it’s programmer geeks that seem to do the most bitching about MS software being top-heavy, but the reason that software is top-heavy is to continue functioning despite the programmers’ mistakes. - MC

MC writes:

You’ve completely missed the point. Except for really gross errors, you want the errors to be invisible to the average browser user. I suspect, in this case, Netscape is behaving right. I write some pretty tricky HTML code and tend to favor Netscape’s browser, though there are one or two nifty things that IE (Internet Explorer) can do that I envy. For the most part, I’ve found IE a bigger problem. A proper browser is supposed to ignore tags it doesn’t understand, but IE either reports script errors or does very strange things to the display. I sometimes have to put in special traps or create entire separate web structures to support IE. I also worry about IE’s non-standard Java implementation and ActiveX support. It’s no wonder the Wintel platform has so many viruses; Microsoft makes it too easy.

First, the OP:

Why don't all computers run the same software? Well, they can't, because the instruction sets -- the primitive operations of the chips -- are different for different chips. Why not have everyone use the same chips? Well, because there are lots of ways to design chips and their accompanying instruction sets. A well-designed chip and instruction set has a lot of advantages -- fast execution, fewer operations to accomplish certain tasks, simplicity of compilers, etc...
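To make the instruction-set point concrete, here is a minimal sketch. The assembly in the comments is simplified and illustrative, not exact compiler output: the same source becomes different, mutually unintelligible machine code on each chip.

```c
/* The same one-line function compiles to different machine instructions
 * on different architectures (simplified, illustrative output):
 *
 *   x86 (two-operand, destructive):   mov  eax, [a]
 *                                     add  eax, [b]
 *   PowerPC (three-operand RISC):     add  r3, r3, r4
 *
 * Identical source, incompatible binaries -- which is why a compiled
 * program runs on only one family of chips. */
int add(int a, int b) {
    return a + b;
}
```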

Problem is, once you settle on an instruction set, you have to stick with it so your legacy programs still run. Meanwhile, someone else has an idea for a better architecture and implements it. If it’s good enough, people start writing code for that architecture. As someone said, you can emulate one architecture’s instruction set on another chip, but you lose a lot in speed, partly because of the overhead in translation, but also because the program was likely compiled to take advantage of the original architecture’s strengths.
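The translation overhead is easy to see in a sketch. Below is a toy interpreter for an invented three-opcode accumulator machine (all opcode names are made up for illustration); every emulated instruction costs the host a fetch, a decode, and a dispatch before any real work happens.

```c
/* Toy instruction-set emulator: an invented accumulator machine with
 * three opcodes.  Each "foreign" instruction costs the host several
 * native instructions: fetch, decode (the switch), then execute. */
enum { OP_LOAD, OP_ADD, OP_HALT };

int emulate(const int *code) {
    int acc = 0, pc = 0;
    for (;;) {
        switch (code[pc++]) {                    /* fetch + decode */
        case OP_LOAD: acc  = code[pc++]; break;  /* execute */
        case OP_ADD:  acc += code[pc++]; break;
        case OP_HALT: return acc;
        }
    }
}
```

A native chip would run `LOAD 2; ADD 3` as two instructions; the emulator spends several host instructions per emulated one, before even considering that the original program was compiled to exploit the other chip’s strengths.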

For example, without getting into a Mac/PC flamewar, here are the facts: Intel’s architecture is widely recognized to be outdated and it speaks volumes for their engineers that they have managed to push the clock rates up to the levels that they have. But their chips are huge, relatively expensive, consume massive amounts of power, and run hot.

The PowerPC chips are a relatively new architecture. Yes, Apple’s benchmarks are inflated, but it is true that a 500 MHz PowerPC is more or less equivalent to a 750 MHz Pentium/Athlon. (Some operations are faster, some not so. The vector processing stuff is very fast. They’re not wrong about that.) The PowerPC chips, because of their architectural innovations, are smaller, cheaper, take less power, and run cooler. Being newer, there’s more room for the architecture to grow in terms of speed.

So here you have a case where the basic processing elements of two computers are not equivalent for some very good reasons. They can’t run the same programs because part of the architectural differences is the choice of instruction sets.
Now, why so many different standards for, say, media players or publishing programs?

Sort of the same process as with chips. You propose a standard and it is accepted. Later, when available bandwidth increases or memory becomes more available or new compression algorithms get invented, new standards are developed and adopted. Often these are incompatible. Manufacturers often develop competing standards simultaneously.
Here, corporate greed does play a role. If you control the standards, you control the associated revenue stream. Software can be copyrighted and patented.

Finally, if your major income is, just as an example, selling an operating system, you don’t really want to see the advent of the “run on any architecture in a browser” universal software. So there is resistance to the idea of universal software.
Which is why we won’t see it soon.

Oh, can’t resist responding to theNerd’s:

>But, that isn’t the reason I don’t own a mac. The reason I don’t own a mac is because you cannot build one yourself. I can and have built all three of my machines, and have upgraded each a significant number of times. Macs are computers for people who don’t like computers. I’m not one of those people.

I’ve got a PhD in computer science. You can assume I like computers. I’ve got a Mac.
Thing is, once you’ve proven that you can learn six or seven editors, compilers, languages, and so on, it gets old. The Macs were the first machines on which you didn’t have to learn and remember esoteric incantations just to use a word processor. Even now, there’s just less fiddling around with the computer and more just using it.

To return to the OP, this illustrates that you can’t have a “universal” operating system, because what people value in an operating system differs. Some people value raw power and the ability to customize the environment (Unix). Some people value ease of administration, security, and robustness (VMS). Some people value the user-interface (which is not necessarily mutually exclusive of any of the above).

Finagle -
Are you pretending the G4 does not maintain backwards compatibility with the 68000 chipset? Sure, that’s not as old as the 8086, and it was more advanced to begin with, but your point is irrelevant. Both processors have the legacy issue.

There is absolutely no compatibility between the PowerPC and the 68xxx family. Completely different architecture.

Apple solved the legacy issue by developing emulator software that ran their old software on the new chips until manufacturers could recompile code for the new chips. They could get away with this because the PowerPC chips were just under an order of magnitude faster than the 68xxx family. Even so, the emulated code was notoriously slower than native code. That’s why, for several years, Mac software was delivered as “fat” binaries. This was code that was compiled both for the 68xxx family and the PowerPC family. The operating system executed the appropriate code.
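The “fat binary” dispatch described above can be pictured as one file carrying two code forks plus a small header the loader consults. The structure and names below are invented purely for illustration; this is not Apple’s actual file format.

```c
/* Invented sketch of fat-binary dispatch -- not Apple's real format. */
enum cpu_kind { CPU_68K, CPU_PPC };

struct fat_header {
    long m68k_offset;   /* file offset of the 68xxx code fork   */
    long ppc_offset;    /* file offset of the PowerPC code fork */
};

/* The loader picks whichever fork matches the machine it runs on;
 * the other fork is simply dead weight on disk. */
long fork_offset(const struct fat_header *h, enum cpu_kind host) {
    return host == CPU_PPC ? h->ppc_offset : h->m68k_offset;
}
```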

This emulation software, btw, was a major tour de force of technology – PowerPC-based Macs could run almost all legacy software with great reliability.

Apple is one of the few companies (maybe the only one) ever to change architectures in mid-stream and survive.

Note that Intel is rapidly approaching just such a “flag day.” Their new generation of chips, due in 2000 or 2001, are going to be more RISC-like, and they will probably have to employ some kind of emulation to support current software.

Well, learn something new every day, I guess. Care to speculate why PCs won, and Macs still survive, while vastly superior platforms (for their time) such as NeXT and Amiga bit the dust?

      • And you’re completely missing my point, which is that Netscape seems to load pages a bit faster, but IE will show pages with errors more often. Netscape will load some, at least, of the HTML, but often doesn’t try to resolve it into any sort of image. I don’t want to sift through one or two hundred lines of HTML code to try to figure out what went wrong; I just want to see the page. IE says “there’s an error” and then will often show most of what the page is supposed to look like, when Netscape will not show anything. If an error causes the browser to ignore the entire page, it’s not a very useful browser.
  • I don’t know enough to get technical; I just know what works more often. If IE is more of a hassle to write script for, I don’t really care - that’s not my job. I’d guess most other people feel the same way. - MC

posted 12-17-1999 11:21 PM

>Well, learn something new every day, I guess. Care to speculate why PCs won, and Macs still survive, while vastly superior platforms (for their time) such as NeXT and Amiga bit the dust?
Well, as long as you realize it’s just speculation and has possibly no relation to the reality of the situation.

  1. Why PCs “won”. You know that commercial of Apple’s that pissed everyone off, the one with business-suited lemmings flinging themselves off a cliff? Well… people didn’t like it because the truth hurts. PCs bore the IBM imprimatur. The saying was, “No one ever got fired for buying an IBM machine.” A significant factor was the reverse engineering of the BIOS chip, which let everyone and their brother make cheap clones. Economies of scale rule.
    (Note, Apple is criticized for not allowing cloning of their machines. Cloning keeps the architecture popular, but not any particular manufacturer. IBM for a long time regularly took a bath on their PC division.)

  2. Apple survived probably because they had a sizeable war chest from their years of selling Apple II’s, some truly creative engineers and software designers, and, until '95, an incontestably better user interface. Since '95, an arguably better and more consistent interface. And enough of an entrenched user-base with a software investment and a (possibly irrational) fondness for the company to keep them alive.
    Their problems in the '90s were mostly management related.

  3. NeXT was simply ahead of its time – a superior operating system on very expensive hardware.

  4. Amiga, I’m not as familiar with. I think its story is summed up by: superior hardware and software, and inferior marketing and management.

I would agree with this, but also point out something else: almost every business already had a very significant relationship with IBM. The IBM Selectric was the typewriter of typewriters, and hardly anyone had a mainframe that was not an IBM running MVS, VTAM, CICS, IMS, and DB2. It was only natural that when users started asking for desktop machines, the IT division looked to IBM first. For many people, buying a computer at home was for work first, education/gaming second. If you used a PC at work, you wanted to have the same thing at home (also so you could copy all the software at work!).

I would agree with this too, except I’d point out that I don’t think you can argue that OS 7 is better than 95 - with OS 8 you have a decent argument. OS 7 had way too many annoying problems, such as trying to force the user to insert a disk that the last user had been using (if I had a dollar for every time I told a user how to do the command-period thing or whatever to get rid of that dialog, I could buy Bill Gates). Of course, there are different kinds of users - my experience with Macs was purely from supporting them in a campus computer center - we had a third as many Macs as PCs and they generated more problems. I’m sure someone who was familiar with the OS would not have these types of problems. My biggest interface gripe is the lack of a command line - there are many things that are faster to do from a DOS prompt or with a batch file you write on the fly.

Along similar lines: what things could IBM have done (or not done) that would have resulted in OS/2 being the dominant platform today? I’m not a PC user of either the Windows or the OS/2 persuasion, but I’m under the impression that OS/2 was considerably more advanced than the Windows that was its competition at the time of its release and initial marketing (Win 3.1?); in fact, I gather that its devotees still say it is superior to Windows NT and blows away Win98 and Win95 (and Mac, and AIX, and Linux, and AmigaOS, etc.). In what ways is the failure of OS/2 similar to the failure of MacOS or AmigaOS to become the dominant personal computer platform? In what ways (in addition to the obvious, that MacOS and AmigaOS were not compatible with x86/PC architecture) is OS/2’s failure to rule as dominant OS different?


Designated Optional Signature at Bottom of Post

OS/2 is, from an architectural standpoint, superior to Windows 9x, and probably comparable to NT. From the interface standpoint, however, it is horrible. I’ve been using and supporting OS/2 Warp 4 for the last year and a half, and I have nothing good to say about it. Let me give you an example: if you install any service release that affects networking protocols, you cannot later install other packages from the installation CD. Basically, you have to manually extract and copy the files to the hard drive, then download and run the service release for them. I have never had this problem with NT - the worst that can happen there is you install the service, then reinstall the SP, and it all works.

Also, even at version 4, I think the GUI itself is not really superior to Windows 3.1’s. When it was released, it required more memory and faster hardware in order to run at a decent speed. For all this, you got an interface that was questionable and incompatibility with many of your existing programs - and the programs that you could run sometimes required extensive tweaking of your VDM to get them to work.

The adherents of this OS are, in my opinion the most ludicrous of the entire OS controversy. It is not surprising at all that this OS has failed, the only surprise is that IBM continues to dump good money after bad (although not much money, judging by the quality of the product).

Windows NT (and Windows 2000), has a hardware abstraction layer that will allow the operating system to run on any architecture, as long as someone writes a kernel for it.
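The hardware-abstraction-layer idea can be sketched as a per-platform table of function pointers. The structure and names below are invented for illustration; this is not NT’s actual HAL interface.

```c
/* Invented HAL-style sketch: portable "kernel" code touches hardware
 * only through this table, so porting to a new architecture means
 * rewriting the table's entries, not the kernel itself. */
struct hal_ops {
    int  (*read_clock)(void);      /* platform-specific timekeeping */
    void (*mask_interrupts)(void); /* platform-specific interrupt control */
};

/* One hypothetical platform's implementations. */
static int  demo_read_clock(void)      { return 42; }
static void demo_mask_interrupts(void) { /* would touch real hardware */ }

static const struct hal_ops demo_hal = { demo_read_clock,
                                         demo_mask_interrupts };

/* Portable code sees only the abstract interface. */
int kernel_get_time(const struct hal_ops *hal) {
    return hal->read_clock();
}
```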

Mac people don’t like to admit it, but the IBM PC had a lot of advantages. First, it had an open architecture. There were clones, which brought the cost down. You could buy high-performance graphics cards and big monitors (and the original Macs were saddled with small, B&W monitors). The Mac’s biggest advantage was its operating system, but that didn’t matter to the average PC user of the time, who simply wanted to pop in his Lotus 1-2-3 disk and start spreadsheeting away.

A better question is why did the Amiga and machines like the Atari ST die? The ST had most of the advantages of the Mac, more memory, was faster, had a larger monitor, and cost 1/3 of what a Mac did (I was selling computers back then here in Canada, and the 520ST was $999, while the 512K Mac was $3495). The Amiga was similar in price/performance to the ST, with arguably a better architecture and better graphics.

I think Finagle must be right that the demise of the Amiga was due to poor marketing and management. There might be something else at play here, but I cannot think what. I think part of the marketing problem was that it really was sold as a kind of toy - great for playing games, making pretty pictures and music. Most people at the time were really buying computers as tools - at least ostensibly. The fact that the Amiga was great at doing all the things PCs and Macs were already doing is kind of irrelevant when that is not how the public perceives it. When the Nintendo came along at 1/10th the price and met many of the desires of the Amiga market, it simply had no chance.

I think there were a number of factors that probably contributed to the decline of the Amiga, but I’m not sure this one was high on the list. It might seem like a reasonable explanation here in 1999, but for a long time the Motorola 68K series CPUs had it all over the Intel x86 line, both performance- and architecture-wise, and the Amiga had better hardware in general (back in the mid to late 80s) than the PC or Mac. It wasn’t until PCI, for instance, that the PC got a bus architecture as good as Zorro II/III, and the Amiga never had issues like IRQ conflicts.

On the other hand, it was never pushed into the business or educational markets like the PC and Mac were, at least outside of a few niche audiovisual applications such as TV stations. Nobody ever bought an Amiga because they had one at work or their kid’s school had them. This was before “multimedia” was a buzzword, so digital sound and high color animated graphics were seen as making it a games machine only, and consumers had no clue what preemptive multitasking was (“why would I want to run more than one application at a time?”). It also never had anywhere near the brand recognition of the PC or Mac, even in its heyday, since Commodore never spent much effort to market it, and the software base was heavily tilted towards games, and thus kids instead of adults. I guess it was really killed by a range of factors.

Little-known fact: Microsoft used to write Amiga applications. They were generally the worst ones for the platform by a long shot, doing things like busy-looping while waiting for input - acceptable on a single-tasking PC, but horrible practice on a multitasking OS.
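The busy-looping complaint is about code shaped roughly like this. The `input_ready` flag is simulated so the sketch is self-contained; real code would poll a hardware or OS status instead.

```c
/* Busy-wait sketch: polling in a tight loop.  On single-tasking DOS the
 * CPU has nothing better to do; on a multitasking OS like the Amiga's,
 * this spin starves every other task.  Readiness is simulated here. */
static int polls = 0;
static int input_ready(void) { return ++polls >= 1000; }  /* simulated */

int busy_wait_for_input(void) {
    while (!input_ready())
        ;   /* spinning: hundreds of wasted polls instead of sleeping */
    return polls;  /* how many times we hammered the "hardware" */
}
```

A well-behaved Amiga program would instead block on a signal (via Exec’s Wait()) and consume no CPU at all until the OS woke it up.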

Another interesting machine that I have little knowledge of is the Acorn Archimedes. I wonder if anybody reading has/had one.

peas on earth

“PCs won?”

I think that is debatable. The platform is certainly more prevalent, but from a business perspective… well, let’s just say I’d rather be Apple than any other PC OEM…

As for Atari and Amiga, I think the real problems were (1) perception - they didn’t look like serious computers, so no one took them seriously; (2) software - there was never really enough inertia to get software developers excited; and (3) differentiation - they didn’t distinguish themselves significantly from the Macintosh, except in price. Of course, the Amiga had GenLock, but this was not useful to the general public at the time.

NeXT? Steve Jobs was definitely ahead of his time and too stubborn to admit that the world was not ready for a network-only computer. Lack of a floppy disk in the early days of the NeXT box was still a limitation.
Also, like it or not, NeXT was in competition with Sun Workstations not PCs.

I agree with Cooper’s observations regarding IBM, but disagree with the viewpoint that Mac OS 7 was inferior to Win95. Your point about floppy disks is actually an excellent example. The Mac OS would only ask for the floppy to be reinserted if it had been ejected while the disk had not been updated. This was to give the user the chance to allow the update. If the user chose to disallow the update, a simple command-period dismissed the dialog, never to be seen again. In Windows I’m constantly nagged about missing floppies or CD-ROMs, even when I’m not doing anything to access them. Usually, after I tell the OS to Ignore or Abort a half dozen times, the dialogs stop. But here’s the really annoying part. If I reinsert the floppy, the OS doesn’t even know if it’s the same floppy or that it may have been changed until I do a Refresh. With the CD-ROM, it’s even more of an issue. Sometimes if I tell the OS to ignore the missing CD, it locks out the CD-ROM drive such that I can’t get it to recognize CD-ROMs anymore. I have to restart to correct this condition.

In any argument about the relative merits of one OS over another, the thing that stands out the most for me is this. I’ve never, ever, EVER had to reload the Mac OS to correct some system conflict (or for any other reason, for that matter). Not in OS 7, not in OS 6, and not in OS 4… The OS on my brand new Dell system has had to be reloaded twice in the last 5 months and most Windows power users tell me that they reload the OS at least once a quarter… To me, this is one of the most obscene things I’ve ever heard of. I could go on, and on, and on, but I won’t… well… maybe just a little…
Cooper writes:

My company replaced all of their Macs with PCs. It takes 4 times as many techs to support the PCs as the Macs and the support is considered to be grossly deficient, still. One of the techs once admitted to me that this is why he always promotes PCs over Macs… you can’t build empires supporting Macintoshes!

Most users don’t need a CLI. For those that do, Mac power users use Frontier, MPW, or MacPerl… or they run Linux.
dhanson writes:

This was good in theory, but the licensing fees and the complexity ultimately make this infeasible. Most other platforms (that I know of) have dropped NT in favor of Linux/BeOS, etc…

No doubt about it. Even the most evangelical Mac person has to admit that the open architecture of the PC helped propagate the species. The question is, was this an advantage to IBM or merely to the platform? Linus Torvalds may be perfectly happy with the meager financial gains he gets from open architecture… but don’t fault Apple for having a business plan.

Not to mention bringing the overall quality down, as well.

“COULD” being the operative word there. Let’s face it, though; in the mid-80s, when the Macintosh first came out, most PCs had 14- or 15-inch monitors with 640x480 resolution, and the text on these displays was not that crisp. Given that this was the environment the MacPlus had to compete in, I wouldn’t say the PC world had that much more to offer… except for color, at first.

Except that the installation process for Lotus 1-2-3 was far too complicated for “the average PC user of the time”. Many users had to pay experts to manage their software.
Cooper writes:

bantmof quotes and writes:

Of course it is! Up until very recently, this was even a semi-valid argument against Macs. Of course, now that Macs can run much faster than PCs and run reliable emulation software such as Virtual PC, this argument doesn’t hold up as well… now that Macs can run more software than PCs. I’ve always felt there was enough software for the Mac to meet my needs, though I don’t play games. I have friends who have Amigas and Ataris, and lack of software titles is their biggest complaint.

Replace the word “Amiga” with any other platform that Microsoft develops or has developed software for and I think the statement remains equally valid…
Whew… rather long winded… responding to a number of posts… sorry, somewhat ad hoc…

Oops! A couple of problems in that last message. Somehow one of my responses got pruned:

Cooper writes:

Almost no marketing OR management. Unfortunately, this is what happens when very technical computer nerds expect computers to sell themselves based on technical merit. Gates, Jobs, and Michael Dell are much better businessmen than they are techno-wizards.

I see that you can’t nest quotes. Hopefully it will be clear who wrote what and what I intended in the segment that starts out with:

> bantmof quotes and writes: