Does Microsoft innovate?

Visual Basic came out in 1991. There were several visual IDEs before this, including THINK C and Turbo Pascal.

Really bad example. Age of Empires is based strongly on Sid Meier’s Civilization series, which came out first and was in turn based on a 1980s board game.

What’s funny is that IBM was barely aware of the market they had.

My Dad had a lot of copies of Byte hanging around the house at the turn of the '80s, and all the writers were saying that the mass-market micros then current were mere toys compared to what IBM would certainly come out with eventually for everyone.

Meanwhile, according to this guy, who was doing internal technical writing for the PC at the time:

Great example. That is what Microsoft does. They take someone else’s ideas, improve them, and make a bundle.
They have a huge staff and I am sure they have advanced computing by leaps and bounds, but they suffer from the same problem that made IBM call Gates in the first place: they are too big to do anything new and fast.

I agree that’s what they do (apart from the “improve” part, though I haven’t played AoE, so I’m speaking from experience with their other products rather than that one specifically), but I’d be hard-pressed to call it technical innovation.

Marketing innovation, on the other hand, as mentioned earlier, is their main talent.

This thread is all about examples of this. Got any?

I disagree. OS/360 can hardly be called a timesharing system, after all, and that was hardly the first OS. Operating systems are a layer between the user program and the bare machine, in the simplest sense. I’ve used lots of non-timesharing OS’s. Though you’re right about there being no operating systems in the '40s (and early '50s), the existence of an operator does not mean there is no OS. Operators protected the computer from the user and did tasks (like feeding in cards, changing tapes, etc.) that you didn’t trust the user to do.

As for the OP, I came in here thinking that with all the people working for MS there must be some minor innovation - but I’m drawing a blank. Automatic update, maybe? DLL hell for sure. :slight_smile:

PnP hardware - if they didn’t invent it, they at least did a lot to popularize it.

Remember, if not for Estridge the PC would never have existed. It was built and designed in a vastly different way from anything else IBM had ever done - that is, using externally sourced components, which enabled the clone market. That probably led to the explosion in that space more than anything.

Since IBM had never been in the consumer space before (except, maybe, typewriters), you can’t blame them for not forecasting very well. The PC was designed as a business machine, and I’d argue that helped in the consumer space also, since a lot of the early consumer purchases were work-at-home type of deals. The TI, Atari and Commodore machines were purely consumer machines. IBM was able to introduce relatively expensive innovations for the office market, then migrate them to the consumer space. The IBM PC wasn’t a big deal because it was so much better than anything else (especially the first, two-floppy model); it was a big deal because it was the IBM PC.

IBM did try something directly for the consumer space - and I’m sure you remember how the PC Jr. turned out.

Maybe. Apple’s Mac OS 9, introduced in 1999, had “Software Update”, which is their tool for the same thing. When was the feature introduced in Windows?

They popularized it. The Mac has been “plug and play” almost from the beginning, for the hardware standards it has supported anyway.

OS/360 appeared in 1964, three years after the CTSS at MIT, and seven years after development on CTSS had begun, leaving plenty of time for people working on it to leave MIT and get jobs at IBM, taking the multitasking ideas they were working on with them.

The need for a layer (there is no barrier other than knowledge to writing a program that makes a chip perform a function directly, with no OS) only comes about when you want to multitask. The first need for multitasking came about when timesharing of a large computer system became desirable in the late 1950s.

True dat. My only point was that the push for timesharing created the need for multitasking, and the desire for multitasking was the impetus for the idea of an operating system. Timesharing is simply the idea of letting multiple users run their processes on the same computer in a cognitively simultaneous way. Build a system to do this, remove the extra users, and you have a multitasking system, which is what an OS really is.

I was merely trying to provide historical context, not claim that an OS MUST be a time-sharing system.
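
To make that concrete, here’s a toy round-robin loop in C - my own sketch, not any historical system’s code - showing the core of the idea: hand each task a slice in turn, and the sharing looks simultaneous.

[code]
#include <stdio.h>

/* Toy sketch only: a round-robin loop that gives each "task" a turn.
 * Make the tasks different users' jobs and you have timesharing;
 * keep one user with several tasks and it's plain multitasking. */

typedef void (*task_fn)(int slice);

static void task_a(int slice) { printf("task A, slice %d\n", slice); }
static void task_b(int slice) { printf("task B, slice %d\n", slice); }

int main(void)
{
    task_fn tasks[] = { task_a, task_b };
    int ntasks = sizeof tasks / sizeof tasks[0];
    int slice, i;

    /* The "scheduler": each pass around hands the CPU to the next
     * task for one slice. */
    for (slice = 0; slice < 3; slice++)
        for (i = 0; i < ntasks; i++)
            tasks[i](slice);

    return 0;
}
[/code]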

Microsoft didn’t come up with that. The bean-counters at Microsoft should offer daily thanks to the Gods for the rise of Compaq and the other early IBM-compatible clones.

In sequential order:

• IBM prototypes a personal computer, to be called the Personal Computer, or PC

• IBM has the most expensive brainfart in history and decides to farm out the OS development to any 3rd party who wants to supply them with one

• Microsoft beats out CP/M for the bid, handing in something they acquired, originally called QDOS (quick and dirty operating system), repackaged as MS-DOS

• The IBM-PC does quite well, marketshare-wise. Each IBM-PC runs MS-DOS and ships with MS-DOS on diskettes (or, upon the ubiquity of hard drives, preinstalled)

• Other hardware companies reverse-engineer the IBM-PC and produce IBM-compatibles. They run MS-DOS, too.

• MS, which didn’t do anything in particular to make that happen, sure doesn’t do anything to prevent it. IBM does (tries to change architectures - Micro Channel, I think it was called? - and also tries to supplant MS-DOS with OS/2), but to no avail. So no one needs an IBM-PC to run MS-DOS (and later, Windows).

• Fast forward a couple decades and Apple, whose computers were not reverse-engineered and whose OS is produced in-house to run only on their hardware, is still in business and doing decently well; Microsoft, whose OS runs on everyone’s hardware (including Apple’s nowadays), kicked everyone’s butt and is the biggest software company on earth; IBM, whose hardware got reverse-engineered and the OS of which is made by other people, bailed out of the personal computer business (they put their logo on machines made by another company).

Which is exactly my point. What has always interested me is the huge disconnect between what IBM was building and who they thought they were building it for, and the consumer market’s assumption of what was being built.

The assumption prior to the release of the PC had been that IBM, being the premier American computer company, would come out with the most kick-ass consumer desktop machine, and you should wait and not buy any of the also-rans. When in fact, IBM seems to have seen the development of microcomputer technology as merely providing the opportunity for an adjunct to their mainframe business. They (and all the other big hitters who later introduced clones) seem to have been caught with their pants around their ankles by the size of the consumer market.

A vicious circle, I believe. A lot of consumers were not buying a desktop because they were waiting for the established companies, rather than start-ups like Apple, to introduce one, and the established companies were not introducing them because they thought the market was small. Then IBM introduces what they think is their single-task microcomputer for small businesses and workgroups, the consumer market explodes, and suddenly you have Xerox, DEC, and HP bringing out clones instantly.

Many reviews of the original PC seemed to be operating on the assumption that since it was IBM’s, it MUST, de facto, be by far the best one available, which is arguable*. There were only a couple of lukewarm reviews of the PC in 1981, which were essentially saying, “This is it? I put off making the major investment of a desktop computer for THIS?” Not that it was a bad machine in any way, but it did not distinguish itself to the extent that people had assumed it would prior to its release.

*This assumption of excellence in the minds of many, without actually doing a side-by-side comparison, has now passed to Microsoft, IMO.

If I write this post with only 100% original ideas, not rehashing or repeating any concepts or viewpoints that you’ve ever heard before (yeah, I know…for the moment pretend that I’m actually able to do that), …

…you won’t understand it. None of you will. You wouldn’t have any frame of reference. No jumping-off point, no place to start out from that you already understand in order to follow where I’m going with the part that is new.

Good innovative software companies take some good ideas (oftentimes some that have never been brought together into one place), refine them a bit, implement them well, and put them into our hands and we say “wow”.

Microsoft occasionally does that. I personally think Excel kicked the butts of Lotus 1-2-3, VisiCalc, and any other spreadsheet program that had preceded it. Better than Quattro Pro, Claris Resolve, and the others that have tried to go toe to toe with it. For once, despite all the trinkets, doodads, bloatware features, and API spread that tend to flourish in product lines (esp. MS product lines) as they age, here’s one that you can still just launch and hop right into and start using; it has been and remains simple and intuitive, and the gewgaws and deelybobbers don’t get in your way.

Is Microsoft unusually bad about mostly buying out successful companies or copying what they did without innovating much? Well, yeah, in my opinion, they do indeed have such a track record. What’s more annoying is that they often buy out a good small company with a kick-ass product and then don’t do squat with it and it withers and dies. Run any FoxPro databases lately?

Did a lot of successful Microsoft products kill their competition mostly because of the Microsoft brand and/or the bundling with other MS products (including the OS), and not because of the superiority or innovativeness of the products? Absolutely. Gold Disk Astound, Aldus Persuasion, Lotus Freelance… truly inferior to the PowerPoint they went up against?

But yeah, sometimes they innovate, as much as any company ever does.

I know CTSS well, though I never used it. I did use Multics in '69, and was well indoctrinated in my OS class at MIT. Though IBM claimed to invent timesharing with TSS, I don’t think OS/360 can be considered to be a time sharing system.

[quote]

The need for a layer (there is no barrier other than knowledge to writing a program that makes a chip perform a function directly, with no OS) only comes about when you want to multitask. The first need for multitasking came about when timesharing of a large computer system became desirable in the late 1950s.

[/quote]

Do you mean multitasking or timesharing? Is a simple OS with SPOOLing a timesharing system? How about one that allows keyboard interrupts? The DOS we used for our old PDP-11/20 had both of these, but I’d hardly call it timesharing.

There are other reasons to put a layer between the user and the bare hardware. One is to provide the user with what we’d kind of call an API today - a system call making it simpler for the user to get access to hardware resources without knowing the protocols of everything - not least to keep them from screwing things up. These also provided some feeble measure of security, even when jobs were run one at a time.
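
To put that in modern dress, here’s a minimal C sketch of what going through that layer looks like - today’s POSIX interface, obviously, not a '60s mainframe’s, but the principle is the same: the program hands the request to the OS instead of driving the device.

[code]
#include <string.h>
#include <unistd.h>   /* write(): the OS-provided entry point */

int main(void)
{
    const char *msg = "hello from behind the OS layer\n";

    /* The program never touches the terminal hardware. It hands the
     * bytes to the kernel via a system call; the kernel knows the
     * device's protocol, checks permissions, and does the work. */
    write(STDOUT_FILENO, msg, strlen(msg));
    return 0;
}
[/code]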

[quote]

True dat. My only point was that the push for timesharing created the need for multitasking, and the desire for multitasking was the impetus for the idea of an operating system. Timesharing is simply the idea of letting multiple users run their processes on the same computer in a cognitively simultaneous way. Build a system to do this, remove the extra users, and you have a multitasking system, which is what an OS really is.

[/quote]

Well, I do agree that multitasking is a major benefit of an OS, even ones without timesharing. I’m not sure that I’d agree that timesharing was the major driver - at least in a commercial environment. I think not sitting around waiting for a print job to get done before you get a prompt was motivator enough! Remember that lots of people used minicomputers in the late '60s and early '70s, all of which had multitasking OS’s, but few of which had timesharing systems. Timesharing migrated down from academic and big mainframe types of systems to minis (like the PDP-11/70) to micros.
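
A modern toy illustration of that motivator - Unix fork() here, which of course isn’t what those minis ran, but it shows the payoff: the prompt comes back while the job grinds on.

[code]
#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();          /* split into two concurrent processes */

    if (pid == 0) {
        /* Child: the long-running "print job". */
        sleep(2);
        printf("print job finished\n");
        _exit(0);
    }

    /* Parent: back at the prompt immediately, no waiting around. */
    printf("prompt> (print job still running in the background)\n");
    wait(NULL);                  /* only so the demo doesn't exit early */
    return 0;
}
[/code]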

I didn’t think you were claiming that. I was just trying to say that there are many other, more important, motivators for OS development beyond timesharing. Back when I started, computers could hardly keep up with one person - why stick more users on them!

Aha. I was looking at it from an IBM perspective, not the view of the consumer market. (I was reading Datamation at the time more than Byte.) I agree with you from the consumer perspective. They were hungry for validation from a real computer company, not one run by hippies or gamers.

Remember, no one ever got fired for buying IBM. :slight_smile:

Nitpick: The latest edition of Operating System Concepts, dinosaurs and all, is sitting on my desk as we speak, and Tanenbaum isn’t listed as an author. It’s written by Abraham Silberschatz, Greg Gagne and Peter Baer Galvin.

I think DirectX can be said to be an innovation. DirectX is a set of standard functions for sound and video output, created for computer games. Before DirectX, games had to access the relevant hardware directly, which often led to compatibility problems because the game had to deal with many different hardware configurations itself. DirectX instead handles the different hardware itself, providing a more uniform interface for games. It certainly hasn’t solved all compatibility problems, but it’s a big step in the right direction.
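
The shape of the idea, in a hypothetical C sketch - this is not actual DirectX code, and every name in it is invented: the game codes against one interface, and each vendor’s driver fills in the functions behind it.

[code]
#include <stdio.h>

/* Hypothetical illustration of the DirectX idea, not real DirectX:
 * one uniform interface for the game, per-device code behind it.
 * All of the names here are made up. */

struct sound_api {
    void (*play_tone)(int hz, int ms);
};

static void sb16_play_tone(int hz, int ms)
{
    printf("[Sound Blaster driver] tone %d Hz for %d ms\n", hz, ms);
}

static void gus_play_tone(int hz, int ms)
{
    printf("[Gravis Ultrasound driver] tone %d Hz for %d ms\n", hz, ms);
}

/* The game only ever sees the uniform interface. */
static void game_beep(const struct sound_api *snd)
{
    snd->play_tone(440, 250);
}

int main(void)
{
    struct sound_api sb16 = { sb16_play_tone };
    struct sound_api gus  = { gus_play_tone };

    game_beep(&sb16);   /* same game code ...            */
    game_beep(&gus);    /* ... different hardware beneath */
    return 0;
}
[/code]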

Oops! I switched Silberschatz with Tanenbaum in my head for a second. :smack:

My bad.

I would definitely have to agree with you there. Score one for Microsoft!

In other words, the computer is providing UI responsiveness at the same time as overseeing the completion of the print job, thereby engaging itself in multiple tasks at the same time? :dubious:

How is that not multitasking?