Why is modern software so damned big?

Undoubtedly spyware for DHS, CIA, FBI, and the BMGF*.

*Bill and Melinda Gates Foundation.

Well, I beg to differ. I ran XP Pro happily on a 300 MHz AMD K6-3 with 192 MB of RAM until Service Pack 2, after which the antivirus program had to go. A few updates later it became totally unusable, so slow that it appeared to have hung.

I was able to “down”grade it to Win 98 SE and it seemed like a new computer.

Perhaps XP home is different.

I suppose I should have said that this didn’t involve antivirus software :eek: (which I abandoned about four years ago, relying instead on firewalls, Firefox, frequent backups, and common sense), and also that I was only using it with a single-user setup. But it was with SP2 and a fair few subsequent updates.

Admittedly, that computer is now a Debian box, running a lot faster!

How many programmers does it take to change a lightbulb?

Maintaining such huge and complicated code bases is not a trivial task. Software engineers have studied this problem far more carefully than you have, and the economics of it are often not very intuitive.

Another question:

why are there so many programs that say they’re “1M” in the list of programs but manage to clean out “1G” if I uninstall them? Is it because Windows is seeing only the “main” program and most functions are in other files?

Ahhh. Exactly what I was thinking. I read your specs and thought “a perfect candidate for Ubuntu.”

I have a similar machine that has been my home Linux file server for a couple of years. It ran 98 pretty well, but would have been quite painful with XP.

Are you a programmer? Things that are very simple for a human to do can be quite difficult on a computer.

Same thing with Web sites. In the mid-1990s, best Web design practice called for pages that were no larger than about 25K to 50K, including graphics, so download times on dial-up lines were reasonable. Web designers considered that users generally had multiple windows open on their computer, and that, like other applications, browsers usually weren’t opened to fill the entire screen. Web designers were urged to make pages less than 480 pixels wide - the default width of a Netscape 1.0 window.

Now, with near-universal broadband service, Web pages are much heftier, and they often take as long to download as those on a plain-Jane 1996 site accessed from a dial-up connection. “Fluid pages” are increasingly a thing of the past, the vast majority of Web surfers surf with browsers in full-screen mode, and the standard minimum page width is now 800 pixels. Many design for a minimum width of 1080 pixels. The days of using a Web browser in windowed mode are drawing to a close.

I get a trifle disgusted at times with all the useless goodies but since everyone does it, I’m stuck with it.

For example, when Windows XP boots up I have no need for a voice telling me that Windows XP is now booting up. When I transfer a file from the net to my computer I don’t need a picture of sheets of paper flying across the screen from one place to another. My first computer at home was a Tandy with an 8 MHz processor, and it booted up with breathtaking speed compared to the 2 GHz systems of today.

And I really miss the old default option of not downloading pictures automatically. Think of how many pop-ups you could avoid seeing if you had that as a default.

You think this is bad? Try installing an HP printer! 850MB just to be able to print.

I’m a software developer, occasionally developing for Windows and *nix, and I spend almost zero time worrying about the size of the software I’m writing. And I spend very little time worrying about the memory it takes, or the speed it runs at. It’s simply not a priority.

Partly that’s because it doesn’t drive sales. No one considering our products is going to choose a competitor’s product because it takes 10% less disk space. No one considering our product is going to choose a competitor’s because ours uses 20 MB of memory and theirs uses 5 MB. No one will call our support line and complain about either of those. But they will choose based on features. And they will choose based on (past experience with) bugs.

I sometimes refactor code in a way that makes it smaller, but I do it because it makes the code more modular and easier to expand in the future, not because it might save a few megs of disk space. The reason is that refactoring code is risky. It introduces bugs. Deleting large sections of crazy logic in favor of a simplified and extensible model will almost certainly screw up a few weird cases (which is part of the reason the logic is so crazy in the first place). Hell, it’s not that uncommon to introduce/uncover bugs just because you’ve made something more efficient. Sure, those bugs were already there, somewhere, but they weren’t problems until the code was rewritten.

Basically: If it ain’t broke, don’t fix it.

Given the knowledge that nearly every piece of software has known bugs, it is a much better use of my time to fix those bugs and add new features than it is to optimize something that doesn’t really need optimizing, and possibly introduce new bugs.

I’ve never encountered the “bootup voice”, but the “flying sheets of paper” is a good thing. It’s a progress indicator to let you know that the task is in fact going on. If there were no indicator and the file transfer were taking a long time, you might think that the process was hanging. Unfortunately, the “flying paper” indicator itself is just to show you that something is happening, and doesn’t display the actual degree to which the file transfer has progressed. So some progress dialogs will also display information about the amount of data that has been transferred, or provide an estimate of the time remaining.
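For what it’s worth, the bookkeeping behind such a dialog is simple. Here’s a rough Python sketch (the names and output format are invented for illustration, not how Windows actually implements it) that reports bytes copied, percent complete, and a crude time-remaining estimate:

```python
import os
import time

def copy_with_progress(src_path, dst_path, chunk_size=64 * 1024):
    """Copy a file in chunks, printing bytes copied, percent done,
    and a rough estimate of the time remaining."""
    total = os.path.getsize(src_path)
    copied = 0
    start = time.time()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while chunk := src.read(chunk_size):
            dst.write(chunk)
            copied += len(chunk)
            elapsed = time.time() - start
            rate = copied / elapsed if elapsed > 0 else 0.0
            remaining = (total - copied) / rate if rate > 0 else 0.0
            percent = 100.0 * copied / total if total else 100.0
            print(f"\r{copied:,} of {total:,} bytes ({percent:3.0f}%), "
                  f"about {remaining:4.0f}s left", end="", flush=True)
    print()  # finish the progress line once the copy is done
```

The time-remaining figure comes from dividing the bytes left by the average transfer rate so far, which is exactly why such estimates jump around when the rate changes.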

That depends on the product and on the users’ machines. If a piece of software requires 300 megs of memory and I only have 256, no, I’m not going to buy it. I’ll either buy the more efficient competitor’s product, or I’ll do without, because a program which requires 300 megs would be worthless to me.

What if the “more efficient” product costs twice as much?

All very true, but irrelevant to Windows, because the paper will often keep on flying even though there’s no data being transferred, and because the time estimates are about as accurate as Vista release dates.

What complexity?

Pseudo-code to accomplish this would be:

  • save the formatting info at the start of the paragraph.
  • at end-of-paragraph, compare it to the formatting info for the next paragraph.
  • if the same,
    • don’t insert any formatting info.
    • proceed to the next paragraph.
  • else if not the same,
    • insert the closing formatting info on the last paragraph.
    • insert the new formatting info for the new paragraph.
    • save that new formatting info.

That’s not very complex at all – handling a standard page break when printing a report requires more effort.
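To make that concrete, here’s a minimal Python sketch of the same logic (the paragraph and formatting representations are invented for illustration; a real word processor’s file format is of course far more involved):

```python
def write_paragraphs(paragraphs, out):
    """Write paragraphs, emitting formatting codes only when they change.

    Each paragraph is a (formatting, text) pair, where `formatting` is any
    comparable value, e.g. a string like "Times,12pt,bold".
    """
    current = None
    for formatting, text in paragraphs:
        if formatting != current:
            if current is not None:
                out.write("[end " + current + "]\n")   # close the old formatting run
            out.write("[begin " + formatting + "]\n")  # open the new formatting run
            current = formatting                       # save the new formatting info
        out.write(text + "\n")
    if current is not None:
        out.write("[end " + current + "]\n")           # close the final run
```

Fed three paragraphs in a row with the same formatting, it writes the formatting markers once rather than three times; that’s all the “complexity” amounts to.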

P.S. And I do know something about this: I’ve been doing computer programming for 30+ years now, starting with COBOL-1968 on a Control Data machine.

In a lot of cases, though, the potential customer will choose to upgrade his machine. After all, the benefit of having more RAM can be spread out among lots of different applications he hopes to run. In the particular case of my company, it’s a given that the customer will have plenty of budget to spend on more computer. Our software suite (of which my work is a small part) starts at several thousand dollars per seat. Our customers are not going to skimp on RAM. It would be more important for small, cheap software, although I still doubt that it’s usually important enough to be a significant competitive advantage.

In other words, you’re relying on the customer absorbing a hidden cost?

My Windows XP comes on and says something like “Internal self test completed. Windows is now booting from the operating system.” And I agree that some indicator that the commanded task is taking place is needed, but does it have to be a movie? Why not simple text: “File being transferred”? Seems to me like it would take less program code.