How fast do chips have to get until it doesn’t take for FUCKING EVER to Boot up & Run?

Last week my employer’s IS dept added some new “administrative software”. Net result, we now long to get HOME to our 56K dialups, because it’s FASTER than the T1s at work.

Every major office software upgrade adds 1 or 2 functions I like, takes away 3 or 4, is 30% bigger, and slightly SLOWER.

I’ve owned an IBM compatible since 1985, and other than image processing (which, to be fair, they weren’t designed for in the first place) their performance hasn’t noticeably improved.

Why? Because the fucking SOFTWARE makers INSIST on using EVERY clock tick and every single bit of memory to do MORE frou-frou pointless stuff.

Take a fuckin’ BREAK, kids. How about in your next upgrade, you JUST fix bugs, or compact down your code to make it quicker and SMALLER? How about advertising, “Uses 15% less drive space, and loads 30% faster”? If I liked the program, I’d buy the upgrade even without a SINGLE “NEW” “capability”.

The hardware folks seem to be doing their jobs. I really have no complaints against THEM.

SOFTWARE WRITERS, LISTEN UP! Smaller’s Better! Add some new thing LATER, once you get the old stuff tuned and hoppin’!

Don’t get me started on not having access to a Command Line at work. It would break my heart.

I concur. I especially like the generic Windows “Let us arrange your files so they start faster.” How about you quit with the fucking bloatware?

Sweet Jesus, tell me about it. When I first got this stupid laptop, it took ten minutes to boot up. Now that I have god-knows-what programs trying to run themselves on startup, it takes my laptop fifteen minutes to boot up. Fifteen fucking minutes!!! If my school wasn’t making me lease this damned thing, I’d fling it out the window and go buy a Sony or something. Not that the time it takes for the infrared monitor, the anti-virus software, AIM, and probably whatever spyware makes it onto my computer wouldn’t just slow it down again.

Oh yeah, and I forgot to mention, how many programmers have the INCREDIBLE ARROGANCE to write their installation routines as if my computer will be dedicated to running their program and ONLY their program.

They will write changes all over my system files without telling me. Their deinstall routines don’t restore my system files to their original state. They insert themselves into my STARTUP routines, goddammit (and I’m a fuckin’ ATHEIST)!!!
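For what it’s worth, if you’re on Windows and want to see exactly what has wormed its way into your startup, the usual suspects are the auto-run keys in the registry. Here’s a rough Python sketch that just lists them; the registry paths are the standard ones, but everything else about it (including the idea of doing this in Python at all) is purely for illustration:

```python
import winreg  # Windows-only; part of the Python standard library

# The two classic auto-start locations that installers love to write to.
RUN_KEYS = [
    (winreg.HKEY_CURRENT_USER,  r"Software\Microsoft\Windows\CurrentVersion\Run"),
    (winreg.HKEY_LOCAL_MACHINE, r"Software\Microsoft\Windows\CurrentVersion\Run"),
]

for root, path in RUN_KEYS:
    try:
        key = winreg.OpenKey(root, path)
    except OSError:
        continue  # the key may not exist on this machine
    print(path)
    i = 0
    while True:
        try:
            name, value, _type = winreg.EnumValue(key, i)
        except OSError:
            break  # no more values under this key
        print(f"  {name} -> {value}")
        i += 1
    winreg.CloseKey(key)
```

At least then you know what to go delete.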

Mark my words, it’s not going to get better until we’re so reliant on computers that these kinds of “glitches” start killing a lot MORE people than they undoubtedly already HAVE.

You know, I’m as much against a 29 million dollar settlement for a coffee spill as the next guy, but I predict that American product liability litigators will someday tear Bill Gates his 3rd or 4th asshole (depending how you count).

That’s why the client I’m working with is taking away the regular systems and switching everyone to Citrix. Instant boot on the terminals, they control the software installation/bugs from a central point.

Oh Crikeys. I use Citrix from home. Explain to me how it’s going to make MS Office Suite run better? Explain where my macros go when I write them from home. Not only do they disappear, but if I edit existing ones through Citrix, they are corrupted and don’t work when I return to my real desktop.

So now, not only don’t I have a real command line, I don’t even get a real DESKTOP?

Oh yeah. Maybe you’d say the IS department didn’t implement Citrix properly, or take into account what might happen with files edited or programs run through BOTH Citrix and a normal network interface. Perhaps you’re right.

But those same Einsteins would be the folks who, “…control the software installation/bugs from a central point.”

Someone deleted a folder containing an important database from our department drive. I said explicitly on the work order to restore BOTH the folder and the .mdb file. The IS guy fucked around for a week before calling me because, “I couldn’t find the right folder to restore the file to”.

For most companies, the result would be replacing the overworked and mismanaged programmers at the megacorporate level with the merely incompetent at the local level.

WHICH FUCKIN REMINDS ME! IS departments are supposed to SUPPORT OPERATIONS, not build ROBOTIC EMPIRES. Only ONE place I’ve ever worked had so much as a SINGLE IS person who had any knowledge whatsoever of the programs they “support” at the operational level. EVERY PLACE I GO, “support” REALLY means “reinstall,” and refer the user to a website or external vendor if that doesn’t work.

Bloatware? MS bashing? Sorry, but after I upgraded from Win98 to XP, my computer went from loading up in a minute and a half to loading up in 15-20 seconds. If anything, Microsoft is on the ball in regards to load time.

Well, maybe things are different in Office XP… I haven’t bothered installing that.

As for other things… why make programs smaller? The only point behind making programs smaller would be if hard drives weren’t continually getting bigger. Programmers can make their programs larger because the average computer is getting more and more system resources to use. There is no point in making programs smaller just for smaller’s sake.

I mean, what would be the point of having a 160 gigabyte drive if the largest program you own only takes up five megabytes?

Of course, there’s no excuse for software making a T1 connection run slower than a 56K. That’s either a result of a bad program (which has nothing to do with its size), or there’s some sort of incompatibility somewhere, or perhaps this “Administration software” deliberately limits the connection speed of each individual computer.

The trick is to not shut down your computer at all – just let it sleep when you’re not using it, and you can wake from sleep in under a second.

Okay, us Mac users can, anyway…

Ah! rjung!

I wonder - is bloatware exclusive to PCs? The new Photoshop 7 (while I LOVE it - LOVE LOVE LOVE) does seem to take longer to boot, and the “save for web” function is slower. (So far that’s all I’ve noticed, but I haven’t explored every nook and cranny of PS 7 yet.) Of course, it just came out, and Adobe is (apparently) working on an update that will be less taxing on system resources. I’m on a G3 500 Mhz with 768 megs of RAM, and I’m giving PS 7 about 300 megs of RAM. Maybe I should throw more RAM at it. What do you think? And, would PS be considered “bloatware”? I don’t know if it is - I do love all the new features. (The new “brush engine”! Wow!)

Calm down, calm down. Remember: you are a SDMB poster. You have values. Intelligence.

Computer startup depends on hardware, software, firmware, and CMOS configuration. And on OS, OS configuration, etc.

First, every PC engineer wants to make their part look fast, slick and wonderful. That means, whenever there’s an improvement in the CPU, everybody wants ALL of it to make their part look better.

Second, corporations have no regard whatsoever, in any respect (don’t give a shit), about the performance of their employees’ computers. Their goal is, above anything else, to CONTROL what employees do. If it takes twice as long, so be it. You work for them. You will wait when they tell you. Not before. Not after.

…ahem…to continue… On your computer, depending on how clever your company has become, are several pieces of software that watch anything you do that the company is interested in. Speaking from much experience. What files you run, what Web sites you visit. And especially, when you login and logout. I.e., did you get to work on time?

Don’t rail about performance, because that has nothing to do with what companies want from you and your computer. Complain and they’ll utterly ignore you.

Because the entire point of engineering is supposedly to design the simplest, most efficient solution that uses the least resources possible to get the job done correctly in a reasonable amount of time?

Yes, it is possible to have hundreds of gigs of storage space (or hell, why not a TB?) and four gigs of RAM, but none of that is really useful if every fucking software suite on the market attempts to suck up all of it for its own memory or virtual memory space.

That kind of resource wasting is the thing that makes me the most frustrated with my fellow engineers. Just because it’s there, that doesn’t mean it has to be used up. Resource wasting (and it is a waste if it doesn’t seriously improve the functionality of the program) is a major, major problem with new software.

This ‘well we have the resources’ line is just an excuse for software engineers and programmers to write sloppy programs with serious memory leaks and not give a damn about their shoddy work.
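To make that concrete, here’s a toy Python sketch (nobody’s actual product, and `expensive_compute` is just a stand-in) of the kind of slow leak that comes from never letting go of anything, next to a bounded version whose memory use stays flat no matter how long the program runs:

```python
from collections import OrderedDict

def expensive_compute(key):
    # Stand-in for whatever real work the program does.
    return str(key) * 100

# Unbounded "cache": every result is kept forever, so memory use only grows.
_results = {}

def lookup_leaky(key):
    if key not in _results:
        _results[key] = expensive_compute(key)
    return _results[key]

# Bounded version: keep only the most recently used N entries.
_bounded = OrderedDict()
MAX_ENTRIES = 1000

def lookup_bounded(key):
    if key in _bounded:
        _bounded.move_to_end(key)        # mark as recently used
        return _bounded[key]
    value = expensive_compute(key)
    _bounded[key] = value
    if len(_bounded) > MAX_ENTRIES:
        _bounded.popitem(last=False)     # drop the oldest entry
    return value
```

The leaky version is a few lines shorter, and nobody notices the difference until the machine has been up for a week.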

Hrrm, I’m not having any problems…

I have about 15-20 things in my start menu and I still boot in under 30 seconds. Hell, it takes 40 seconds to shut down and restart :)

My only problem is AIM.

That fucking program… I’ll be doing something else and put on an away message so I don’t get yanked out of whatever program I’m using, which ALWAYS results in a crash since AIM is a piece of goat felch.
Well, after 5 min AIM crashes… yanks me out of the damn program so I can watch it autologin… which crashes again, making me click the sign-on button… then I go back to work, and again AIM yanks me out so I can see its popup news window, which I can’t shut off. By this point whatever program I was running to get stuff accomplished has died. So I log off, close AIM, rescue my file, and get back to work. 5 min later… AIM IS RUNNING AGAIN! What is it with this piece of rancid dogcum!

Re XP: I concur. Running it now at home, and it boots extremely fast. Of course it helps that I have the fastest hard drive I could get my hands on (the speed of the drive being, in my experience with Windows, a lot more important than the speed of the chip), lots of memory (256 MB, the second most important thing to the speed of Windows, in my experience), and a 1.4 GHz chip (still important, of course). And a fast motherboard too.
At work, my computer was recently upgraded. It runs NT, and also boots pretty fast, though not as fast as my home computer.

Yes, but there comes a point of diminishing returns, when the effort to make something “smaller” and “more efficient” becomes both incredibly time-consuming and largely unnecessary. Why delay the release of a program by three weeks just so you can trim fifteen kilobytes off of it?

So it’s either “Use next-to-none of the system resources” or “use ALL of them” with you? Whatever happened to the idea of “the middle ground”?

I don’t know about you, but none of my programs - PSP7, Word, Premiere, Winamp, etc. - use “all of my system resources”. And I’m hardly on a resource god machine.

Seems like y’all are pointing to a tiny minority of lousy programs and thinking them to be the norm.

Then there’s the opposite mindset… if it’s there, why not put it to use?

If we lived in a world where you can run five web browser windows, Photoshop, Word, Dreamweaver, Premiere, AIM, Winamp, and three games, all on a 66 Mhz chip with 16 megs of RAM, who would want to buy better hardware?

The software industry keeps pace with the hardware industry, it’s as simple as that.

Or, conversely, it’s an excuse for them to not waste massive amounts of time for a minimal amount of gain. Again, I ask you, why bother wasting weeks of time just to scrape out a couple of excess kilobytes?

Something tells me, SPOOFE, that you don’t write code for a living. We’re not necessarily talking about making code smaller, but about making it better. Sometimes knocking a couple of K off of a program can help enormously because you’ve decided to run your loop in a different way. Or you’ve cut two lines down to one to eliminate a system call. Or you’re pulling things into memory using three lines instead of using file I/O, which takes five. All of these things can make a massive difference in how well code runs.
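Here’s a toy Python version of that last point (the file name and chunk size are made up, and the real programs in question obviously aren’t written in Python). Both functions count the lines in a file, but one makes an I/O call for every 64 bytes while the other pulls the whole thing into memory in a single read:

```python
import time

PATH = "big_log.txt"   # hypothetical large text file

def count_lines_chatty(path):
    """Many tiny reads: one I/O call per 64-byte chunk."""
    lines = 0
    with open(path, "rb", buffering=0) as f:   # unbuffered, to make the point
        while True:
            chunk = f.read(64)
            if not chunk:
                break
            lines += chunk.count(b"\n")
    return lines

def count_lines_buffered(path):
    """One big read, then all the work happens in memory."""
    with open(path, "rb") as f:
        return f.read().count(b"\n")

if __name__ == "__main__":
    for fn in (count_lines_chatty, count_lines_buffered):
        start = time.perf_counter()
        n = fn(PATH)
        print(f"{fn.__name__}: {n} lines in {time.perf_counter() - start:.3f}s")
```

Same answer, same file, wildly different number of trips through the OS.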

A good rule of thumb is that shorter code isn’t guaranteed to run faster, but longer code is pretty much always guaranteed to run slower.

Just because memory and space aren’t expensive any more doesn’t mean that code shouldn’t be smaller, better, and more elegant. A key reason for code bloat is that hardware is cheap. At some point the wasted time of the user needs to be factored into the “total cost” of the application.
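A back-of-the-envelope version of that “total cost” point, with every single number made up purely for the sake of the arithmetic:

```python
# All figures below are assumptions, for illustration only.
extra_seconds_per_launch = 30      # extra load time blamed on bloat
launches_per_day = 10              # per user
users = 2000
workdays_per_year = 250
hourly_cost = 40.0                 # loaded cost of an employee-hour, in dollars

wasted_hours = (extra_seconds_per_launch * launches_per_day
                * users * workdays_per_year) / 3600.0
print(f"Hours lost per year: {wasted_hours:,.0f}")                           # ~41,667 hours
print(f"Cost at ${hourly_cost:.0f}/hr: ${wasted_hours * hourly_cost:,.0f}")  # ~$1,666,667
```

Thirty seconds here and there adds up to real money long before anybody buys a bigger hard drive.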

…and again I find myself screaming “elegance through simplicity” at the product development team. If I’ve gotta find and diagnose the bugs, for the love of Cecil, don’t point with pride at an obfuscated routine and comment on how elegant it is. No. No. No. It’s not. It’s kludgy.

Elegance
through
fucking
SIMPLICITY.

Being a programmer, my comment is that the simpler and more logically structured a program is internally, the simpler and more logically structured the user interface is, usually. I’ve always found the correlation between the two to be extremely high.
Such programs also tend to use less memory and to run faster, IMO.

And I haven’t said anything to disagree with that. I’m just saying “smaller isn’t always better”.

And other times, I’m sure, knocking a couple of K off of a program won’t do a whole hell of a lot. There are two sides to this whole issue… why is one of them being ignored?

And at the same time, the wasted time of the programmer needs to be factored in, too. At some point in the streamlining process, you get a program that runs quite well on the vast majority of machines, and this would allow those that produce such programs to focus more on perfecting applications and adding new features rather than keeping things trimmed down.

To use an artistic analogy… the larger your canvas is, the more you can put in your painting.

Now, this is NOT to say that programmers should be sloppy with their work… as always, well-written code would, obviously, be preferable. But as resources become more and more plentiful in the average machine, that gives programmers more leeway on how they construct their programs. And I DON’T need to be a programmer for a living in order to see this.

WRONG.

The entire point of engineering is to make a profit for the company you work for.