Dog years analogy for computers?

We are likely all familiar with “dog-years.”

This might be off-the-wall, but I was wondering: would it be at all possible to apply the same principle to computer hardware and software?

For example, I have a 2000 Dell with Win 98SE. (If you want tech specs, let me know.) My PC may be 5 years old in calendar years, but how old is it in comparative terms, measured against the progression of contemporary technology?

I’ve heard it said, many times, that the moment you buy a new PC, it’s already “obsolete.” I don’t doubt the implication. I would just like to know exactly what degree of obsolescence any given computer has reached at any given time. (e.g. Does anyone else even use Win 98 anymore? Is Win 98 “Jurassic”?)

“Dog years” is an analogy to “human years”. So you can know that a 10-year-old dog is sort of like a 50-year-old human.

In order to do the same, you need something to compare the computer with. Compared to cars? TVs? What?

Without the analogy aspect, a 5 year old computer is a 5 year old computer.

Compared against what? Dog years are in terms of people years - I don’t see anything comparable.

The important things are the apps you are running. If you want to run the latest games, the Win98 system has two feet in the grave. If you are doing word processing and light web browsing, it is fine. I have a Win 98 machine from 1998, which is working okay, if a bit slowly. I’ve just migrated most of my work to a newish laptop. The biggest problems I’ve had are software and firmware that don’t quite work, and apparent hangs from software doing something that takes forever. (And it is not the fan or overheating - that I’ve checked.)

On preview, what ftg said - though I thought it was 7 human years to a dog year.

OK, let’s use an example:
Could you not compare, in terms of general use (e.g. speed, bug fixes, crash resistance), how outdated Windows 98SE is compared to Win XP?

I don’t really understand the question; it’s about 7 years outdated (2005 - 1998).

Well, it is a difficult question to ask in a way that another will understand, so I’ll give you that.

It’s just that it seems technology is advancing so rapidly, at present, that you buy the fastest whatsit out there, but 60 days later they come out with a new one that makes your 60-day-old one seem years old in terms of efficiency, speed, capacity, or whatever. (Three examples that come to mind are hard drives, RAM, and Intel processors.) Hope this helps illustrate.

I suppose some of this relies on individual perception, but then we’re possibly heading for GD territory. I don’t want that.

But does it really seem years old? The things that actually are years old are even slower.

As an aside, I remember reading a discussion about the depreciation rate used for writing off computer equipment being unrealistic. Under US tax law you can deduct the cost of the computer over 5 years. People were complaining that computers were pretty much useless before the 5 years were up and that the rules should be changed to more accurately reflect the true lifetime of computers.
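
For what it’s worth, here’s a minimal sketch of that 5-year write-off, assuming a simple straight-line schedule with equal deductions each year (the actual IRS schedule is more involved, and the $1500 price is just a made-up example):

```python
# Straight-line sketch of deducting a computer's cost over 5 years.
# Real US tax depreciation schedules are more involved; this just
# illustrates the 5-year write-off period the complaint is about.

def straight_line_deductions(cost, years=5):
    """Split the purchase cost into equal annual deductions."""
    return [cost / years] * years

print(straight_line_deductions(1500))  # [300.0, 300.0, 300.0, 300.0, 300.0]
```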

Devices for home use are designed to last about 7 years. Some big name computer vendors (like Dell) will occasionally skimp on the CPU cooling to save a few bucks. It makes the computer marginally cheaper than the competitor, but it cuts the expected life down to about 5 years. Most folks upgrade every 3 or 4 years, so they figure no big deal.

Figuring humans last about 70 years and computers last about 7, your computer would be 50 years old in human years. Computers don’t really age like living things do. Generally they work perfectly (no joint pain, memory loss, or thinning hair) right up until the day they go to the great big bit bucket in the sky.

Microsoft supports operating systems for 5 years. As far as Microsoft is concerned, if you have anything older than 5 years you should upgrade it to the latest hardware and software. A lot of folks in the industry share this belief. Some don’t. Personally, I still use Quicken for DOS. It still does everything it did under DOS 4.0, and by some miracle it actually runs on XP in a DOS window, so in my opinion it’s not obsolete. By Microsoft’s philosophy, your computer already has one foot in the grave, and your OS is a 105-year-old guy who just refuses to die.
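
If you want to put actual numbers on the analogy, here’s a quick back-of-the-envelope sketch. The lifespans are my rough assumptions (70-ish years for a human against a 7-year hardware life, and about 75 years against the 5-year OS support window):

```python
# Back-of-the-envelope "computer years to human years" conversion.
# The lifespan figures are rough assumptions, not official numbers.

HUMAN_LIFESPAN = 70        # years, for the hardware comparison
HARDWARE_LIFESPAN = 7      # years a home PC is designed to last
HUMAN_LIFESPAN_VS_OS = 75  # years, for the OS comparison
OS_SUPPORT_LIFESPAN = 5    # years Microsoft supports an OS

def hardware_age_in_human_years(age_years):
    """Scale a PC's age by the human-to-hardware lifespan ratio."""
    return age_years * HUMAN_LIFESPAN / HARDWARE_LIFESPAN

def os_age_in_human_years(age_years):
    """Scale an OS's age by the human-to-support-window ratio."""
    return age_years * HUMAN_LIFESPAN_VS_OS / OS_SUPPORT_LIFESPAN

print(hardware_age_in_human_years(5))  # a 2000-vintage PC in 2005 -> 50.0
print(os_age_in_human_years(7))        # Win 98 in 2005 -> 105.0
```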

Computer processing power doubles approximately every two years. That brand-new system starts to get long in the tooth around three years old, is unable to handle current-day software after six, and is all but obsolete around ten years old. These days computers are running at 2 or 3 gigahertz - ten years ago they were running at 90 megahertz. Personally, I’d say that a computer year is about 10 to 12 human years.
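
That doubling rate actually lines up with those clock speeds. A rough sketch, using clock speed as a crude stand-in for processing power (a simplification, since architecture matters too):

```python
# Rough Moore's-law-style projection: performance doubles every
# two years. Clock speed is used as a crude proxy for power.

def projected_speed_mhz(base_mhz, years, doubling_period=2):
    """Compound the doublings over the given number of years."""
    return base_mhz * 2 ** (years / doubling_period)

# A 90 MHz chip from 1995, projected ten years forward to 2005:
print(projected_speed_mhz(90, 10))  # 2880.0 MHz, i.e. about 2.9 GHz
```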

By the way, I have 2 computers at home that run 98. One is used exclusively for games. It also boots 2000, but very few games I have will run under 2000, so it spends almost all of its time in 98. The other computer runs my CD burner (a bit slow, but it works) and several electronic design packages that won’t work under 2000 or XP.

You’re not alone.

Thanks for understanding my question, Punoqllads! :smiley:

Interesting info, engineer_comp_geek! Thank you.

Dog years are a correlation of lifespan and stages of maturity compared to humans. A computer doesn’t have a comparable maturation process. It’s a bad comparison.

Computers compared to cars is a better analogy. Buy a computer and you’ll find it depreciated $1000 the moment you took it home.

My mother-in-law runs Windows 95. Just thought I’d share…

Time for you to upgrade to a new mother-in-law.

Or go open source.