Are we in the 'Golden age' of home computing?

In the past, computers were limited in what they could do, and expanding a computer's abilities was usually not a straightforward process: you had to deal with IRQ conflicts and the like. Not to mention that the parts to expand the computer's capabilities were expensive, on top of the price of the computer itself.

Now, with the development of USB, most things are truly plug and play. Others require inserting a CD, where drivers are automatically found and installed and the thing just works; the days of plug and pray are for the most part gone. Yes, I know Mac users will chime in and say they have had this for years, but Macs were a minor player and didn't have the other advantages back then, so read on.

Parts are cheap: I have seen digital cameras as low as $20-$30, and for that matter wireless cards, wireless routers, DVD burners, and a whole host of other computer accessories undreamed of a few years ago.

The internet and broadband connections have also opened up new computing options, including a whole new slew of software, and have spawned the open source movement, where people can download and use full-featured software, such as Open Office, at no cost, replacing the expense of commercial office programs like MS Office. Even first-rate computer protection programs, such as antivirus and firewall programs, are free for home use.

Still not there.

Computers are still open to a lot of improvement, and not just in terms of computing power. Personally, I think a clean break from backward compatibility is needed so we can fix, at minimum, email spam, viruses, and the like.

  1. USB can easily be improved upon. A standardized driver definition would allow drivers to be included as part of the device (i.e., you plug the device in and the driver comes through the cable), so they could be used by any OS that accepted a driver of that type. At the moment, USB mostly works because most devices (mice, keyboards, etc.) use a standardized set of signals, so a single driver can be used for all of them, or because Windows pre-packages the driver into the install. (See the first sketch after this list.)

  2. Character codes are a mess. If you want to support just Japanese, you need to handle EUC, JIS, S-JIS, UTF-8, and UTF-16 at minimum, plus CR+LF, LF, and CR line endings. And of course, the only way to determine which character code you are looking at is to run heuristics on the data and guess which is most probable. (Second sketch below.)

  3. Email is largely untraceable back to its origin, and anyone can write any information into the header without worry. (Third sketch below.)

  4. Webpage programming is a messed-up pile of doo. To accomplish anything, you need at least three languages: HTML, a server-side language like PHP, and a client-side language for DHTML. And all of those get combined into a single file of source code in a truly ugly manner. (Fourth sketch below.)

  5. Application installation is unstandardized and evil, and there is no method beyond file permissions or third-party applications to apply sandboxing.

Just to list a few things.
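
To make the USB point concrete, here is a minimal sketch that lists the class code each attached device reports. It assumes the third-party pyusb library (pip install pyusb) and a libusb backend; devices that share a class code, like HID for mice and keyboards, are exactly the ones a single generic driver can serve:

```python
# Sketch only: requires pyusb and a libusb backend to actually run.
import usb.core

# A few of the standardized class codes; 0x03 (HID) is why one generic
# driver handles nearly every mouse and keyboard.
CLASS_NAMES = {0x00: "per-interface", 0x03: "HID", 0x08: "mass storage", 0x09: "hub"}

for dev in usb.core.find(find_all=True):
    label = CLASS_NAMES.get(dev.bDeviceClass, "vendor-specific/other")
    print(f"{dev.idVendor:04x}:{dev.idProduct:04x}  class {dev.bDeviceClass:#04x} ({label})")
```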
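
On the character code mess, this is roughly what the guessing game looks like, as a deliberately naive sketch: real detectors such as chardet score byte-frequency statistics instead of just trying codecs in order, and several encodings will often decode the same bytes without error.

```python
# Try plausible Japanese encodings in order, keep the first clean decode,
# then normalize the three line-ending conventions to plain LF.
CANDIDATES = ["utf-8", "iso2022_jp", "shift_jis", "euc_jp", "utf-16"]

def guess_decode(data: bytes):
    for enc in CANDIDATES:
        try:
            text = data.decode(enc)
        except UnicodeError:
            continue
        return enc, text.replace("\r\n", "\n").replace("\r", "\n")
    raise ValueError("no candidate encoding decoded cleanly")

print(guess_decode("日本語\r\n".encode("shift_jis")))
# -> ('shift_jis', '日本語\n'): these bytes are invalid UTF-8 and invalid
#    ISO-2022-JP, so the heuristic falls through to Shift-JIS.
```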
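
And on email: every header field is just text the sender typed, as this minimal sketch with Python's standard email module shows (addresses hypothetical). Tracing a real origin depends on the Received lines added by each relay, and even those can be forged below the first honest hop.

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "president@example.gov"   # an unverified claim, not a fact
msg["To"] = "victim@example.com"
msg["Subject"] = "Trust me"
msg.set_content("Headers say whatever the sender wants them to say.")
print(msg)  # nothing in the message format vouches for any of the above
```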
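
And for the web mess, a toy CGI-style script (hypothetical, in Python rather than PHP, but the ugliness is the same): one file, three languages, with the server-side logic, the HTML structure, and the client-side JavaScript all tangled together as strings.

```python
#!/usr/bin/env python3
import datetime

visits = 42  # pretend this came from a database lookup

# Server-side language prints HTML, which in turn embeds JavaScript.
print("Content-Type: text/html")
print()
print(f"""<html><body>
  <p>Visit number {visits} on {datetime.date.today()}</p>
  <button onclick="alert('and a third language runs in your browser')">
    Click me
  </button>
</body></html>""")
```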

And really, there is still the possibility of “piped” computer access, where all you have in your apartment is a terminal into a central system that you pay a monthly bill on. But this will require connection speeds fast enough to make the experience feel instantaneous, and for nVidia and such places to start making 3D accelerators for mainframes (I imagine).

Basically, computers suck. And the sad part is that the situation is getting worse, not better.

Hardware has improved a lot and is getting to the point of seeming magical (this is where the OP started, more or less).

Software, though, comes to you in a half-baked form that needs to be patched and repaired on a regular basis. Security and privacy are the realm of prayer and blind optimism. Bloatware seems to be the norm and, if my circle of friends is any indication, people are less and less able every day to get decent results from something as basic as a word processor (I suffer the responsibility of playing tech support for them).

The upgrade treadmill runs like it is trying to stop a jet from taking off: you are constantly forced to upgrade software that works perfectly fine, until you reach the point where your hardware is no longer sufficient for it.

Give me my WordStar 2.0 back!

A friend of mine who works at IBM makes this analogy:

Computers right now are like motorcycles were in the 1940’s.

If you wanted to ride a bike with Marlon Brando’s gang in “The Wild One”, you had to pretty much be a mechanic. You had to be able to ground your own plugs in a parking lot in the rain somewhere.

Computers are like that now. Individual users have to be able to get under the hood and perform all kinds of maintenance because of the lack of standardization previously mentioned in this thread.

Some people like tinkering, and that’s fine. But for everyday users, things are still too complex. Too high a level of knowledge is required in order to make the computer a time-saving tool.

So, no. It’s not the golden age yet.

No way. I’d like to do things, and help others do those same things, that sound uncomplicated to me but are nowhere near being a reality, at least not in the near future.

OK how about the Bronze Age then :wink:

It will be more expensive in the future. Vista’s requirements will force some people to get new computers. Most apps will require faster computers and expensive video cards. Even the monitors may have to be upgraded.

Bronze Age sounds about right. Or maybe we are entering the Industrial Age: there is lots of noise and smog, there is crowding and there are problems, but a few people are working on standardization, which could mean we hit the Industrial Age proper soon, say in the next two or three decades.

Perhaps we are in the Computer Renaissance.

Oh, I should add that it is very possible, even probable, that home computing itself will never take off and become as standardized and ubiquitous as some people dream. Mobile computing is getting cheap and accessible.

I guess it depends on what you mean by home computers. Will a smart home system be considered a home computer? How about a home server that stores multimedia files such as “TiVo’ed” movies, music, pictures and the like, with nothing more than a few terminals in the house? What if the desktop computer dies an early death, and only those “mechanics” who like to play games have one?

Why have an expensive, power-hungry device that takes up room in the house if you have a 10", 1-3 lb notebook, mini-PC, or BlackBerry that has all the great functions of a PC for web surfing and the like? Why have home-based broadband internet service if you can pay 30-50 bucks a month for unlimited nationwide wireless access?

PDAs didn’t take off years ago, but that is no indication that future miniature computing devices won’t. Will that mean the end of home computers? Quite possibly. At least for the average person.

It’s not the golden age of home computing, but then I’m running Windows 98.

Here are a couple of things I’d like to see:

- the extinction of that form of software known as a “driver” (in favor of firmware built into the device, as mentioned above);
- the end of the long boot time: you should be able to switch a computer on and off just like you would a light;
- on the web side, I’d like to see web video much better integrated with the browsers, so you don’t have to deal with these external video players that work 50% of the time;
- the extinction of that piece of software known as a “firewall”; the idea that some remote computer can come in and muck with your machine, without explicit permission from your keyboard or mouse, is grotesque.

Here’s a question for another thread: is today the golden age of anything?

Interesting points… I have to say that I do consider notebook computers to be ‘home computers’ if and when they are used in the home, though PDAs and true miniatures aren’t. And there will be a market for a long time for computing that involves a full-size keyboard, a pointing device, and a monitor-type display screen, which won’t be different enough from ‘home computing’ as the category now stands to really count as something else. Sitting down at a terminal of the home server you describe also qualifies. It won’t be a desktop computer like we have now, because the guts of the computer will be hidden away somewhere rather than sitting next to the desk or under it (who actually has a home desktop computer ON their desk anymore?), but it will still be home computing.

Mini computers will come into their own alongside this, and they will both start to interface a lot more with entertainment media equipment, home appliances, and so on. :slight_smile:

Probably not. It seems the term “golden age” is used for nostalgic purposes. They say the “golden age” of cinema, but there have been improvements since (opinion-based, of course), so it certainly wasn’t the peak.

Now, if you use it to mean “a period of time in which great tasks are accomplished,” and you are speaking in relative terms with regard to the history of mankind, and you are talking specifically about America, then I would say that we are certainly living in a golden age. Sure, there are a couple of wars and some skirmishes, but I would wager that to a vast majority of people they are distant and remote, in region and in mind. We have luxuries we could only dream of in years bygone.

Things are certainly more complex, but that, IMO, is a good thing. YMMV.

And software makes a piss-poor firewall. Go hardware, or go square…

:wink:

I am old.

My first home computer cost £4,000 ($7,850).
If I remember right, it ran at 16 hertz.

So I guess today seems like Nirvana!

16 Hertz? What was it, an abacus?

So, has it finished booting yet? :smiley:

This one’s just because you’re on 98, I’m afraid. My XP laptop never gets rebooted until (a) I boot into Linux (unfortunately still a rare occurrence), or (b) I update Windows, at times of my choosing. All other times, it just goes to standby when I close the lid, and the time to restart is dictated purely by how long it takes for the hard drive to get back up to speed.

I think an expert abacus user can go faster than that.

It won’t be the golden age of computing until my father-in-law stops calling me to ask how to fix what he’s screwed up this time. It’s worse than it used to be, but I don’t know if that’s because XP is worse than 3.11 or because he’s older. It seems that it is easy for people who don’t know what they’re doing to screw things up; until that stops happening, computers will keep their bad reputation.

That’s only true if you’re using a scripting language like PHP. If you’re using Java or ASP, you get a much better separation of your business logic (in Java, C#, or VB) from your presentation (in JavaScript and HTML).
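
For what that separation buys, here is a minimal sketch of the same idea in Python (illustrative only; the names and the toy template are mine, not any framework’s API). The business logic computes plain data and never touches markup, while the presentation is an inert template with substitution slots:

```python
from string import Template

def total_price(items):
    """Business logic: pure data in, pure data out; no markup anywhere."""
    return sum(price for _name, price in items)

# Presentation: a dumb template with slots and no logic inside it.
PAGE = Template("<html><body><p>Your total is $$$total</p></body></html>")

def render(items):
    return PAGE.substitute(total=f"{total_price(items):.2f}")

print(render([("mouse", 19.99), ("keyboard", 24.50)]))
```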

OK, now I’m going with the Iron Age of computers, as the vast majority have access to it and can do things with it - as opposed to the Bronze Age, where only the rich had access.

That’s OK, an ‘age’ can end at any time.

The problem with anything replacing the home computer is the interface: so far there is no good practical replacement for a large monitor, a keyboard, and a mouse. Those three items are really what the home computer is about.

I do suspect that true voice recognition will bring in a whole new level of computing, and make devices like car-based GPS units much easier to use, but the GPS is going to have to understand people saying things like the following (a toy sketch of the parsing problem comes after the examples):
“GPS, I’m going to the store”
“GPS, take me to Grandma’s”
“GPS, No don’t take me on that f’en road again”
“GPS, I’m going home, but need to stop off at a post office on the way”
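
As a toy sketch of why that is hard (the patterns are entirely hypothetical): naive keyword matching can route the first couple of commands, but “don’t take me on that road again” needs trip history and context that no pattern table can supply.

```python
import re

# Hypothetical command table: pattern -> canned navigation action.
INTENTS = [
    (re.compile(r"going to the store", re.I), "navigate: nearest grocery store"),
    (re.compile(r"take me to grandma'?s", re.I), "navigate: saved place 'Grandma'"),
    (re.compile(r"going home.*post office", re.I), "navigate: home, via a post office"),
]

def interpret(utterance: str) -> str:
    for pattern, action in INTENTS:
        if pattern.search(utterance):
            return action
    return "sorry, I didn't catch that"  # the genuinely hard cases land here

print(interpret("GPS, I'm going to the store"))
print(interpret("GPS, no, don't take me on that f'en road again"))  # unhandled
```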

But these things may just diminish the use of the home computer, not eliminate it.

I have a projector instead of a monitor, which doubles as my TV, since my computer has a video-out jack as well as a DVD player in it, and I use an AlphaGrip, so I have neither a mouse nor a keyboard.

One future of the computer is a setup like this, where our TV, gaming console, and computer are merged into a single device (ostensibly a computer).