Are people becoming more tolerant of tech mediocrity?

(partially inspired by [thread=370417]this thread[/thread])

Are we, as a society, becoming more accepting of flaky and misbehaving equipment? I get the impression we are. A few examples:

[ul]
[li]I played my son’s new Buffy the Vampire Slayer videogame with him for a while, and it is rife with glitches. Weapons pass through objects, players can pass partway through walls, and other odd artifacts abound. This didn’t strike my son as odd; he said it’s true of most videogames. I used to program graphics software, and a glitch like that would have stopped production immediately. Granted, it wasn’t realtime software, but isn’t correct handling of object collision more important than shadows and reflections?[/li]
[li]My DVR freezes up and requires rebooting several times a week. According to Dish Network tech support, “that just happens sometimes.” A decade ago, would we have tolerated a VCR or TV that froze up regularly?[/li]
[li]I had a car 25 years ago with a gas gauge that worked beautifully. How come all three of the vehicles I own today (and the rental I had last month) show “full” after driving 50 to 100 miles, and display non-linear behavior in the middle of the range?[/li]
[li]Ten years ago, I was running a network of about 75 computers: a mix of Windows NT, Windows 95, Macintosh, Solaris, and Linux. Everything talked to everything else. It rarely took me more than half an hour to get a new system operating well on the network. Today, I have a mixed network in my home. The Mac talks to the WinMe and WinXP systems. The WinXP system talks to the Mac, but not the WinMe system. I’ve put in hours trying to fix it, and talked to two consultants, who both told me you can’t make WinXP network reliably with WinMe (I don’t believe them, BTW). The systems in my bookstore (a WinXP and a Win2K) stop talking to each other about once a month, and I have to reboot the router and both systems. Have we fallen this far backward? Are wired networks flakier today than they were 10 years ago?[/li]
[li]When using Microsoft’s video player, the video frequently stops while the audio keeps playing, and the video catches up several seconds later. Wouldn’t you think audio/video synchronization would be old hat by now? We’d never tolerate that on a VCR or DVD player, although I’ve seen it happen on TiVo units sometimes.[/li]
[li]I got a watch with an altimeter as a gift. Readings on successive days from the same point vary by hundreds of feet (approaching 10% variance). What good is this?[/li]
[li]Interesting coincidence: while typing this message, I passed my mouse cursor over the buttons in the SDMB “guided mode” toolbar. The tool tip text popped up for the first button and froze. I had to remove the cursor from the toolbar and put it back on a different button, at which point that tool tip text froze. Isn’t this a pretty fundamental part of the UI to be so flaky?[/li]
[/ul]
Am I just grumpy today, or is this indicative of a trend towards embracing mediocrity?

I sympathize with your frustration. I don’t know if we are more accepting of this crap; I don’t think we have much choice, as corporations more and more have no regard or respect for customers. We pretty much have to take what is on the market, especially in the computer field.

There were always shoddy products on the market, but if the consumer chose to pay more, there were more high quality things available, or so it seems to me.

[QUOTE][li]I got a watch with an altimeter as a gift. Readings on successive days from the same point vary by hundreds of feet (approaching 10% variance). What good is this?[/li][/QUOTE]

These watches use barometric pressure to derive the altitude, so readings change as the pressure changes. You have to set the base altitude at a known site such as your home, or else find out what the barometric pressure is at the time and set the watch that way.

The manual for my Suunto watch explains how to do this, and it is very accurate. When I climb mountains with friends who have GPS, I am always within a few feet of what they have.

Possibly your manual does not explain this, but there still should be a way to zero in the altitude or pressure at the place where you want to start. If it is a stormy or changeable day, the pressure may well change hour by hour, thus throwing off the altitude figure.
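To put rough numbers on why weather throws these watches off, here is a small sketch (my own illustration, with made-up pressure values) using the standard-atmosphere formula that barometric altimeters are based on. A modest weather swing in the sea-level reference pressure moves the computed altitude by a couple hundred feet:

```python
# Sketch of how a barometric altimeter turns pressure into altitude,
# using the International Standard Atmosphere approximation.
# sea_level_hpa is the calibration value the watch stores: if the
# weather changes it by 10 hPa and you don't recalibrate, the
# altitude reading shifts accordingly.

def altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Altitude in metres for a measured pressure (hPa)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# Same measured pressure at the watch, two plausible calibrations.
calm = altitude_m(900.0)            # calibrated on a standard day
stormy = altitude_m(900.0, 1003.25) # after a 10 hPa pressure drop
# The two readings differ by roughly 80 m (about 270 ft) even though
# the wearer hasn't moved - which is the "hundreds of feet" variance
# described above.
print(round(calm), round(stormy))
```

Recalibrating at a known elevation, as the Suunto manual describes, just resets `sea_level_hpa` so the error starts from zero again.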

I will address this one:

I am an IT consultant as well.

You are criticizing technology as being worse now, but then you say that XP works and Windows ME is giving you problems. Windows ME is 6 years old and XP was its replacement. Windows ME sucked, and we say that all the time on these boards. Windows 95 really sucked, and I find it hard to believe that you think it was more stable than XP. Networking technology is light-years ahead of where it was 10 years ago, but we also put more demands on it than ever before.

You can find really cheap copies of Windows 95 and all that old networking equipment. Why don’t you buy that and give it a spin for a taste of the good old days?

Microsoft started taking desktop OS stability seriously with Windows 2000. That became XP, and they are taking it very seriously with Vista. They retired the long line of crap that ended with Windows ME because it was crap. XP is a very good and stable operating system for the vast majority of us.

Crappy consumer electronics have always been with us. In fact, the crappy ones used to cost more than today’s crappy ones do. There are quality versions of everything out there, but you have to do research, and you may end up paying more for a good DVD player than the $30 special at Wal-Mart. I will take a crappy DVD player over a crappy 1985-era VCR any day.

This is likely the result of changes in atmospheric pressure. Most altimeters don’t actually measure your height above sea level - instead, they measure the pressure exerted by the column of air above you. The result is that changes in weather produce changes in their altitude readings.

I can only say I disagree in part.
We bought a new digital camera to replace a five year old one that died. What an improvement at less than half the price of the original.
I entered the workforce in 1987, when computers were just becoming popular. Only in the last two years can I say that we now have a stable, usable system that is a joy to work with. If I ever have a complaint about what I have, I just think back to days not long ago when I spent most of my time on the computer cursing about lost files, slow internet, or frequent crashes.
I do agree in part that there are now so many different formats of data storage. A good example is the wide variety of music and picture file formats. Some programs work with some and not others. The newest craziness will be the competing high-definition DVD formats.

Nope. Didn’t say that. I said that a variety of systems all talked together 10 years ago, and today they don’t. WinXP’s networking is no better than WinMe’s, given the limited amount I ask them to do.

Unfortunately, I have some legacy software that doesn’t work on XP, and I don’t want to spend $5K to replace it. XP sucks less than ME sucks, but the point is that they both came from the same company and they don’t play together. I can make two ME computers talk to each other easily. I can make two XP computers talk to each other. The Mac will talk to both XP and ME. Why won’t XP talk to ME?

Didn’t say that, either. Windows 95 was a flaky miserable excuse for an operating environment, but when I set up a network it connected and worked. Maybe the OS crashed so often I didn’t see the network problems so much, but my Win95 networks back then worked better than a mixed WinMe/WinXP network today, and both of those are more modern systems than Win95.

XP is far more stable than previous versions of Windows. I never disagreed with that. Heck, my OP never even mentioned the stability of Windows. But purveyors of modern operating systems (I’ll pick on Apple as well as Microsoft here) seem to focus far more on making sure the product has the latest cute, fluffy features than in making sure everything actually works.

True. Unless you’re trying to apply this to operating systems. A copy of DOS used to cost me $30, which was under 2% of the cost of my computer. Now, Windows XP costs $200, which is close to 20% of the cost of the computer. You’re paying a lot more for a Microsoft operating system today (although Unix is a lot cheaper ;))

I get your point, Shagnasty. Really I do. But if you can point me to a DVR that never hangs, a Windows video player that doesn’t desynch audio from video, a linear gas gauge, and a video game where the programmers cared more about quality (e.g., eliminating artifacts and collision errors) than fluff (e.g., shadows and reflections on textured surfaces), then please do.

I guess, in a nutshell, I’m asking whether you agree that we have a higher tolerance for mediocrity than in the past.

I’ll agree with you on cameras. My only complaint with my new digital camera is the battery life.

“A joy to work with?” I still won’t go that far :wink: Take Web browsers, for example. I’d rather see half the features yanked out if every remaining feature actually worked correctly and consistently.

Moved to Great Debates.


[QUOTE][li]I played my son’s new Buffy the Vampire Slayer videogame with him for a while, and it is rife with glitches.[/li][/QUOTE]

I’ve been a video gamer since the dawn of personal computing and console gaming.

Games have always been glitchy. The quality control is generally abysmal, especially for PC games, which are frequently shipped before they’re finished.

It’s because of competition. Put briefly, people would rather have something that’s cheap than something that works all the time. Look at the early microwaves: they were built like tanks and would last 20-something years because they were made from quality components, but they cost thousands of dollars. Nowadays, microwaves are $50 a pop and will last a year or so, because every component has been shaved down to the bare minimum to keep costs down. Same with DVD players and DVR machines. Rather than take 2 years to make a quality game, game makers can make an alright one in 6 months.

The MS video player issue is something different. You’re running it in a multitasking environment, so it can’t always guarantee the processing cycles it needs. Either stop running tasks in the background or get a beefier computer.
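For what it’s worth, here’s a toy sketch of the general scheme most software players use (this is my own illustration, not Microsoft’s actual code): video is slaved to the audio clock, because audio dropouts are more jarring than dropped frames. When decoding falls behind, late frames get discarded and the picture “catches up” later, exactly the symptom described in the OP.

```python
# Hypothetical player loop: the audio clock keeps running, and any
# video frame whose decode finishes after its presentation time on
# that clock is simply dropped rather than shown late.

def present(frames, audio_clock, decode_late_by):
    """Return the presentation timestamps that actually reach the screen.

    frames:         list of frame timestamps (seconds)
    audio_clock:    current audio playback position (seconds)
    decode_late_by: how far behind real time the decoder is running
    """
    shown = []
    for pts in frames:
        ready_at = pts + decode_late_by  # when this frame finishes decoding
        if ready_at < audio_clock:
            continue                     # too late: drop it, audio plays on
        shown.append(pts)
    return shown

# With the decoder half a second behind, the earliest frame is dropped
# while the audio never pauses.
print(present([0.0, 0.2, 0.4, 0.6, 0.8], audio_clock=0.7, decode_late_by=0.5))
```

On a loaded machine `decode_late_by` keeps growing until the scheduler gives the player a burst of CPU time, which is why the freeze can last several seconds before the picture snaps back into sync.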

Plus, I think this is a lot of “good old days” selective memory. There was a lot of stuff that was horrible, techwise back then.

I believe that the OP has a point: we accept some bugs today that we would not have tolerated a few decades ago.

Cases in point:

  1. Just three days ago, the car that I was driving suddenly would not go into gear when I was restarting after the railway crossing barriers went up. Diagnostic message: failure in gear system (Smart Roadster, about 1 yr old). With a queue of other cars behind me, I had to push the car onto the sidewalk, and after the mechanic could not fix it either, it had to be taken to the garage.

I later got a call from the car sharing club: They fixed it by acknowledging the error log and resetting the CPU, afterwards the error did not occur again…

This is what we are used to in IT, but a decade ago we would not have accepted it as ‘these things happen’ in a car. Just for the benefit of needing no clutch pedal, they designed in a computer to control the gearbox and clutch, and that computer crashes sometimes, as we are conditioned to expect of computers…

  2. When starting life support (i.e. the coffeemaker) in our office in the morning, we have to wait for some time for it to boot (displaying operating system version, diagnostics, etc.). We wouldn’t have blithely accepted this a decade ago either (admittedly there were no compact coffeemakers then).

  3. With battery technology becoming ever more efficient, I’d expect a cell phone charge to last ever longer. Wrong - with every new cell phone I buy, a charge seems to last a bit less; they blow the advantage of better batteries on colour LCDs, CPUs fast enough to run Java, etc.

Hardware or software?

Hardware, I disagree. Generally speaking, almost every complex consumer product I use today does much more than in the past, and does it reliably. Ignoring the software defects (which I’ll get to), manufacturing processes have resulted in cheaper and more reliable components in just about everything: my hard drives are faster, hold more, and fail less. The tires on my car are a wonder of engineering compared to when my father was young. I have a car pushing 200,000 miles with an original clutch and a very short repair history. I don’t have any stats on this, but I suspect the “back in my day, X was built to last” thing is a bit of rose-colored glasses on the ol’ past.

However… Software? I absolutely agree. Ask almost any developer and they will probably agree that we are far, far more tolerant of software defects than we were in the past. There are some good books that talk about this phenomenon, but the short of it is that as long as software is market driven (which I don’t actually think is a bad thing, mind you), the emphasis will be on fixing the problems that affect users 95% of the time. That last 5% is the killer when software continues to increase in complexity: this is the “sometimes that happens” event that makes you reboot once every so often, and is so costly to track down and fix. The economics of fixing that last 5% rarely allow companies to spend time on it; so yes, in a very real sense, the market is progressively becoming more tolerant of mediocrity (as long as you define mediocrity by reliability, and not features).

Add to this the nature of software to be developed as components based on various other components and architectures (which are frequently maintained by completely different companies) and the number and scope of bugs increase. As the software layer cake gets higher, problems become increasingly subtle and impossible to diagnose. “My query analyzer just crashed. Is that a problem with the people who wrote the program, or is it a Java bug, or is it the particular JVM I’m using, or is it the database server itself, or is it some nasty bug in my operating system?” Every single one of those is a completely separate company, so when that happens, I just shrug and restart. The reason I’m tolerant of this is because these programs are pretty complex, and allow me to do things I wouldn’t otherwise (and if I could do them otherwise, I will, which is why vi is still my text editor, for example).

(Of course, this trend is a generalization. Individual programs improve over time: XP is more stable than 95, and newer versions of the Linux kernel are much more stable than old ones. Regardless, I endure software crashes all the time these days where I would not have been as tolerant before.)

Now, in my opinion, the kicker is not that software has been getting buggier. Most programmers have been saying this for years. The kicker is that in the past decade, computers with complex architectures like the kind used on PCs have become present in everything – and with them, all of the complex and subtle bugs that no one can bother to fix. What this means is that even though the trend for programs to become less reliable has been going on since the 70’s, no one other than us nerds noticed it. But then suddenly, recently, everything starts incorporating more complex computers and the general public gets ambushed by software bugs in almost everything they use: cars, TV’s, DVD players, hell – apparently coffee makers as well (which was news to me).

As to whether we should be more tolerant… I don’t know. I think that’s another debate that I’m probably not going to get involved with. A lot of very smart people have been trying to figure out this particular problem for a long time, and it’s a pretty heated argument.

1995? But yes, why would we need a coffeemaker that boots? “Coffeemaker” would seem to be a device that does not require that much processing power.

I am amenable to the explanation that a surge in apparent tolerance for crappy tech may be traced partly to devices becoming cheap disposable commodities on the low end (e.g. off-brand electronics), and overcomplicated on the high end (e.g. the BMW i-Drive system; what addled focus group said people wanted that?). And let’s face it: on the one hand, a majority of consumers will decide based on price and accept lesser quality and even discomfort for the sake of 10 bucks less (see: airline business); on the other, the high-end makers know they can make more money by up-selling you options you don’t need (see: bootable coffee machine) or simply hyping the product so far beyond its real performance that you can’t help but be disappointed.

Is it me, or is it that, one day, the World Computer Control Network will fail and the earth crash into the sun because of a tiny programming glitch from 1992 that was never quite eradicated, just incorporated into ever-more-complex systems?

One of the reasons I couldn’t keep quiet in my basic engineering courses in college was the push from the professor that making something better always meant making it cheaper. Having just come from an environment where certain components had been so over-engineered to be as close to never-fail as currently possible, it stuck in my craw. But if you look at the market - that does seem to be what most people want.

Products that I have never heard of a problem with, but are a bit more expensive, are constantly being hemmed in by competitors that are cheaper, in every sense of the word. To pick a relatively mundane example: the mini-maglite. I’ve never heard of anything but the bulb failing in one of these, but there are a multitude of crappy clones out there. And people seem to be perfectly willing to buy the clones, thinking they’ve saved $3, but expecting them to fail in a few years, when they’ll have to get another.

OTOH, there’s the alternate argument: While those of us quality freaks are running around with our old AT&T rotary phones that never, ever, ever died - everyone else has a 900MHz cordless phone that has more features, does more and is easier to use.

I no longer know what’s best. I just know what I’d prefer. (Looking at my 15 yo mini-Maglite, sitting on top of my HP Laserjet 4P printer.)

Because there are so few people like you, mixing them together, that it isn’t worthwhile for MS to make that work better.

MS can choose to invest their programmers’ time in making XP work better with the old versions of their OS (thus allowing customers to keep using the old versions and delay purchasing the new one), or they can invest it in improving the current version and working on the next one they’re going to sell. Which do you think they are going to spend money on?

Depends on what you define as “modern operating systems”, and what features you define as cute & fluffy.

Mainframe MVS has defined reliability & security as major features that must work. And they do. It’s common for the time between reboots on a Mainframe to be measured in weeks or months. I’ve worked at shops where they only rebooted the mainframe once a quarter. That’s not unusual. And security? Mainframe viruses and spyware are almost unheard of. And this on big systems, where massive amounts of really valuable data is held.

Good point. But it has little to do with quality of the OS. Microsoft’s monopoly and marketing decisions set the price, not quality.

But that’s just your own definition of what is quality vs. fluff. Clearly the market disagrees; most customers (teenage boys) prefer what you call fluff. So the managers tell the programmers to put their efforts into programming what the customer wants.

There’s a good example. I have a cordless phone on my desk at home, and another in the kitchen, but I keep a corded phone in the kitchen, too, because the cordless phones stop working in the event of a power failure (and cell phones don’t work here at all, so I can’t use a cell phone as a backup).

I think you’re absolutely correct on why things like this don’t work. But why do we put up with it? Just fatalism?

That was true of the IBM 370-195 we used in school in the 1970’s. It was true of the Vaxen and CDC mainframes I used in the 1980’s. It was true of the Solaris and Linux servers I used in the 1990’s. For the most part, it’s true of the WinXP and OSX I use today. I just wish it were true of my DVR and 3rd Gen iPod, both of which crash from time to time.

But core reliability isn’t what I’m talking about. I’m talking about glitches and bugs. Sometimes when I scroll vertically in Microsoft Word on WinXP, I’ll get a line of text displayed twice. The point-of-sale software at my bookstore sometimes fails to display items in the inventory list. When I scroll with a mouse wheel in Firefox on OSX, it chops scan lines off rows of text. These are all very fundamental problems that recur regularly, but they’re easy to fix by redisplaying the window, so it just doesn’t seem to bother people.

Excellent point. I concede that one. Although, just for the record, the average videogamer is not a teenager these days.

On the other hand, if Microsoft knows they have both (a) a monopoly and (b) user lock-in from network effects, there’s little incentive for them to improve the quality of their products.

I was using Microsoft Excel, part of the Microsoft Office 97 suite, on Microsoft Windows 2000, and I could not open an eighteen-row, two-column Microsoft Excel document. To rephrase my crazed IM rant to a friend: how the hell is this acceptable on a two-year-old computer? And beyond that, it’s not like I’m using some obscure setup that should rightfully be dragged out back and shot (though I’d argue that last bit) - I’m using the dominant OS and its integrated application.

So to answer the OP, yes.

Because, when you get right down to it, people as a whole love to say they want quality and are willing to pay for it, but when you examine their actual purchasing decisions, they pretty much only want cheap.

I think people are becoming somewhat more tolerant of tech issues, but I think the root cause of the issues is the speed of development, at least in the computer world. With computing power doubling every 18 months, that makes for some pretty short development cycles compared to some other industries. To keep up, companies have to push out new products at a rapid pace (well, except MS. I expect to install Vista about the time I retire, and that is a long ways off :slight_smile:). The rapid pace means that #1, your testing time is really short, and #2, you patch the flaws after the product has been released so you can beat your competitor to market.
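Just to put a number on that “doubling every 18 months” pace (a back-of-the-envelope of my own, not from the post above):

```python
# If computing power doubles every 18 months, the cumulative speedup
# over n years is 2 ** (12 * n / 18). Over a decade that's roughly a
# hundredfold - which is why development cycles, and testing windows,
# get squeezed so hard.

def speedup(years, doubling_months=18):
    return 2 ** (12 * years / doubling_months)

print(round(speedup(10)))  # one decade at Moore's-Law pace
```

A competitor shipping even six months earlier is targeting hardware roughly 26% slower, so the market pressure to release early (and patch later) is baked into that curve.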

For example, when I worked at AOL, they pushed back the release of version 4.0 (it might have been 5.0; I went through a couple of launches) by about 9 months, if I remember correctly - it might have been closer to a year. Each time they pushed back the release date, we got tons of calls from angry customers who were upset because they wanted the latest software NOW. The thing that many of the customers did not realize is that AOL was making the software much more stable. When the software was finally released, the number of tech calls dropped dramatically. The metric used (or at least the one I got to see all the time) was number of tech calls per million logins. When 4.0 was released, the number of tech calls coming in dropped by over 80%. Yes, 80% fewer calls for that version of the software. The software was MUCH better. Yet the customers were pissed because of the wait. What lesson is there in that if you are running the company? Release and patch. Or give out realistic release dates, but most companies can’t do that due to #2 listed above.

I think that at some point (when we hit the wall on Moore’s Law) we will see a switch from rapidly developed buggy software to longer development cycles and much more robust software.

For other products, the main reason I think we see less-than-stellar products is that many things are so much cheaper. The first microwave my parents bought, sometime in the early 70’s, lasted forever; it was built like a tank. At the same time, it cost a lot when adjusted for inflation - if I recall correctly, it was a major purchase back then. Now you can pick up a new microwave for next to nothing.