When did bad pixels become unacceptable?

I remember reading magazine reviews of LCD monitors and laptops in the 1990s where they would often mention that there were 2 or 3 dead or stuck pixels on the display, and it was clear that the manufacturer would not provide a replacement unless there were lots of bad pixels. Today nobody is going to put up with crap like that on a brand-new display. So when did we stop letting manufacturers get away with it? (An approximate date is fine; I’m not expecting “March 17th, 1998 at 4:05 PM”.) (Well, actually I am expecting somebody to post exactly that, so I’m being proactive.)

It’s still happening. I surfed around until I found a warranty for a TV currently on the market, and I found this on Sony’s website. It looks like a generic warranty for many/most/all of their televisions.

Wow, that’s hundreds of pixels.

Back in the day, Windows XP had bitmap graphics that were pretty mediocre, but there really wasn’t much better. Then the Macs came out with some really sharp graphics. Windows hurried to catch up and came out with Vista, which used vector graphics, which made its graphics comparable in quality to the Macs’. It seems to me that was the big turning point.

I’d have guessed that bad pixels were unacceptable starting from the day flat panels were invented.

Whether or not they were covered by warranty is a different question.

The current ISO standard for pixel defects is the ISO 9241-300 series (ISO 9241-307 in particular, which replaced the older ISO 13406-2). Wikipedia says most premium-level panel manufacturers have treated their panels as Class 0 (entirely defect-free) since 2010 or so, and most finished-product manufacturers using these panels warrant their TVs or displays at Class 1, which allows roughly one fully bright and one fully dark pixel (plus a couple of stuck sub-pixels) per million pixels.

Note that these are industry recommendations and guidelines (and you know what Captain Barbossa thought of guidelines), so manufacturers are free to set their own warranty terms.
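To put Class 1 in perspective, here’s a quick back-of-the-envelope script (a rough sketch in Python; the two-per-million figure is just the full-pixel allowance above, ignoring the sub-pixel defects):

    # Rough math: full-pixel defects that ISO 9241-307 Class 1 tolerates
    # (about one always-on plus one always-off pixel per million).
    resolutions = {"1024x768": (1024, 768), "1080p": (1920, 1080), "4K": (3840, 2160)}
    PER_MILLION = 2  # 1 bright + 1 dark full-pixel defect per million pixels

    for name, (w, h) in resolutions.items():
        pixels = w * h
        allowed = pixels / 1_000_000 * PER_MILLION
        print(f"{name}: {pixels:,} pixels -> up to ~{allowed:.0f} full-pixel defects")

So a 4K panel (about 8.3 million pixels) could ship with a dozen-plus bad full pixels and technically still meet Class 1, which makes the “hundreds of pixels” in that Sony warranty even more striking.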

I’ve got a $4000 CRT monitor on my desk. I paid $2000 for it second hand around the year 2000. It’s worth nothing now. I’ve got a $10,000 paper-white CRT monitor in the garage, for doing page layout under MS-DOS, from the days when IBM PC compatibles were more easily expandable with better hardware than Apple equipment. It was worth nothing even 30 years ago. And I just threw out a 15-20 year old monitor that cost only $2000 when new.

From biggest to smallest was also from oldest to newest, $10,000 down to $2000, but it’s only in the last couple of years that I could get an LCD monitor as good as that $2000 monitor for a reasonable cost.

Good LCD monitors used to be so difficult to make that companies couldn’t afford to replace the ones with a few bad pixels. The reject rate and subsequent failure rate were high, and superior CRT technology put an upper limit on what they could charge.

What happened was that LCD technology improved and costs dropped. For desktop monitors, zero pixel errors became required when people first started buying LCD monitors for personal use. Before that, LCD monitors went into city offices where desk space was limited, or into laptops, and people took what they could get.

I haven’t encountered dead or stuck pixels in a long time. I think an old monitor downstairs, one of the early flat panels, may have developed one or two, but that’s about it. I believe some of Dell’s premium monitors, like the UltraSharp IPS monitor I’m using now, come with a zero-defective-pixel guarantee. One good thing about the new high-resolution monitors and tablets, and especially 4K TVs, is that the pixels are so tiny that a dead or stuck one is practically invisible. My own Sony TV is only 1080p and fairly large, so if you’re very close to it (I mean within inches) you can clearly see the pixels. I was very relieved when I first set it up to see that they were all good.

Exactly, but they do still exist. Frankly it’s amazing that high-resolution monitors are so good (4K is over eight million pixels, after all, compared to only 786,432 in the days when 1024x768 was standard). Back then Apple had a policy whereby a certain number of dead or stuck pixels was considered acceptable and not worthy of warranty service. However, pixels clustered near each other and/or near the middle of the screen were more likely to warrant service than ones scattered around the edges.

It seems that most name-brand monitors nowadays will have zero defects, since production yields have gotten so much better. The panels with defects just get binned into lower-tier products (such as TVs versus computer monitors) or sold off to no-name budget brands distributed on Amazon or AliExpress. Just recently Linus Tech Tips reviewed a cheap 4K monitor and actually found several dead pixels, but they had to look very closely, against a bright yellow background, to spot them.

Heck, I’m old enough that I still have 640x480 stuck in my head as “standard” and 800x600 as “high resolution”. Never mind that that’s about the size of a single icon nowadays.

IMO it depends on where you live, what your local consumer protection regulations are, and whether you can return things that are “not faulty”.

I remember a friend of mine ordering an expensive LCD TV in IIRC the late nineties and the supplier tried the “less than 5 dead pixels is perfectly acceptable so you’ll have to live with it” line. When he responded with “in that case I exercise my 14-day right under the Distance Selling Regulations, I’ll send it back and fuck you very much” they changed their tune to “how about we just exchange the TV for one with no dead pixels so we can keep your money?”.

This is utterly false.

This thread reminded me that even though I can’t remember seeing a bad pixel in 10 years, I still have a PowerPoint file on my flash drive, based on an old utility I used for troubleshooting, that just cycles the entire screen through red, green, blue, white, and black to spot bad pixels. (I think the original utility claimed that letting it run could sometimes get “stuck” pixels to function correctly again, and possibly even lessen damage due to screen burn-in when used on CRT monitors.)
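If anyone wants the same thing without digging up a PowerPoint file, here’s a minimal sketch of such a color-cycling tester in Python, using only the standard library’s tkinter (assuming a normal desktop session where a window can go fullscreen; press Esc to exit):

    # Minimal dead/stuck pixel tester: fills the screen with solid colors,
    # cycling every two seconds. Press Esc to exit.
    import tkinter as tk

    COLORS = ["red", "green", "blue", "white", "black"]
    INTERVAL_MS = 2000  # how long each color stays on screen

    root = tk.Tk()
    root.attributes("-fullscreen", True)
    root.config(cursor="none")  # hide the pointer so it doesn't cover a pixel

    def cycle(i=0):
        root.configure(bg=COLORS[i % len(COLORS)])
        root.after(INTERVAL_MS, cycle, i + 1)  # schedule the next color

    root.bind("<Escape>", lambda event: root.destroy())
    cycle()
    root.mainloop()

Whether flashing colors can actually un-stick a pixel is debatable, but it does make dead and stuck ones easy to spot.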

It’s not just wrong, it’s borderline incoherent, and it’s also totally unrelated to how screens are manufactured. It’s not like display pixels get stuck based on the capabilities of the OS driving the image.
