Monitor Regression: Technology or Marketing?

(I have a strange déjà vu feeling that I’ve already started this thread a long time ago, but I can’t find it.)

I’m an application support engineer, and in my line of work I primarily deal with digital camera software and hardware. Invariably I’m exposed to a lot of upcoming technology and, in general, trends in consumer electronics. I’m a bit cynical because I get to witness how a combination of marketing and consumer behavior trends affects technological advances (very often negatively, in my view).

However, I am entirely baffled by a technology that is not only not advancing, but actually seems to be regressing: Display devices. I just got a new monitor at work, and I’m once again reminded how disappointing new monitors actually are. The new display is a 19" diagonal LCD natively displaying … SXGA (1280x1024)? You’re bloody kidding me, right? This is under 100 PPI! This is also the middle of the year 2007!
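For anyone who wants to check that “under 100 PPI” claim, here is the arithmetic as a minimal Python sketch (assuming the 19" figure is the true viewable diagonal):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The new 19" SXGA panel from above:
print(round(ppi(1280, 1024, 19.0), 1))  # ~86.3 PPI -- yes, under 100
```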

My questions: Why is my monitor not 600 PPI yet? Fine, maybe the technology isn’t here yet or is too expensive, but then why is it regressing? Is it because of piss-poor software support for higher-PPI devices? Are people actually comfortable with the eye-burning pixel densities of modern displays? Is there a cartel of display manufacturers fixing prices and conspiring not to advance?

Densities of everything electronic have been increasing very, very quickly: general semiconductors, thin films, magnetic platters, CMOS and CCD imaging sensors, and the list goes on. The only two technologies that stick out like a sore thumb, at least to me, are displays and batteries. Batteries I understand – we’re hitting the natural limits of various chemistries to store electrical charge, nothing surprising. Displays I don’t.

Of course, CCDs and LCDs are vastly different technologies, and one is input while the other is output, but they both require a high-density, two-dimensional grid of individually accessible pixels (I’ll admit CCDs don’t have to be nearly as fast).

In the high end of CCDs we have things like the Kodak KAF-39000 professional CCD imaging sensor, packing a 3750 PPI matrix and delivering 39 megapixels for around $30K. Unfair? Well, how about the consumer-range Sony ICX452, a 5.1-megapixel CCD that manages an 8500 PPI matrix and is available in cameras retailing under $500?
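To put those numbers in more familiar units, here is a rough sketch converting the quoted PPI figures into pixel pitch (the pitches are derived purely from the PPI values above, not from datasheets):

```python
MICRONS_PER_INCH = 25_400  # 1 inch = 25.4 mm = 25,400 microns

def pitch_um(ppi):
    """Center-to-center pixel spacing in microns for a given linear PPI."""
    return MICRONS_PER_INCH / ppi

print(round(pitch_um(3750), 1))  # ~6.8 um per pixel (the $30K professional CCD)
print(round(pitch_um(8500), 1))  # ~3.0 um per pixel (the sub-$500 consumer CCD)
print(round(pitch_um(100)))      # 254 um per pixel (a typical ~100 PPI desktop LCD)
```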

I realize color LCDs need triple the horizontal pixel density because of the individual R, G, B subpixels, but even then: armed with $30K you can get a 39-megapixel CCD, yet good luck getting a 39-megapixel LCD that is not the size of a wall. Hell, I’ll settle for a 5.1-megapixel LCD for under $500 :rolleyes:
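And the “size of a wall” bit isn’t an exaggeration. Here’s a quick sketch of how big a 39-megapixel panel would be at today’s ~100 PPI versus the 600 PPI I’m asking for (assuming a 4:3 aspect ratio and ignoring the subpixel issue entirely):

```python
import math

def panel_size_in(megapixels, ppi, aspect=4 / 3):
    """Width, height, and diagonal (inches) of a panel with the given
    pixel count and pixel density, at the given aspect ratio."""
    width_px = math.sqrt(megapixels * 1e6 * aspect)
    height_px = width_px / aspect
    w, h = width_px / ppi, height_px / ppi
    return w, h, math.hypot(w, h)

for density in (100, 600):
    w, h, d = panel_size_in(39, density)
    print(f"{density} PPI: {w:.0f} x {h:.0f} inches ({d:.0f}\" diagonal)")
# 100 PPI: 72 x 54 inches (90" diagonal) -- a wall
# 600 PPI: 12 x 9 inches (15" diagonal)  -- a normal monitor
```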

Another aspect that confuses me is the regression itself. A friend of mine got a Compaq laptop around four years ago (I think), and it had an “amazing” 15" 1600x1200 panel (that’s almost 2 MP, at about 130 DPI :rolleyes:) in a laptop you could actually use on your lap. With the widescreen mania nowadays, 1600x1200 is almost impossible to come by in a laptop, and the best you can hope for is 1920x1200 in a “wide-load” 17" desktop replacement.

There was hope when IBM made the T220 in 2001 – a professional WQUXGA (3840x2400) 22" LCD for like $20K. Then ViewSonic chimed in with one under $8K. I was expecting such technology to keep appearing (as it usually does) in the professional range and then drop down to the consumer level within a few years. Nuh uh, says the market – both were discontinued, and right now nobody even makes such a high-PPI LCD monitor. Apple has some high-resolution displays, but they’re huge and low pixel density.

You can get a CRT with a slightly higher resolution than an LCD nowadays, but even then it’s nowhere near an order of magnitude better. LCDs are complex and must be virtually defect-free; I understand it’s not easy and it’s not cheap, but why isn’t it getting better? LCD, CRT, OLED, electronic paper, hell, anything; I’ll settle for a 15" mechanically switched monochrome DMD-like array if that’s what we have to do, but even if I had the money, it doesn’t seem to exist. So what the hell is up? I mean, it’s clearly a combination of technological limitations (read: increased cost) and consumer demand (read: nobody wants them at the price we can make them), but WHY? Do people not realize how much better a 600 PPI display would look compared to a 100 PPI one? Do professionals (CAD, design, publishing, photography, presentations, etc.) not realize how much better it would make things?

I was curious, so I looked up what monitors were capable of in the past. I was right. IBM had a CRT monitor under $800 in 1991 that could do 8-bit color XGA (1024x768) on a 14" viewable tube. The computer of the day was what, a 386DX at 16 MHz with 2 MB RAM and a 200 MB HD? :smack: We can pack four 64-bit cores running at over 2 GHz (with a vastly superior, even if bloated, architecture) into a single consumer CPU today, but we can’t even triple the density of consumer displays from 16 years ago?
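Running the same PPI arithmetic as before over the displays mentioned in this thread (quoted diagonals taken as viewable size) makes the regression pretty stark:

```python
import math

# PPI = diagonal pixel count / diagonal inches
print(round(math.hypot(1024, 768) / 14, 1))    # 1991 IBM CRT, 14" XGA:       ~91.4 PPI
print(round(math.hypot(1600, 1200) / 15, 1))   # old Compaq laptop, 15" UXGA: ~133.3 PPI
print(round(math.hypot(1280, 1024) / 19, 1))   # my new 2007 LCD, 19" SXGA:   ~86.3 PPI
print(round(3 * math.hypot(1024, 768) / 14))   # "triple 1991" would be ~274 PPI
```

The brand-new 2007 panel is actually lower density than the 1991 tube.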

Sincerely, if irate, Groman.

As a regular consumer, I’m going to guess that high-density displays are a niche product that never caught on. MS Windows and its applications have always had poor support for “large text” modes and vector-based interfaces, meaning that on a dense display certain programs’ text and controls end up too small. For general usage, PPI doesn’t seem to be a very big concern… if HDTVs are any indication, people generally seem more interested in larger displays than denser ones.

And as for photography and art, what advantage would a smaller display give you over a bigger one, assuming both are the same resolution? What’s wrong with doing it the way they have, increasing both size and resolution but keeping density about the same?

You say 1000 DPI vs 600 DPI, but what about 14" at 1000 DPI vs 28" at 600 DPI? Entirely different question then, and I suppose the market has chosen the latter.

There’s a fundamental limitation in the link standards. The DVI standard already has trouble keeping up with current resolutions at the high end (30" screens and the like). Anything beyond that, such as that IBM monitor you mentioned, requires ganging together multiple interfaces, or brand-new interfaces. Until there’s a critical mass of demand for this, I don’t see anyone doing it. There’s a reason that monitor had a very limited market, and it wasn’t just the price; it really was a piece of hardware for a dedicated workstation whose entire point was to drive that monitor. It wasn’t general-purpose hardware by any means.
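For the curious, here is a rough sanity check on that bandwidth point. Single-link DVI tops out at a 165 MHz pixel clock and dual-link roughly doubles that; I’m assuming a ballpark 5% blanking overhead, so treat the exact figures as approximations:

```python
SINGLE_LINK_MHZ = 165                 # single-link DVI pixel clock ceiling
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ   # dual-link roughly doubles it
BLANKING_OVERHEAD = 1.05              # assumed; real timings vary

def pixel_clock_mhz(w, h, refresh_hz):
    """Approximate pixel clock (MHz) needed to drive w x h at refresh_hz."""
    return w * h * refresh_hz * BLANKING_OVERHEAD / 1e6

print(round(pixel_clock_mhz(1280, 1024, 60)))  # ~83 MHz  -- easy on single-link
print(round(pixel_clock_mhz(2560, 1600, 60)))  # ~258 MHz -- a 30" panel already needs dual-link
print(round(pixel_clock_mhz(3840, 2400, 60)))  # ~581 MHz -- WQUXGA blows past even dual-link (330)
```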

I don’t think the market has chosen either. I’ll gladly take either 14" at 1000 PPI or 28" at 600 PPI. The problem is that the standard pixel density of display devices has not strayed far from the original ~70 PPI established by Apple when they started developing the Lisa in the late 1970s. Right now most displays are around 100 PPI, and 100 PPI is entirely too low.

A friend of mine once called me on this, and we cooked up the following test. He printed the same page of text with a random # code at the top of each copy: five copies at 300 DPI and five at 600 DPI. He wrote down which codes were which DPI, shuffled the pages, and handed them to me. I identified the 600 DPI copies with 100% accuracy. Of course that doesn’t prove 600 is better; it doesn’t really prove anything. But if there were no effective difference, people would print text at 100 DPI and be happy about it.