(I have a strange “deja vu” feeling that I’ve already started this thread a long time ago, but I can’t find it)
I’m an application support engineer, and in my line of work I primarily deal with digital camera software and hardware. Invariably I am exposed to a lot of upcoming technology and, in general, trends in consumer electronics. I am a bit cynical because I get to witness how a combination of marketing and consumer behavior trends affects technological advances (very often negatively, in my view).
However, I am entirely baffled by a technology that is not only not advancing, but actually seems to be regressing: Display devices. I just got a new monitor at work, and I’m once again reminded how disappointing new monitors actually are. The new display is a 19" diagonal LCD natively displaying … SXGA (1280x1024)? You’re bloody kidding me, right? This is under 100 PPI! This is also the middle of the year 2007!
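For anyone who wants to check my math, here’s the back-of-the-envelope PPI calculation (a rough sketch, assuming square pixels and that the quoted diagonal is the viewable area):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Linear pixel density of a display, assuming square pixels."""
    # diagonal length in pixels divided by diagonal length in inches
    return math.hypot(h_px, v_px) / diagonal_in

print(round(ppi(1280, 1024, 19), 1))  # the new 19" SXGA panel: ~86.3 PPI
```

86 PPI. Not even close to 100.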
My questions: Why is my monitor not 600 PPI yet? Fine, maybe the technology isn’t here yet, or it’s too expensive; but why is it regressing? Is it because of piss-poor software support for higher PPI devices? Are people actually comfortable with the eye-burning pixel densities of modern displays? Is there a cartel of display manufacturers fixing prices and conspiring not to advance?
Densities of everything electronic have been increasing very, very quickly: general semiconductors, thin films, magnetic platters, CMOS and CCD imaging sensors, and the list goes on. The only two technologies that stick out like a sore thumb, at least to me, are displays and batteries. Batteries I understand – we’re hitting the natural limits of various chemistries to store electrical charge, nothing surprising. Displays, I don’t.
Of course, CCD and LCD are vastly different technologies, and one is input while the other is output, but they both require a high density two-dimensional grid of individually accessible pixels (I’ll admit CCDs don’t have to be nearly as fast).
In the high end of CCDs we have things like the Kodak KAF-39000 professional CCD imaging sensor packing away a 3750 PPI matrix and delivering 39 megapixels for around $30K. Unfair comparison? Well, how about the consumer-range Sony ICX452, a 5.1 megapixel CCD that manages an 8500 PPI matrix and is available in cameras retailing under $500?
I realize color LCDs need triple the horizontal pixel density because of individual R, G, B subpixels, but even so: armed with $30K you can get a 39 megapixel CCD, yet good luck getting a 39 megapixel LCD that is not the size of a wall. Hell, I’ll settle for a 5.1 megapixel LCD for under $500 :rolleyes:
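Just for kicks, here’s what it would take to actually hit 600 PPI on my 19-incher (a rough sketch; I’m assuming a 5:4 aspect ratio, square pixels, and not counting subpixels):

```python
import math

def pixels_needed(diag_in, aspect_w, aspect_h, target_ppi):
    """Resolution required to reach target_ppi on a given diagonal."""
    diag_px = diag_in * target_ppi         # diagonal length in pixels
    unit = math.hypot(aspect_w, aspect_h)  # diagonal of the aspect-ratio triangle
    return (round(diag_px * aspect_w / unit),
            round(diag_px * aspect_h / unit))

w, h = pixels_needed(19, 5, 4, 600)
print(w, h, round(w * h / 1e6))  # roughly a 63 megapixel panel
```

That’s about a 63 megapixel panel, and with the RGB subpixel tripling you’d need roughly 26,700 addressable subpixel columns per row. Hard, sure, but so is a 39 megapixel CCD.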
Another aspect that confuses me is the regression itself. A friend of mine got a Compaq laptop around four years ago (I think), and it had an “amazing” 15" 1600x1200 screen (that’s almost 2MP, and like 130 DPI :rolleyes:) in a laptop you could actually use on your lap. With the widescreen mania nowadays, 1600x1200 is almost impossible to come by in a laptop, and the best you can hope for is 1920x1200 in a “wide-load” 17" desktop replacement.
There was hope when IBM made the T220 in 2001: a professional WQUXGA (3840x2400) 22" LCD for around $20K. Then ViewSonic chimed in with one under $8K. I was expecting such technology to keep appearing (as it usually does) in the professional range and then drop down to the consumer level within a few years. Nuh uh, says the market: both were discontinued, and right now nobody even makes such a high PPI LCD monitor. Apple has some high resolution displays, but they’re huge and low pixel density.
You can get a CRT with a slightly higher resolution than an LCD nowadays, but even then it’s nowhere near an order of magnitude better. LCDs are complex and must be virtually defect free; I understand it’s not easy and it’s not cheap, but why isn’t it getting better? LCD, CRT, OLED, electronic paper, hell, anything. I’ll settle for a 15" mechanically switched monochrome DMD-like array if that’s what we have to do, but even if I had the money, it doesn’t seem to exist. So what the hell is up? I mean, it’s clearly a combination of technological limitations (read: increased cost) and consumer demand (read: nobody wants them at the price we can make them), but WHY? Do people not realize how much better a 600 PPI display would look compared to a 100 PPI one? Do professionals (CAD, design, publishing, photography, presentations, etc.) not realize how much better it would make things?
I was curious, so I looked up what monitors were capable of in the past. I was right. IBM had a CRT monitor under $800 in 1991 that could do 8-bit color XGA (1024x768) on a 14" viewable tube. The computer of the day was what, a 386DX at 16MHz with 2MB RAM and a 200MB HD? :smack: We can pack four 64-bit cores running at over 2GHz (with a vastly superior, even if bloated, architecture) into a single consumer CPU today, but we can’t even triple the density of consumer displays from 16 years ago?
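And here’s the kicker, if you run the density numbers on that 1991 CRT against my brand new LCD (same square-pixel assumption as before, taking the 14" as the viewable diagonal):

```python
import math

def ppi(h_px, v_px, diag_in):
    """Linear pixel density, assuming square pixels."""
    return math.hypot(h_px, v_px) / diag_in

crt_1991 = ppi(1024, 768, 14)   # the 1991 IBM CRT, 14" viewable
lcd_2007 = ppi(1280, 1024, 19)  # my brand new 2007 work monitor
print(round(crt_1991, 1), round(lcd_2007, 1))  # ~91.4 vs ~86.3
```

The 16-year-old CRT actually beats the 2007 LCD on pixel density. Not “only tripled”, not “stagnated” – regressed.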
Sincerely, if irate, Groman.