Recently I got my first LCD monitor for my PC and it was horrible - absolutely unusable. Then I learned that the resolution I had set in my Windows Desktop Properties had to match the inherent resolution of the LCD panel, or it would look like hell or even just display "Resolution Not Available."
Since CRTs don't have a single inherent resolution (dot pitch is a different issue involving the three electron guns for color, but those RGB dots aren't aligned with the pixels a given mode implies), you get to choose a resolution according to your personal taste.
But with LCD panels it seems you have to match the resolution setting to the LCD you own. If you want bigger icons or fonts, you have to change them one at a time in whichever application manages them.
So - is there some major change afoot to provide separate control over sizing, now that resolution, which used to be an adjustable-preference sort of thing, is becoming a hardware compatibility requirement instead?
Also, although the LCD has specific pixels, and the computer does too, the standard 15-pin cable connecting the two is analog, and it appears there are weird issues (phase, clock, etc.) about going back and forth. Does the new DVI interconnect method just fix this altogether, or are there also weird issues with that? Is DVI tied up with the apparently-LCD future?
This doesn’t really contribute much (sorry), but it’s an interesting factoid related to your LCD monitor problems:
Back in the old days of the Macintosh, before monitors and video cards were the industry-standard kind that exist now, monitors were sold by size: 9", 12", 15", and so on. But all Apple displays were 72 dpi and they represented "real world" size, i.e., an Aldus PageMaker document viewed at 100% would be the same size (mostly) on screen as on paper. There was no such thing as switching resolutions (although some monitors swiveled to change orientation!). If you had a bigger monitor, you just automatically got a bigger view of what you were looking at. In those days, if a program's requirements list said "minimum 12" monitor" it really, really meant it; it wouldn't fit on the smaller screen.
LCDs will stay at their native resolution unless some radical new technology comes along. On the minus side, you only get one resolution to play with; on the plus side, you get an image which is hugely sharper and more defined.
There is work on making resolutions other than the native one appear better, but the native one will ALWAYS be better.
Generally, for things like text it is very noticeable, but for things like games and images you don't tend to notice it much. This is good, since you rarely ever change the resolution of your desktop once you have set it.
As for DVI: yes, definitely get DVI. It's a world of difference. Once you set up DVI it feels like you've been short-sighted for half your life and you just got glasses.
I once had a cheapo Acer notebook – when it changed resolutions, it left a black border around the screen. Not bad, and the text was legible. Except the whole screen wasn't used.
Next I had a cheapo CTX notebook – when it changed resolutions, it made a horrible mess and tried scaling everything without any kind of interpolation.
Next I had a top-of-the-line PowerBook – when it changed resolutions, it was obviously scaled but very well done. Mac OS 9 took LCDs into account whenever it drew to the screen. It worked superbly. I've never tried it with Mac OS X – I assume it works okay (StarCraft looks okay at 800x600, but it's not heavy into text).
I've been considering a 20" Cinema Display for my QuickSilver (using a 19" ViewSonic CRT now), but haven't given any thought to how modern LCDs change resolution – does the Mac know there's an LCD and perform the PowerBook-style scaling? Or is it crappy, non-interpolated scaling? Or does the LCD hardware (not being analog) have the smarts to make it look nice?
LCDs are digital. In order to use an analog signal, it has to be run through an Analog to Digital Converter, which is really quite icky. It’s MUCH better as far as clarity and performance to use a DVI connection, since you avoid two conversion steps (Digital->Analog->Digital). There’s really no way around the fact that LCDs have a native resolution; the only hack available is putting a resize filter in either the videocard or LCD to alter the image. The best option is to simply keep everything at the native resolution and play with size settings as needed.
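If you want a feel for why that resize filter can never look as good as native, here's a toy sketch (Python/NumPy, purely illustrative - the numbers and the simple filters are made up, not how any real scaler chip works) of what happens when a one-pixel-wide line is stretched onto a panel with a different number of physical pixels:
[code]
# Toy illustration of why non-native resolutions look soft on an LCD.
# Hypothetical 8-pixel source scanline containing a 1-pixel-wide black
# line, stretched onto 10 physical panel pixels two simple ways.
import numpy as np

src = np.array([1, 1, 1, 0, 1, 1, 1, 1], dtype=float)  # 0 = black line
native = len(src)
panel = 10                                              # physical pixels

# Map each physical pixel back to a position in the source scanline.
x = np.linspace(0, native - 1, panel)

nearest = src[np.round(x).astype(int)]                  # duplicate pixels
linear = np.interp(x, np.arange(native), src)           # blend neighbours

print("nearest:", np.round(nearest, 2))
print("linear :", np.round(linear, 2))
[/code]
With nearest-neighbour, some source pixels get doubled and others don't, so line widths and spacing come out uneven; with linear blending, the crisp black line turns into a smear of grey across its neighbours. Either way the panel is inventing pixels that aren't in the source, which is exactly the softness people complain about.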
DVI itself is being replaced soon by HDMI (High Definition Multimedia Interface) because DVI can’t send data fast enough for large displays or HDTV.
handy: Yes. The DVI connection technology does not have enough bandwidth to send high resolution video, such as is needed to drive a large HDTV display, at full framerates. Some displays get around this by using multiple DVI connectors to increase the bandwidth, but on some widescreen displays, even this doesn’t cut it. Hence HDMI.
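To put rough numbers on it (assuming the usual 165 MHz pixel-clock ceiling per DVI link and a ballpark 25% blanking overhead - the exact overhead depends on the timing standard, so treat this as back-of-the-envelope only):
[code]
# Back-of-the-envelope look at which modes fit in one DVI link.
SINGLE_LINK_MHZ = 165.0     # pixel-clock limit of a single DVI link
BLANKING_OVERHEAD = 1.25    # assumed allowance for blanking intervals

def pixel_clock_mhz(width, height, refresh_hz=60):
    """Approximate pixel clock (MHz) needed to drive a given mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h in [(1024, 768), (1600, 1200), (1920, 1080), (2560, 1600)]:
    clock = pixel_clock_mhz(w, h)
    link = "single-link" if clock <= SINGLE_LINK_MHZ else "dual-link or better"
    print(f"{w}x{h}@60Hz needs ~{clock:.0f} MHz -> {link}")
[/code]
Once a mode needs more than one link can carry, you either gang links together (as those multi-connector displays do) or move to a faster interface.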
Does this apply to laptops as well though? The text is slightly blurry at lower resolutions, but it’s hardly noticeable. The only real limit is that I can’t go above a certain resolution, I think 1024x768. It seems there shouldn’t be any difference between these and stand-alone LCDs…
I know nobody asked, but since we’re on the subject of LCD panels…
Now for my favorite feature of Windows XP: ClearType
From Windows Help
[quote]To use ClearType for screen fonts
[ul]
[li]Open Display in Control Panel.[/li]
[li]On the Appearance tab, click Effects.[/li]
[li]In the Effects dialog box, select the "Use the following method to smooth edges of screen fonts" check box.[/li]
[li]Click ClearType in the list.[/li]
[/ul][/quote]
I know that Microsoft didn’t invent it, but it is truly the best.
If you have XP and an LCD monitor, you must enable ClearType. Once you do, you will see an immediate increase in the legibility of fonts, especially italics (of course, you must have your resolution set to the monitor’s native resolution).
For those who don’t have an LCD monitor, go to Steve Gibson’s site to get a really cool little demo that lets you see what subpixel rendering is all about.
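If you'd rather poke at the idea in code than in a demo, here's a minimal sketch of what subpixel rendering does (this assumes an RGB-striped panel and uses made-up coverage numbers; real ClearType also filters across neighbouring subpixels to tame the colour fringing):
[code]
# Sketch of the subpixel-rendering idea: sample a glyph at 3x the
# horizontal resolution, then drive each pixel's R, G and B stripes
# as three separate sub-pixels instead of averaging them to grey.
import numpy as np

# Hypothetical glyph edge sampled at 3x resolution (1.0 = ink),
# covering four physical pixels.
coverage_3x = np.array([0.0, 0.0, 0.2, 0.9, 1.0, 1.0,
                        1.0, 0.6, 0.1, 0.0, 0.0, 0.0])

# Ordinary antialiasing: average each triple into one grey level.
grey = 1.0 - coverage_3x.reshape(-1, 3).mean(axis=1)

# Subpixel rendering: each sample lights one colour stripe directly,
# so edges land with 1/3-pixel precision on an RGB-striped LCD.
rgb = 1.0 - coverage_3x.reshape(-1, 3)

print("grey per pixel :", np.round(grey, 2))
print("R,G,B per pixel:", np.round(rgb, 2))
[/code]
That 1/3-pixel edge placement is where the extra sharpness comes from - and also why a panel whose stripes run in a different order (or a CRT, which has no fixed stripes at all) can show colour fringes instead.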
hobbes730: An LCD is an LCD. The display in your laptop should be using a native digital connection, though it's almost certainly a proprietary one. Just like other LCDs, you'll need to run it at its native resolution, or face quality loss.
The Cleartype antialiasing technology that minor7flat5 noted can be used to attain a quality improvement on CRT as well as LCD monitors. It can cause some text blurring, but many people, including myself, prefer the look of Cleartype antialiased text even on a normal CRT.
While on the topic of sub-pixel rendering, this page explains how to enable it on Linux. On most systems, all you have to do is edit the /etc/fonts/fonts.conf file. I love it - it makes a huge difference.
ClearType is not so clear-cut as that (no pun intended). Some people I know hate ClearType and prefer it off on their LCDs. Some love it and have it ON on their CRTs. One person I know keeps it OFF on his LCD but ON on his CRT…
Laptop LCDs all use proprietary connectors. Some are simply pin-compatible with DVI and some are really different. My current laptop LCD runs at 1400x1050, which is beyond the DVI spec.
I haven't quite figured out exactly where the interpolation is done. There were comments when ATI released the Radeon 9000 that it dramatically improved the interpolation quality at low resolutions. However, it seems desktop LCDs do all of the interpolation within the LCD itself.
Hmm, I have a 20" LCD monitor connected via DVI. Darned sharp display (and if only LCDs didn’t suck at color matching I’d be in heaven).
I just tried ClearType for the first time after reading minor7flat5's post, and I have to say: yuck. ClearType may work for some, but I could feel a headache starting just from looking at that semi-blurry type. It's off again, thank goodness.
astro: It's a matter of opinion. The antialiasing effect of Cleartype on a CRT looks really good, IMHO, especially after I got used to it.
squeegee: Cleartype may not be set up properly. As the site that minor7flat5 mentioned noted, if the subpixels in your LCD are arranged differently than normal, Cleartype will make the picture look substantially worse.
I wouldn't say substantially. I have it enabled on my CRT and while it does create a slight color fringing, it doesn't bother me. It looks just as good as regular anti-aliasing. (I have a dual-monitor setup with a CRT and an LCD, and I can't turn it on on just one of the monitors. Having Cleartype on both is much better than having it on neither.)