# What happens when you change your computer's resolution?

My highest setting is 1920 by 1200 pixels. Is that the total number of actual physical pixels I have? (I think my screen is 17 inches.) The next setting is 1680 by 1050. When I go to this resolution (obviously the number of physical pixels stays the same), does the screen just have copies of the other pixels? How does this work?

What type of monitor do you have? Is it a CRT, LCD, or something else?

It's an LCD.

Then yes, that's the native resolution and the best one to use.

1920x1200 would have to be a monster of a CRT.
Wait a minute - 17 inches? Either your monitor isn’t 1920x1200 or it isn’t 17 inches, or it’s very expensive. Typically a 22 inch monitor will be 1680x1050 and a 24 inch will be 1920x1200.

When you select a lower screen resolution, the screen physically stays the same, as you say. Screens usually have a mode to either stretch the image or just use the center of the screen, with a 1:1 pixel-to-pixel mapping. When stretching, the values of the physical pixels are determined by interpolating the pixels of the smaller rendered image.
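In rough terms, the stretch-with-interpolation idea looks something like this. This is a one-dimensional sketch, not any monitor's actual firmware; the function name and the tiny example row are made up for illustration:

```python
# Sketch of linear interpolation: stretching a short rendered row of
# grayscale values across a larger number of physical pixels. Each
# physical pixel samples a fractional position in the source row and
# blends the two nearest source pixels.

def stretch_row(src, dst_len):
    """Resample a row of grayscale values to dst_len pixels (linear)."""
    src_len = len(src)
    out = []
    for i in range(dst_len):
        # Map physical pixel i back to a fractional position in the source.
        pos = i * (src_len - 1) / (dst_len - 1)
        left = int(pos)
        right = min(left + 1, src_len - 1)
        frac = pos - left
        # Blend the two neighboring source pixels by distance.
        out.append(src[left] * (1 - frac) + src[right] * frac)
    return out

row = [0, 100, 200, 100, 0]   # tiny stand-in for a 1680-pixel rendered row
print(stretch_row(row, 9))    # 9 "physical" pixels: in-between values appear
```

The in-between physical pixels get values halfway between their neighbors, which is why a stretched desktop looks slightly soft rather than blocky.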

Right, I know that's the one to use, but what happens (behind the scenes) when I lower the resolution? Wouldn't there be pixels that aren't being used? How are these filled?

This is the question I perceived you to be asking in your OP. The computer generates fewer pixels, so there are no 'unused' pixels as far as your computer is concerned. I am not sure how LCD monitors deal with this, but I assume they stretch the lower-res image to fit the screen's physical dimensions.

There's a built-in algorithm that interpolates them to look reasonably close to the neighbouring pixels, but don't ask me how it works in detail.

Yeah, just like your TV or DVD player "upscales" a DVD, your monitor "upscales" the lower-resolution content. I believe they use either the same or a similar process of image interpolation. Essentially, pixels are stretched out and an algorithm is used to fill in the missing data.

Actually, now that I think about it, I might be wrong. Interpolation works for video, but I'm not sure it would work for the exacting needs of a PC desktop. You might just be seeing pixels being stretched out. Hopefully someone else can educate us.

It's some kind of rendering or resampling process, that's for sure - it might even be that different manufacturers implement it differently (e.g. bilinear vs. bicubic or something).

It used to look terrible when monitors were 800x600 and you rendered a 640x480 desktop on them, but now that native resolutions have increased to much greater sizes, it seems to be more forgiving.

The monitor has built-in firmware to resample the input to whatever its native resolution is. In some cases, such as displaying 640x480 on a 1920x1200 panel, it won't even try; it just centers the 640x480 image in the screen.

The good ones use better routines that make for a smoother resize. Cheap ones and old ones just drop pixels or average the pixels to either side of the one to be filled in. That’s why old laptops looked really nasty when they weren’t at their native resolution.
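The "drop or duplicate pixels" approach is essentially nearest-neighbor sampling. A quick sketch (a made-up helper, not anyone's actual firmware) of why a non-integer ratio looks uneven with it:

```python
# Nearest-neighbor resampling: each physical pixel just copies the closest
# source pixel. At a non-integer ratio, some source pixels get duplicated
# and others don't, producing the uneven, blocky look of cheap scalers.

def nearest_row(src, dst_len):
    """Resample a row by picking the nearest source pixel for each slot."""
    return [src[i * len(src) // dst_len] for i in range(dst_len)]

row = [10, 20, 30, 40, 50, 60, 70]   # 7 source pixels
print(nearest_row(row, 8))           # 8 physical pixels: one value repeats
```

Notice that exactly one value gets doubled while the rest stay single-width; spread across a whole screen, that irregular duplication is what made old laptop panels look so nasty off their native resolution.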

From what I understand, interpolation and dithering are factors in down-scaling images. That’s basically what’s happening on your display when you either change resolutions or color palettes.

The 22in or 24in is measured diagonally. But in actual dimensions, it’s around 17 inches horizontally, and 12 inches vertically.

Those would be monitors in the neighborhood of 90-95 dpi. My iMac with a 24" LCD is 1920x1200, and it's the perfect screen with the perfect DPI. My work laptop, though, is a crappy Dell with a 17" LCD at a 1920x1200 resolution.

Stupid, stupid design. The DPI is too high to see most of the text on the screen. I'm forever juggling the Windows interface sizes, the Windows DPI settings, and document magnifications when I use the built-in screen. Windows is not resolution-independent, and the whole experience sucks (I'm guessing the high-resolution MacBook owners have the same problem). Luckily my Unix monitor is a proper resolution and size, and the Sun keyboard works perfectly, so I get by.

Actually, OS X is making steps toward resolution independence. It's one of the things they'll probably have fully implemented in the next version; for right now they include huge icon sizes that are visible on higher-resolution screens, along with some automatically scaled interface elements. You've always been able to change the default display font size. Some third-party vendors don't do big icons, but Mac users are picky, and unless the program does something that can't be duplicated by a competitor with a better interface, they'll probably shun the maker.

According to the linked entry at Wikipedia, Vista is supposed to support resolution independence too.

Thanks for that link, because I think it illustrates something OTHER than what the OP is asking. In that link, each picture has 4 times as many pixels as the previous picture, each pixel now being half the size both vertically and horizontally.

How does that work when the resolution is not being changed by an integer amount? The OP’s example was going from 1920 by 1200 pixels to 1680 by 1050. Especially on an LCD! What’s happening physically on the screen?

There are only two answers I can think of, and both of them are ridiculous. The first is that the pixels are actually getting larger. That's quite possible in a CRT, but how could the diodes of an LCD physically change in size?

The other possibility is that his screen is actually made of 13440 diodes across by 8400 vertically, and each pixel is either 7×7 diodes (for 1920×1200 resolution) or 8×8 diodes (for 1680×1050). Could that be it?
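For what it's worth, the common-multiple arithmetic behind that guess can be checked quickly; this only verifies the numbers, not the hypothesis itself:

```python
import math

# Smallest grid that divides evenly into both 1920x1200 and 1680x1050,
# i.e. the least common multiple of each dimension pair.
lcm_w = math.lcm(1920, 1680)
lcm_h = math.lcm(1200, 1050)

print(lcm_w, lcm_h)                  # 13440 8400
print(lcm_w // 1920, lcm_h // 1200)  # 7 7  -> 7x7 blocks at 1920x1200
print(lcm_w // 1680, lcm_h // 1050)  # 8 8  -> 8x8 blocks at 1680x1050
```

So a 13440×8400 grid would indeed support both modes with whole blocks, which is what makes the guess tempting.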

No, both of those are impossible. The physical resolution of a display has a maximum size that is usually referred to as the native resolution. This is the number of pixels horizontally and vertically that the screen is able to display. That's all you get. The pixels themselves do not change size. A CRT is no more capable of changing its maximum display resolution than an LCD is. CRT specs typically express this limit as "dot pitch," the spacing between adjacent phosphor dots of the same color.

The way resolution is normally expressed is not particularly helpful in determining how nice the display will look unless you also know the physical dimensions of the display, since together they tell you the pixel density. More pixels per inch (PPI) will give you a clearer, smoother image and more readable text. The iPhone, for example, has almost twice the display density of my laptop monitor. And believe me, it's quite noticeable; text sizes that would be unreadable blurs on my computer are clear on that.
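Pixel density is easy to compute yourself from the resolution and the diagonal size. A quick sketch of the standard formula (the specific sizes below are just the ones mentioned earlier in the thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# Same 1920x1200 grid, very different densities:
print(round(ppi(1920, 1200, 24)))   # ~94 ppi on a 24" monitor
print(round(ppi(1920, 1200, 17)))   # ~133 ppi on a 17" laptop panel
```

That roughly 40% jump in density is why the same interface elements shrink so dramatically on the 17" panel.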

When you change monitor resolutions, the program that controls the display fakes it through a combination of what I cited before: dithering and interpolation. Scaling an image is almost exactly the same process as scaling a display, but has to be done on the fly.

If you’re really curious about how it looks, look at your display with a magnifying glass. You’ll be able to see the individual pixels. A quick and dirty way of doing this is to carefully put a drop of water on it, which has the same magnifying effect. Change the resolution and see what happens.