Monitor Displays Every Possible Combination Of Pixels (And Thus Images Of Everything)

In a recent thread, someone noted that a computer monitor programmed to display random combinations of pixels will eventually produce an image of absolutely everything. So, just by chance, the monitor will reproduce the Mona Lisa, display an image of George and Martha Washington on their wedding day, and even produce a picture of me sitting at my computer writing this message. This all seems plausible and unbelievable at the same time.

Although there are a lot of possible pixel combinations, the number is not infinite. However, it seems to me that the number of possible images is infinite.

Consider taking a photo of a bunch of bananas suspended from the ceiling. The camera can move around the bananas 360° horizontally and 360° vertically, taking a different image from every angle. On top of that, the camera itself could be rotated 360°. While the number of images is finite (especially because the resolution of a monitor is limited, so the variations can’t be too subtle), the process can be repeated with two bunches of bananas, a different-colored room, the camera set at a slightly different distance, etc. The entire process could be repeated yet again using pineapples or ferns or schnauzers. Already, that seems like a lot of images for a monitor that has a limited number of permutations.

And that’s just shooting around a stationary object in a closed room. Theoretically, the monitor would display an image of everything I’ve done today from every possible angle. Same with everyone else in the world. The monitor wouldn’t just display everything that happened in everyone’s entire life, it would display all the possible permutations. So, not only would there be an image of my 12th birthday party, there’d be a still from the movie The Godfather starring Leonardo da Vinci, Jesus and Koko the Gorilla.

All the images are distinguishable and thus should require unique combinations of pixels, but the sheer number of possible images seems like it would exceed the number of possible pixel permutations.

As the display gets more rudimentary, it seems to me that the possible pixel arrangements decrease faster than the number of possible images. Assuming a 640 x 480, 8-bit monitor (essentially a black-and-white television), it may no longer be possible to differentiate between a bunch of bananas in a pink room and one in a cyan room, but there would still be every variation of Abraham Lincoln being shot by John Wilkes Booth, clubbed by John Wilkes Booth, drowned in marmalade by the starting line-up of the 1974 Mets, and so on.

I can’t do the math to figure out how many possible images a 640 x 480, 8-bit monitor could display, but is it really enough to display an image of everything ever? (Not to mention a bunch of random static.) Or is my logic flawed somewhere?

Yes. The number of possible images in the world is infinite (or practically so). But we’re not talking about images in the world. We’re talking about images in a 640 x 480 pixel 256 color world. I believe the total number of unique images would be 256 to the power (640 x 480). At some point, you’re not going to have enough detail in this pixelated world to differentiate between objects that look slightly different in reality. But there’s nothing stopping you from claiming the image can represent both objects.
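
If you want a feel for how big that number is, here’s a quick Python sketch (my illustration, nothing from the thread) that checks the formula on a tiny display and then estimates the big case with logarithms:

[code]
import math

# Sanity check at a tiny scale: a 2-pixel, 2-color display can
# show 2^2 = 4 distinct images, so the formula is colors^pixels.
assert 2 ** 2 == 4

# Same formula for 640x480 at 256 colors. The number itself is
# enormous, but its digit count follows from a logarithm:
pixels = 640 * 480                                 # 307,200 pixels
digits = math.floor(pixels * math.log10(256)) + 1
print(digits)                                      # 739812
[/code]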

Theoretically possible, maybe. Far from practicality.
It’s like expecting a kazillion monkeys typing frantically on typewriters with the expectation of producing the complete works of Shakespeare, or perhaps the Encyclopaedia Britannica, or even the Oxford English Dictionary.

DAQ=SAR

You could also look at it from a much lower resolution. Imagine a 2-, 4-, or 16-pixel black-and-white camera and monitor. There aren’t a lot of possible combinations with these configurations, and it’s not hard to imagine every possible combination turning up in the random-monitor scenario. When you add color and more pixels, it’s the same basic scenario, just on a larger scale.
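
For the four-pixel case you can actually list every image. A Python sketch (hypothetical, just to make the point concrete):

[code]
from itertools import product

# A 2x2 black-and-white display: each pixel is 0 or 1, so there
# are exactly 2^4 = 16 possible images. Print them all.
for image in product([0, 1], repeat=4):
    print(image[:2])
    print(image[2:])
    print()
[/code]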

That points out both a strength and a weakness of digital media. There’s always some finite resolution. That’s nice for digital processing but often bad for realistic reproduction.

Driver8 got it right. For an 8-bit 640x480 display, there are 256[sup]307200[/sup] possible images, which is far more than the number of particles in the universe. Even seeing trillions of images per second for trillions of years on trillions of monitors, you wouldn’t come close to seeing them all. But it is technically a finite number.

Taking a simpler example, if you had a display the size of a taskbar icon (16x16) that could only display black & white, there would still be 2[sup]256[/sup] or about 1,157,920,892,373,161,954,235,709,800,869,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 possible images.

When my parents couldn’t grasp this concept (I had been talking about the monkeys typing Shakespeare), I wrote a program to randomly generate 4- and 5-letter words so they could see that it is just a matter of finite combinations.

We had a good laugh at some of the funny words, and of course, now and then we got a real one.
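
Something along these lines, presumably (a Python sketch, not the original program):

[code]
import random
import string

# Generate random 4- and 5-letter "words". Most are gibberish,
# but given enough trials, real words show up by chance alone.
for _ in range(20):
    length = random.choice([4, 5])
    word = "".join(random.choice(string.ascii_lowercase)
                   for _ in range(length))
    print(word)
[/code]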

I doubt the monitor can do it. A live model will ALWAYS look different than a ‘fake’ one.

When in a car, the outside will look different with the windows up or down.

Man, I’m so SICK of these remakes Hollywood keeps churning out!

Of course, that number can’t possibly be right; it’s divisible by 5, and 2[sup]256[/sup] is not.

So a couple technical points:

  1. Given infinite trials, the monitor will, with probability 1, display every image that it can display. The sample space here is sequences of combinations of pixels, not real-life images.

  2. Given infinite trials, an event that happens with probability 1 is not guaranteed to happen. The monitor may never display Abe Lincoln being drowned in marmalade by any professional football players.
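
To put rough numbers on point 2, here’s a Python sketch for the 16x16 black-and-white display mentioned earlier (the 640x480 case is the same idea with a vastly bigger denominator):

[code]
import math

# Probability that one specific 16x16 black-and-white image
# appears at least once in k independent random draws:
# P = 1 - (1 - 1/M)^k, where M = 2^256 possible images.
M = 2 ** 256
k = 10 ** 18                              # a quintillion draws
p = -math.expm1(k * math.log1p(-1 / M))   # numerically stable form
print(p)                                  # ~8.6e-60: effectively never
[/code]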

But who said that these trials are independent of each other? I think it’s being assumed here that the monitor will never repeat the same image.

The monitor can certainly display images of an infinite number of things. However, it definitely cannot display an infinite number of images.

Me: :smiley:
Everybody else: :confused:

The trick to seeing how this is possible is to realize that there are many, many, many different things that will, in 640x480 resolution with 256 colors, produce exactly the same image.

Let’s go to your banana example. Say I take a picture of a bunch of bananas. I then replace it with a bunch of bananas that is exactly identical to the first bunch, except that they are a very slightly different shade of yellow. In the real world, the colors are different, but when the monitor displays them, it has only 256 colors to choose from. It will have to take the closest yellow it can get, even if that’s not QUITE the real color… and so our two banana bunches, despite being ever-so-slightly different, produce the same picture on the monitor.
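
You can watch the collapse happen with a toy quantizer in Python (a sketch, assuming a simple 6-levels-per-channel palette rather than any particular monitor’s):

[code]
def quantize(r, g, b, levels=6):
    # Snap each channel to the nearest of `levels` evenly spaced
    # values -- a crude stand-in for a 256-color palette.
    step = 255 / (levels - 1)
    return tuple(round(round(c / step) * step) for c in (r, g, b))

bunch_one = (250, 221, 50)   # slightly different shades of yellow
bunch_two = (252, 224, 47)
print(quantize(*bunch_one))  # (255, 204, 51)
print(quantize(*bunch_two))  # (255, 204, 51) -- same on-screen color
[/code]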

Or, let’s say that the bananas are exactly the same… but there’s one little tiny brown spot that is SLIGHTLY smaller on Bunch One than it is on Bunch Two. The monitor, however, can’t represent “just a teeny bit bigger” - it can either do “one pixel bigger” or “exactly the same size”. So, again, the two pictures come out the same.

Another one - take a picture of a bunch of grains of salt on a table. Then, take a picture of the same number of grains of sugar in the same positions. They look quite different, once you get down to the microscopic level… but at 640x480, all the monitor can show you is little white grains, and the two pictures come out the same.

In that case, every image will be displayed within finite time.

Did you just miss the “about” or are you having some fun?

[pedant]
It looks like you missed two zeros in there, the exact value is 115,792,089,237,316,195,423,570,985,008,687,907,853,269,984,665,640,564,039,457,584,007,913,129,639,936 according to the Unix unlimited-precision calculator.
[/pedant]

But yeah. Really big. Really, really big. So big you need BIGGER LETTERS. YEAH, HERE WE GO: BBBBIIIIIIIGGGG.

But bigger.

Does that mean that a computer needs to process 256[sup]307200[/sup] bits just to display a 640x480 pixel 8-bit image?

No. That number is the total possible number of combinations. The total number of bits in a single screen image is 640x480x8, or 2,457,600.

The number of combinations (256[sup]307200[/sup]), according to my calculator, has 739,812 digits. Wow.

(calculated by: dc -e '256 307200 ^ p' | tr -d '\\\n' | wc -c
…which happened to use 2 minutes and 51 seconds of PIII-500 CPU time to come up with an answer) :smiley:
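
For what it’s worth, Python will grind out the same count exactly, though recent versions cap huge int-to-string conversions and need the cap raised first (a sketch; it takes a while, but far less than the PIII’s three minutes):

[code]
import sys

# Build 256^307200 exactly and measure its decimal length.
# Python 3.11+ limits big int-to-str conversions by default,
# so lift the cap where that setter exists.
if hasattr(sys, "set_int_max_str_digits"):
    sys.set_int_max_str_digits(800_000)
print(len(str(256 ** 307200)))   # 739812
[/code]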

I may be misunderstanding something, but I was under the impression that each pixel is only capable of displaying 3 different colors: red, green, blue.

Ok, after a little research, it’s obvious now that each pixel can display many different shades of each color.

It depends on how you define a “pixel”. When you’re talking about computer displays, one pixel is considered to be made up of three elements - red, green and blue. So a “640x480 display” actually has 640x480 red elements, 640x480 green elements and 640x480 blue elements.

Digital cameras are different, unfortunately. A “4-megapixel CCD” has 2 million green pixels, 1 million red pixels and 1 million blue pixels. (Or some combination of three colors that adds up to 4 million pixels.)
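
A little arithmetic sketch of the two conventions (assuming the 2:1:1 green-heavy split described above):

[code]
# Monitor convention: every pixel carries all three color elements.
width, height = 640, 480
print(width * height)        # 307,200 full RGB pixels
print(width * height * 3)    # 921,600 individual color elements

# CCD convention: each photosite counts as a "pixel" but sees only
# one color, typically twice as many green as red or blue.
ccd_pixels = 4_000_000
print(ccd_pixels // 2)       # 2,000,000 green photosites
print(ccd_pixels // 4)       # 1,000,000 red
print(ccd_pixels // 4)       # 1,000,000 blue
[/code]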