In a recent thread, someone noted that a computer monitor programmed to display random combinations of pixels will eventually produce an image of absolutely everything. So, just by chance, the monitor will reproduce the Mona Lisa, display an image of George and Martha Washington on their wedding day and even produce a picture of me sitting at my computer writing this message. This all seems plausible and unbelievable at the same time.
Although there are a lot of possible pixel combinations, the number is not infinite. However, it seems to me that the number of possible images is infinite.
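To make the first half of that concrete, here's a toy sketch in Python (the 2 x 2, 1-bit display is made up purely for illustration): a screen with a fixed number of pixels, each with a fixed number of possible values, can only ever show values-per-pixel raised to the number-of-pixels distinct frames.

```python
from itertools import product

# Made-up toy display, just for illustration: 2 x 2 pixels, 1 bit each.
pixels = 2 * 2
values_per_pixel = 2

# Every possible frame is one assignment of a value to each pixel,
# so the full list has values_per_pixel ** pixels = 2 ** 4 = 16 entries.
all_frames = list(product(range(values_per_pixel), repeat=pixels))
print(len(all_frames))  # 16 -- a finite list, no matter what the frames depict
```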
Consider taking a photo of a bunch of bananas suspended from the ceiling. The camera can move around the bananas 360° horizontally and 360° vertically, taking a different image from every angle. On top of that, the camera itself could be rotated 360°. While the number of images is finite (especially because the resolution of a monitor is limited, so the variations can’t be too subtle), the process can be repeated with two bunches of bananas, a different-colored room, the camera set at a slightly different distance, etc. The entire process could be repeated yet again using pineapples or ferns or schnauzers. Already, that seems like a lot of images for a monitor that has a limited number of permutations.
And that’s just shooting around a stationary object in a closed room. Theoretically, the monitor would display an image of everything I’ve done today from every possible angle. Same with everyone else in the world. The monitor wouldn’t just display everything that happened in everyone’s entire life, it would display all the possible permutations. So, not only would there be an image of my 12th birthday party, there’d be a still from the movie The Godfather starring Leonardo da Vinci, Jesus and Koko the Gorilla.
All the images are distinguishable and thus should require unique combinations of pixels, but the sheer number of possible images seems like it would exceed the number of possible pixel permutations.
As the display gets more rudimentary, it seems to me that the possible pixel arrangements decrease faster than the number of possible images. Assuming a 640 x 480, 8-bit monitor (essentially a black-and-white television), it may no longer be possible to differentiate between a bunch of bananas in a pink room and one in a cyan room, but there’d still be every variation of Abraham Lincoln being shot by John Wilkes Booth, clubbed by John Wilkes Booth, drowned in marmalade by the starting line-up of the 1974 Mets, and so on.
I can’t do the math to figure out how many possible images a 640 x 480, 8-bit monitor could display, but is it really enough to display an image of everything ever? (Not to mention a bunch of random static.) Or is my logic flawed somewhere?
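For scale, the raw permutation count can at least be pinned down with a few lines of Python; this is just a back-of-the-envelope sketch that assumes “8-bit” means 256 shades of gray per pixel.

```python
import math

# Back-of-the-envelope count for the monitor described above.
# Assumption: "8-bit" means 256 possible shades per pixel.
width, height = 640, 480
pixels = width * height            # 307,200 pixels
shades = 2 ** 8                    # 256 values per pixel

total_frames = shades ** pixels    # exact count of distinct displayable frames

# The number itself has hundreds of thousands of digits, so just report its size.
digits = math.floor(pixels * math.log10(shades)) + 1
print(f"256^{pixels:,} distinct frames -- a number with {digits:,} digits")
```

So the count of pixel permutations is finite, but it’s a number with roughly 740,000 digits, which is part of what makes it so hard for me to compare against a mental catalogue of “every possible image.”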