When I put those values into Paint, I get a yellow/brown, which is what I see in the picture.
Gold and (white in outdoor shade)/bluish.
I can’t see any black no matter where I’ve scrolled or what angle I look at it. This is on two different computers and a cell phone. The link in post #7 looks nothing like what I see in the viral photo.
I chalk this one up to my running theory–at least 25% of people on the internet are trolls of some stripe.
And then adjusted to completely wash out and shift the colors.
The takeaway here is that the vast majority of people on the various webz have crappy, poor-rendering, low-quality imaging systems that have never been color-corrected and are probably set to some variation of ‘torch mode’ - the hyper-bright mode used for TVs in stores. (If they aren’t set to “bat cave mode” to save energy.)
I never saw it as anything but white and gold but could clearly see it had terrible technical values, with the daylight beyond totally blown out. I am looking at it on completely color-corrected screens, some using technical correction systems and others hand-adjusted to match those.
My daughter swore it was black and blue… and guess what, her ChromeBook and phone, both fiddled to show things the way she likes them, were muddy and dark enough that I could see what the “black and blue” crowd was seeing.
Very much 250 million people and the elephant, only a subset of whom were seeing the image in its ‘correct’ - blown out, adjusted, screwed up gold/white - colors. The rest were, in the imaging equivalent of a freak accident, seeing it as it would have looked if taken with a good camera and not adjusted. It’s these things that hit the narrow cracks that cause confusion. Otherwise, everyone just assumes the color balance they see is “reality.” :dubious:
I can only see the image as white and gold. No playing with the monitor changes my color perception.
I saw blue and faded black. As in, the black actually looks bronze, but I can tell it’s supposed to be black. I also think it’s people’s monitors.
So if someone asked you what colors you see, would you say “blue and bronze” or “blue and black”?
Bolding mine, and as others have pointed out (and posted links to), there appear to be 2 different mental compensations going on. The first is that white objects often look blue in poor lighting. As an example, I found a picture of this statue on the White House lawn at dusk. Photoshop clearly says that the marble base is blue, but my brain knows that if I were standing there in real life at dusk, I’d see a white statue base as blue. Therefore, it tells me I’m seeing white.
The same thing is happening in this picture of the dress. I see blue, but my brain dusts off a tried-and-true compensator that tells me it’s actually a white dress shot in poor light.
Thing is, there’s no such tried-and-true mental compensator for seeing royal blue as a pale blue, nor seeing full black as some shitty brown/gold color, because that doesn’t happen in nature. That only happens with low quality digital cameras in poor light conditions. So I get it now, intellectually, why a black/blue dress can look like that, and when Wired adjusted the white balance to make it look more white/gold you can clearly see that they’re adjusting the white balance in the wrong direction. But these are all tricks of photography that my brain has had to learn over the last 10 or 15 years, unlike the original “white looks blue at night” compensator that has been around in my DNA for hundreds of millions of years.
eta: Once my brain has told me that the blue dress is really a white dress in low light, there’s no way to see the gold as black because real life doesn’t work like that. If I’m seeing white as blue, then I’d see gold as, you know, bluer gold. So when my brain adjusts the blue back to white, it adjusts the gold in the picture to a brilliant gold. No way it goes from there to black without understanding crappy digital cameras.
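The white-balance intuition above can be sketched numerically. This is a minimal illustration of a simple per-channel gain correction (von Kries-style): if you assume a given pixel “should” be neutral, you scale the channels to make it so and apply the same gains to everything else. The RGB values here are illustrative guesses, not pixels sampled from the actual photo:

```python
# Minimal sketch of per-channel white-balance correction.
# The sample RGB values below are illustrative guesses, not
# measured from the actual dress photo.

def white_balance(pixel, reference):
    """Scale each channel so that `reference` maps to a neutral gray."""
    target = sum(reference) / 3          # the gray level the reference should hit
    gains = [target / c for c in reference]
    return tuple(min(255, round(c * g)) for c, g in zip(pixel, gains))

# Hypothetical swatches: a pale-blue "dress" pixel and a brownish "trim" pixel.
dress_blue = (130, 150, 190)
trim_gold = (120, 100, 60)

# If the brain assumes the scene light was bluish (i.e. it treats the blue
# pixel as "really white"), correcting for that assumption pushes the trim
# toward a warmer gold -- consistent with the white/gold reading.
print(white_balance(dress_blue, reference=dress_blue))  # -> (157, 157, 157)
print(white_balance(trim_gold, reference=dress_blue))   # -> (145, 104, 49)
```

Shifting the gains the other way (treating the trim as the neutral reference) pushes the interpretation toward blue/black instead, which is the two-directions ambiguity described above.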
Crowded round mine.
Uhhh…
I probably would *say* blue and bronze. If I were using it as a painting reference, though, I’d make it black, because it looks like it’s supposed to be (given the degree that the blue looks faded)…and because that brown does NOT look good with that blue.
I still have a hard time picturing how people’s monitors are so calibrated to make it look whitish, given that I know mine is already overbalanced towards white (you can’t see outlines on some white objects, or text input boxes, for example).
Then you probably have a pretty good monitor. Not all are, especially in these days of most people using mobile devices, which have an even wider range of variance.
Most desktop monitors also have a fixed range of viewability - people outside the viewing cone might be seeing a truly different image.
It does come down to interpretation of relative colors, and it’s possible that a flat photo of the image with precisely balanced characteristics might have the same ambiguity. But everything I’ve seen from my own (extensive) experience in imaging technology and articles like the one referenced is that it’s a combination of:
[ul]
[li]A crappy camera trying to auto-correct the image;[/li][li]Excessive manipulation after the shot, trying to lighten it;[/li][li]A combination of color values that fall into a perceptual niche;[/li][li]…all exacerbated by wide variance in display (monitor settings, monitor quality, view angles, etc.) and the tendency of people to “correct” what they’re seeing according to individual perceptual strengths. (If someone habitually watches a TV with greenish skin tones, it will shift their perception of color balance and they will confuse green and skin colors in an ambiguous setting.)[/li][/ul]
It’s just a 2015 optical illusion needing a whole lot of technical/perceptual levels to support it. Kind of like the weird stuff people used to see in early full-color printing, because some of it was “real” and some was perceptual variance.
ETA: Keep in mind that monitors have two or three axes of variation just in the controls, and more imposed by the physical characteristics of the monitor and the corresponding settings of the video drive system (card or chip). Just diddling your brightness up and down won’t cover the whole range, and if your monitor setup tends towards one extreme or the other, you might not be able to get to the opposite end.
As explained, it’s not about monitor calibration; we see blue, but as I posted above, white things look blue in low lighting. So our brains tell us that even though it looks blue right now, if I carried it over into direct lighting it would probably look white.
FWIW, I first saw the picture on my Galaxy S4 with its beautifully calibrated AMOLED display, which is going to render colors better than any conventional LCD monitor and probably much better than most CRTs of yore, and it looked pale blue/mustard gold, exactly like photoshop tells us it is.
The light in the background is bright. The dress is dark and contrasts against it. How do people see white?
Look in the lower left. There’s white cloth with black spots. Like a cow.
I’m almost scared to ask what “white dress” people think those colors are.
Here’s the original of the statue picture I posted above. The White House in the background is white, the statue base is dark and contrasts against it. It must be blue in real life, right?
As stated, we get that it’s actually blue pixels, but the question isn’t what color the pixels are, it’s what color the dress is.
I’m more bothered by the black. Here are the actual colors from the original picture. It’s got nothing to do with monitor calibration. Nobody is seeing that as royal blue and deep black without context.
So, for the people who somehow see dark blue and black: when you look at the three-picture image in this Wired article, do all three versions of the dress look the exact same color to you?
I have looked at this damned dress on my laptop, on my desktop, and on my smartphone.
My conclusion from examining The Dress on multiple different electronic devices is that there is a perfectly simple, logical, scientific, and consistent explanation for these perceptual differences:
Approximately 71% of the so-called “humans” on this planet are actually fucking aliens from outer fucking space.
*YOU’LL NEVER PROBE ME YOU ALIEN FREAKS!!! YOU KEEP THOSE TENTACLES TO YOURSELVES!!! LET ME INTRODUCE YOU TO A LITTLE EARTH INVENTION CALLED A “SHOTGUN”!!!1!!1!!!
EARTH FOREVER!!! WHO’S WITH ME, BLUE-AND-BLACK-definitely some kind of darkish color, maybe a little off-black, could just be a darker blueSEEING REAL HUMANS WHOSE EYES HAVE EVOLVED TO SEE UNDER THE LIGHT OF A YELLOW SUN!!!*
[del]“In our obscurity, in all this vastness, there is no hint that help will come from elsewhere to save us from ourselves.” – Carl Sagan[/del] They’re fucking HERE already! And their color perception is different from ours!!! AND THEY ALREADY FUCKING OUTNUMBER US!!!
I was about to do this myself, but you beat me to it. Exactly. While I understand the “white” where people actually see “blue,” it’s pretty clear that it’s not white in the picture itself, especially since you have a reference white outside of the frame (the webpage itself.) But how that bronze color becomes neutral, much less black, is curious to me. I have no doubt people actually see it that way–like I said, there is disagreement on the perception even on the professional photographers page I visit, and these are people who (like I) work with photographs and color correction almost every single day. And these people all scored perfectly or near-perfectly on the X-Rite color challenge and are using calibrated monitors. So they’re not color idiots or anything. The brain can do funny things with color perception.
I’ve got to agree, you’ve got it exactly. I noticed that if I relax my vision and observe the dress I can see the blue and black. It takes concentrated UNfocusing to do so.
I believe this is because of the glaringly bright white spots to the right of the dress. If I block them out in photoshop the dress color shifts dramatically.
Missed edit window: I took steronz’s color swatch and put the true neutral gray versions of the colors next to them for reference.
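The “true neutral gray version” of a swatch can be computed as a gray of equal luminance. Here is a minimal sketch using the Rec. 709 luma weights; the sample colors are placeholders standing in for the sampled dress pixels, not the actual values:

```python
# Sketch of pairing a color swatch with its "true neutral gray" twin,
# as described above: replace the swatch with a gray of equal luminance.
# Uses Rec. 709 luma weights; the sample colors are placeholders, not
# the actual sampled dress pixels.

def neutral_gray(pixel):
    """Return the equal-luminance gray for an (R, G, B) pixel."""
    r, g, b = pixel
    luma = round(0.2126 * r + 0.7152 * g + 0.0722 * b)
    return (luma, luma, luma)

print(neutral_gray((130, 150, 190)))  # pale-blue swatch -> (149, 149, 149)
print(neutral_gray((120, 100, 60)))   # brown/gold swatch -> (101, 101, 101)
```

Putting the gray next to the original swatch makes it obvious how far each color sits from neutral, and in which direction.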
OK, if any part of that dress is either white or black, there is something wrong with my monitor or with my brain.
The main fabric is a lightish-to-mid blue. The trim is mid-brown.
How the…what the…White?! Black?! Where? How?
Now I can ONLY see white and gold. I am going crazy!