Yeah, both of those are blue to me. In the real dress, as I said in my first post, it looks like a very light blue to me (hence my saying “periwinkle” in my original post on the topic six years ago). As a photographer, I know it’s very common for white objects to take on a blue cast in certain lighting situations (particularly shade), hence my brain autocorrecting it to white. For me, the perplexing part isn’t the blue part of the dress; I can see that being interpreted as white or blue, but there’s no doubt there’s blue in it. It’s that goldish/brown color which blows my mind people see as “black” (or any shade of “gray”). It very much has color in it, to my eyes, as confirmed by checking in Photoshop. But, regardless, the blue-and-black people are correct based on the original dress.
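For anyone without Photoshop, that eyedropper check is easy to reproduce. A minimal sketch (the RGB value below is a hypothetical stand-in for a sample from the dress’s darker stripes, not taken from the actual photo): a pixel with zero HSV saturation is a true gray, so any measurable saturation means the patch genuinely “has color.”

```python
import colorsys

def saturation(r, g, b):
    """HSV saturation in [0, 1]; 0 means a pure gray/black/white pixel."""
    return colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[1]

# Hypothetical goldish-brown sample, similar to what an eyedropper might read.
stripe = (120, 92, 52)

print(saturation(*stripe) > 0.1)  # prints True: clearly saturated, not a gray
```

Anything an eyedropper reads as R = G = B would return a saturation of exactly 0; the stripes plainly don’t.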
The black bit looks orange/brown for sure, but it looks like the picture is oversaturated, causing black to appear brown, which is why I describe it as blue/black. I.e., the color in the picture isn’t black, but it looks like it SHOULD be black.
They don’t at all to me. I assume they are the same colours, but it’s hard to believe it. Our visual system isn’t designed to give us literal information - this is light of such-and-such a hue and brightness - but to tell us what is really there, no matter how it’s lit or how near or far it is.
Both of the xkcd ones look white and gold to me. My brain interprets the one on the right to be brightly backlit, with a white and gold dress in shadow.
I guess the girl on the right is Asian, though. She has a golden-skinned face.
Just as your speakers/sound system matter a lot for Laurel/Yanny, your display and its settings matter a lot for this.
On my screen, the blue part is too dark and the washed out part too light for me to interpret one as causing the other. And the black part, while not completely desaturated to me, is enough darker than the blue that I can’t interpret it as gold in any way. It’s a deep brown that I would interpret as how black sometimes appears when it’s faded or washed out.
Ok, if I stare at the dark blue space around the dress on the left in the XKCD link until my eyes get unfocused, I can kinda imagine it being a paler blue than the one on the right, like a robin’s-egg blue, but I don’t think that’s what all y’all mean.
I did.
It’s not really about colors at all. Seeing different colors is a side effect of the compensation that our brain does for the ambient lighting.
To my mind, the real question is: what are the contextual clues that our brains are using to determine if this is a white-and-gold dress seen in shadow, or a blue-and-black dress seen in bright light and washed out?
Objectively, the latter seems the obvious choice, since we can look at the background and pick out various overexposed objects like the floor, table, and window. But we see the color of the dress intuitively, and our brains are probably using different clues to infer the situation. But which ones? Why are different people’s brains using different clues? And why does this image specifically seem to exactly straddle a line between seeing it one way or another (even for the same person)?
I just noticed the number of that comic. I guess in 1492 XKCD explained a dress of blue.
Well, he doesn’t answer my question, but he does present the same claim I’m making in an engaging way. In fact what he says is that none of the hypotheses (as to why this image seems to straddle a line between possible interpretations) have had any science done on them.
Our brains pretty clearly use more than just the overall color of a scene to determine the white balance. Consider this famous illusion:
A and B are the same shade of gray, but B looks lighter. Think about what we’re doing subconsciously, though: B looks lighter because it appears to be on a white square. This requires our brains to have the concept of a checkerboard in the first place, with alternating light and dark squares. The scene is framed from a 3/4 view, and there is a cylinder in the corner. The scene is lit from the right, a little towards us, and the cylinder casts a shadow on the board.
All of this is completely intuitive and largely subconscious. And yet required for the illusion to work at all. A dumb computer program processing the image pixel by pixel “sees” A and B as exactly the same shade, because it knows nothing about the scene. But our brains are constantly analyzing the scene for clues to how it’s “supposed” to be and adjusting our perceptions to fit.
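That “dumb computer program” comparison is easy to make concrete. A toy sketch, using hypothetical sample values for pixels picked from squares A and B (in the actual illusion both sample to the same mid gray):

```python
def same_color(p1, p2, tolerance=0):
    """Compare two RGB pixels channel by channel, with no scene context."""
    return all(abs(a - b) <= tolerance for a, b in zip(p1, p2))

square_a = (120, 120, 120)  # hypothetical sample from the "dark" square A
square_b = (120, 120, 120)  # hypothetical sample from the "light" square B

print(same_color(square_a, square_b))  # prints True: identical to the program
```

The program reports “identical” because, at the pixel level, they are; the disagreement with our perception comes entirely from the scene analysis our brains layer on top.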
If you built that model in real life (an actual cylinder on a checkerboard, not attempting some other trick), then B would in fact have been painted a lighter color than A. That’s just the physics of shadow and light. Our brains would be making the correct determination in that case.
Getting back to the dress, we’re presented with a scene that’s much less clear than the checkerboard. Where is the light coming from? Is the washed-out background actually evidence that the dress is washed out, or is it far enough away that it’s expected to be different? Or perhaps evidence that the dress is actually back-lit? Is the background just too fuzzy and ambiguous for our subconscious to make sense of at all, and so we randomly pick one interpretation or another?
To me, these different factors actually lined up with reality: I see it as a blue-and-black dress in washed out lighting, and that’s what it in fact was. But that doesn’t necessarily mean my brain was responding to the right clues; could be I just got lucky. It would be interesting to figure out what those clues actually are.
For me, I think it’s the light bloom around the top right-hand side of the picture that made me think it was strongly backlit, like from a window to the outside, and that the dress is sitting in a darker area relative to the background. I’ve always found this photo interesting because while white/gold is the first thing that comes to mind when I look at it, I can actually shift myself to see blue/black, particularly if I scroll to it in a way that shows me the bottom of the photo first instead of the top.
But I don’t understand all this talk about a “washed out background” etc. For me, absolutely nothing changes in the perceived colors of the dress if I remove all background context, i.e. just cover up everything else and look at just a small part of the dress itself.
Whereas in the checkerboard a few posts above, if I zoom in super-close to the A or the B and remove all context, just look at the shade of the letter itself, obviously I then see that the two letters are exactly the same shade.
So, possibly you fall into the background-is-too-ambiguous-to-make-sense-of camp. When I cover up the background, the dress looks pretty much like its unadjusted color: a very light blue on brown. That doesn’t match either common interpretation.
If I’m not misunderstanding the circumstances of the photo, I believe what’s happening is that the entire photo is completely washed out, not just the background. So expecting us to see it in its true colors is like expecting us to assess the color of something with the lights off. Nice gotcha, woo hoo.
Perhaps interestingly, I categorize myself in the ‘blue/brown’ category, since that dress is definitely not white. However it appears that I reach this conclusion by making the same error as the ‘white/gold’ people do - I still see the colors as light, rather than the dark colors that (I think) they truly are. (Are in real life, not the picture.)
Same here. I’d say the white/blue is blue violet, and the gold/black is halfway between gold and olive drab.
This isn’t about a “gotcha”. It’s about the variability of human perception.
And that also means there’s no “error,” and no one needs to worry that they are being “tricked.” It’s not wrong to perceive something in a picture the way the imaged object would appear in good light. It’s a natural feature of human vision. It’s also not wrong to perceive the dress part of the image as having the colors the actual pixels have.
There’s also a sneaker from a couple of years ago but that doesn’t seem to have as much traction.
That one I literally see as gray and teal, but I can see it as a muted pink and white. If it really is pink and white in real life (which I’m going to assume it is), then that’s a really strange color cast that I’m not used to seeing as a photographer. Typically you get amber/yellow, blue, green, or magenta color casts from various light sources or lighting conditions. This teal is odd, but I’m guessing it must have been taken in mixed lighting conditions. Or perhaps the camera was in auto white balance mode and the software guessed quite wrong about where to set the white point.
ETA: The weird thing is that the more I look at the big version of the photo (the one in the second link), the more it looks pink & white as my brain forces the shoelaces to go white. It’s actually pretty cool. I don’t get the effect with the dress, but here it started out very strongly teal & gray and slowly faded into pink and white. I can “reset” my white balance by looking at the pink shoe or at the sides of my screen (which are white or neutral gray). The effect would be stronger if the webpage itself didn’t include white reference points. In the picture, you can judge the laces against the side of the webpage and see they’re not white.
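That “reset” is roughly what software auto white balance does. A toy sketch under the classic gray-world assumption (the scene should average to neutral gray); real camera pipelines are far more sophisticated, and the pixel values here are made up to mimic a teal cast:

```python
def gray_world(pixels):
    """pixels: list of (r, g, b) tuples. Scale each channel so the
    image-wide average becomes a neutral gray (gray-world assumption)."""
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(avg) / 3
    gains = [gray / a if a else 1.0 for a in avg]
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]

# Hypothetical teal-cast samples: green/blue channels inflated relative to red.
cast = [(80, 140, 150), (120, 180, 190), (60, 120, 130)]
balanced = gray_world(cast)
```

After correction the channel averages come out equal, which is exactly the kind of adjustment that can push a teal-gray lace back toward white, and also why such software guesses badly when the scene isn’t actually gray on average.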
Just look at the thumb holding the shoe. Unless it belongs to a Smurf or someone going hypothermic, there is something majorly wrong with the color in that photo.
The thumb I didn’t even notice. It doesn’t look all that Smurfy to me, just drab. But maybe that’s because my brain already white balanced it.
ETA: OK, that is clearly what’s going on, because when I look at the side-by-side photos, the hand looks pallid, but when I look at the shoe by itself, it shifts. It’s interesting how I don’t see the teal and gray so much anymore and anytime I look at the shoe now, it’s automatically corrected to pink and white.