What color is this dress?

Yes? Ish?

The left side picture is a crappy, yellowish, muddy filter on what is obviously a blue and black dress - the whole thing is faded out, the blue is greyish, the black is rusty, but I KNOW it’s intended to be a picture of a dress that is blue and black. (As a possible factor, I am a designer, and I am very fashion conscious - I MAY have actually seen the original dress (or the original design that this is a knock-off of) before this whole kerfuffle, but I don’t consciously remember seeing it.) If I am really charitable, I can see how someone might think that the blue is white with a blue light, but that doesn’t make sense to me given how yellow and overexposed everything in the background is.

The middle picture is so obviously blue and black that I can’t even fathom how someone would see gold and white - the black is still a bit on the rusty side, but the blue is BLUE - a bit faded still because of the exposure, but so clearly totally blue that I really thought I was not finding the correct picture to look at when the internet exploded.

The right side picture is almost as dark and rich and saturated as the original dress actually is.

All of them are blue to me, you guys. They just are.

The dress has never changed color for me. I’ve looked at it on three different screens, scrolled up and down, in different lighting conditions: it is always blue and mustard (or periwinkle/bronze, lavender/gold - those color families). Now, I can see how some people can be tricked into seeing the blue as white, because our brain is trying to color correct for the admittedly whacked out lighting conditions and extraordinarily shitty photography, but I have no idea how black would ever enter the picture from this photo alone. Maybe if you were a purveyor of shitty cameraphone pictures to the point that your brain automatically corrects such bad photos without you realizing it (I’ve seen really bad photos exposed this way, but it’s such a rarity for me that I’d never automatically assume that’s what happened here).

Don’t get me wrong, I’m talking about the color that is showing up in the photograph itself.

I am a trained artist, so I am supposed to be able to see colors “as they really are”, and in this case the eyedropper results from Photoshop say that the colors, isolated, are absolutely blue and mustard within the pixels (not counting artifact outliers), so I’m seeing the “true” colors of the photograph without my brain trying to color correct it to a real-life object (which I assume other people’s brains are doing). So my brain doesn’t sit here trying to reinterpret lighting conditions but instead just sees the photo more as a flat object that is what it is.

I don’t even see the optical illusion. Every photo of it, except for when it has been purposefully edited to be either darker or lighter, has looked exactly the same. Light blue and bronze. As clearly shown by the Photoshop eyedropper tool.
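For anyone curious, the eyedropper check described above is easy to sketch: sample a pixel, convert it to HSV, and see which hue family it falls in, with no surrounding context at all. A minimal sketch (the RGB values are illustrative ballpark figures, not exact samples from the photo):

```python
import colorsys

def describe(rgb):
    """Classify a single RGB pixel (0-255) by hue and saturation,
    with no surrounding context -- what an eyedropper 'sees'."""
    r, g, b = (c / 255 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    hue_deg = h * 360
    if s < 0.08:
        return "near-neutral (grey/white/black)"
    if 200 <= hue_deg <= 260:
        return "blue family"
    if 20 <= hue_deg <= 60:
        return "gold/brown family"
    return f"other hue ({hue_deg:.0f} deg)"

# Illustrative values in the ballpark of what people report sampling:
print(describe((120, 130, 180)))  # light "blue" area -> blue family
print(describe((115, 95, 55)))    # "black" lace area -> gold/brown family
```

Isolated like this, the pixels really do read as light blue and muddy gold; the white/black disagreement only appears once the brain factors the presumed lighting back in.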

Again, anyone who looks at that and thinks it’s white has something wrong with either their eyes or their monitor. I’ve taken those online color tests mentioned above on this monitor and gotten 100% correct.

ETA. Macca, exactly. Exactly. I am seeing the photo how it actually is.

It’s a schooner.

Something else that occurred to me is that software/apps that display images - from browsers to pic viewers to editors to gallery organizers - tend to have a LOT of fugwit features. That is, they only display images - *just* images - and fugwit them according to their own arcane interpretation of how it should be presented. You might open the app and have borders, background and menus that are displayed exactly the same as any other, but the actual image presentation has been “cooked” and adjusted and enhanced and “matched” to your display, and even adjusted according to some perceptual curve the maker or the default settings think is better.

I have a triple monitor setup, all Samsungs on Nvidia cards, and two are technically calibrated and the third is hand-matched. A number of apps - Windows Picture Viewer among them - display images quite “normally” on two monitors, and then throw a distinct yellow/sepia cast on the image in the third monitor. The app frame does NOT change. Only the image displayed does, because WPV is being smart and helpful and fugwitting the image for me.

Another factor that is probably screwing with many people’s view. They see whites and blues and blacks normally everywhere but in a displayed image - and it may be the muddled characteristics of this image that make the software work so hard to “fix” it.
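To make that “fugwitting” concrete: even a single tone curve applied per-image shifts every pixel while the app chrome stays untouched. A toy sketch (a plain gamma adjustment; the function name and values are made up for illustration, not taken from any real viewer):

```python
def apply_gamma(pixel, gamma=1.3):
    """Apply a simple gamma curve to one RGB pixel (0-255).
    gamma > 1 darkens midtones; this stands in for whatever hidden
    'enhancement' curve an image viewer might quietly apply."""
    return tuple(round(255 * (c / 255) ** gamma) for c in pixel)

# The same pale bluish pixel before and after the viewer's "help":
print(apply_gamma((120, 130, 180)))  # darker overall, still blue-dominant
```

A curve like this is invisible in the app's own UI, which is exactly why two monitors can show the “same” image differently while the window frames match.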

Exactly!! The background (on the right) is white and gold, so the contrast is even more obvious.

I totally get the “white looks blue in low light” thing, but there are obvious cues here that should be telling you the opposite.

What someone needs to do is sell a dress that is viewed in white and gold from one angle and blue and black from a different angle (I think the technology already exists). It would be an instant bestseller and best practical joke of the week.

It’s not a monitor thing. We all stare at screens for hours a day, and we understand what images look like. Just stop with this line of thinking, that’s not what’s going on.

Basically, we are used to compensating for the fact that different colors look different under different lighting. Without that internal white balance, moving from inside to outside or from noon to dusk would be wildly disorienting as everything would seem to change color all the time. Cameras just see light in front of them, so without color correction, stuff looks strange. Even the best photographer with the best camera needs to use color correction for most images.

The cues that we use to calibrate that white balance in this image are ambiguous. It can look like either:

  1. A somewhat normally exposed image taken in cool, daylight-type light, giving it a bluish cast

OR

  2. A low-contrast, over-exposed image that washes out deep colors, giving dark colors a dull, light feel

That’s all there is to it. No shitty eyesight or bad monitors needed. It’s just a non-color corrected image that doesn’t have the cues we normally use to make sense of color.

Well, I finally looked at it on my laptop instead of the desktop. With the screen angled as I see many people keep it, the dress could definitely be called blue and black. With the screen vertical and my line of sight perpendicular to it, it looks blueish-white and golden brown.

Now, if after I said “this picture is blue and gold, obviously” they said, “no, what color is the dress REALLY, in REAL LIFE, if this is a bad photograph?” I would have to think about it a while. My first instinct is the “easy” way of reinterpreting the blue as white but I can see the background is so overexposed the colors must be washed out instead. I would actually have probably said blue and forest green if you asked me to reinterpret what the colors would be on a real object in a badly exposed photograph. I would not have realized that the overexposure was so extreme that the original dress was dark blue and black. Not enough experience with photography. But this isn’t what my eyes are “naturally” seeing, I’d have to mentally jump through these hoops.

Sorry, I should have been more clear. I’ll concede that the lighter area is not pure, extra snow white, maybe even a tinge of blue. However, in the Wired link, you are saying that the lighter parts in the right picture, and the middle picture are the EXACT same shade? And you are saying that you see the darker areas as the EXACT same shade in all 3 pictures?

I showed this to people at my work, and when I placed the picture alongside an ACTUAL black item (my laptop), they instantly changed their minds and said it was gold colored. And after I moved it away, they still saw gold. Anybody else try that?

It’s not really that white looks blue in low light per se. It’s more that true white light, when yellowish tungsten light is also in the scene, will appear blue because your eyes are compensating for the yellow of the tungsten light and trying to push that part of the scene toward white.

Take, for example, a piece of pure white paper. Look at it in daylight. It is white. White balance your camera to daylight, and the paper will look white in the photograph. Now, stick this piece of paper in deep shade. Look at it. Shade lighting (it doesn’t matter how bright it is - it could be perfectly bright) is much bluer. However, your brain compensates for this and you perceive the color as white. Now take another picture with your camera still set to daylight white balance. It will look very blue compared with the original photo.

Same thing happens with incandescent (yellow) light. Your eye/brain will perceive the white paper as white, but the photograph will show that white paper as being quite yellow.
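The paper example can be sketched numerically. A von Kries-style correction just divides each channel by the assumed illuminant color, so the same bluish pixel comes out white or blue depending on which illuminant you assume (the numbers below are illustrative, not measured):

```python
def white_balance(pixel, assumed_illuminant):
    """Von Kries-style correction: divide each channel by the assumed
    illuminant's value, so the illuminant itself maps to pure white."""
    return tuple(min(255, round(255 * p / i))
                 for p, i in zip(pixel, assumed_illuminant))

# White paper shot in bluish shade with a daylight white balance might
# record as something like this:
pixel = (180, 200, 240)

# If your visual system assumes bluish shade light, the paper corrects
# back to white:
print(white_balance(pixel, (180, 200, 240)))  # (255, 255, 255)

# If it assumes neutral light instead, the paper stays blue:
print(white_balance(pixel, (240, 240, 240)))  # (191, 212, 255)
```

That assumed-illuminant term is exactly what the dress photo leaves ambiguous, which is why the two corrections (and the two camps) diverge.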

While I see the color as bluish/periwinkle, my initial guess would have been that this is a dress illuminated by a white or bluish light source (like window light), and that the “blue” is supposed to be a more neutral color. In fact, you can take a white and bronze dress and have it come out pretty much exactly like this picture in that sort of set-up. In mixed color-temperature lighting situations (which are fairly common), it is often difficult to determine what the actual colors of the objects in a frame are without having other references in a scene that are illuminated by the same light to compare with.

They just showed this on the local news. (Otherwise, I wouldn’t have clicked on this thread.)

The image they showed on TV definitely looked white-and-gold. They showed a different image and it looked blue-and-black. Clicking on the link in the OP, both images appear blue-and-black.

Since we’re looking at images of images (of images), who knows what it looks like in real life? I’m guessing this is a viral marketing thing.

Not the same shade no, but I see them all three clearly and instantly as blue and black.

The blue is clearly blue in all three pictures, from a pale winter sky blue to a really rich royal blue.

The black is much more affected by the image quality, it’s actually dingy yellowish in the first image, more of a nasty really dark burnt orange muddle in the second, and getting close to true black in the last, but they all read INSTANTLY as black to me, and I have to look at them to see what color they “really” are in the image.

Again, for me this is likely due to fashion background - once I identified the blue as blue, from experience, the dark lace HAS to be black. I wasn’t thinking this consciously tho - it just is what it looks like.

If it’s a marketing trick, it’s a bloody amazingly tricksy one, and a lot of effort for a really mediocre and frankly already outdated dress.

We can all agree that AB has superior monitors and superior knowledge of color though, right? That’s the main thing here.

No, it’s not. Is that how the news presented it? :dubious: It’s a real pic of a real dress that someone’s mom wore to a wedding. The mom had taken the pic and sent it to the girl to show her what she was thinking of wearing. It caused a stir because some people were like “Why would you wear a white dress to a wedding?” and other people were like “A white dress? But it’s blue!”

Here’s the story.

Sorry, but I don’t really get what you mean here. You think pale winter sky blue and rich royal blue are the SAME exact color?

And you think dark burnt orange muddle is the SAME exact color as black?

If not, when asked what color the dress was in the picture, why wouldn’t you say “dark burnt orange and pale winter sky blue”? Why would you say dark blue and black?

That makes sense, except what is being marketed, and by whom? Or *to* whom, for that matter? I don’t even recall who first brought it up or how/why it first got spread. If there is a hidden agenda, the perpetrator is doing a helluva job, because every damn body (here at least) is talking about it.

I first caught it on my local news but didn’t have time to pay that much attention except to see the picture. They featured it on the Today show and included a model wearing the actual dress. I’ve looked at several different websites, and every incarnation save the dress worn by the model looks clearly, no doubt about, whatchoo talkin’ 'bout Willis white and gold.

I can’t see the black and blue version no matter what trick I try. I don’t know whether to fear for my eyesight / sanity or to just enjoy being part of one big ass magic trick:confused:

There is probably that going on. (Although not for everybody.) Even in the darkest picture on the right, the lighter parts of the “black” in the dress (the right side of the neckline) still look muddy bronze, not neutral, to me. Poking around in a couple of places in Photoshop it is, indeed, a muddy brown, not black.

Here it is side-by-side, with the Wired version on the left and the “true neutral” dark gray version on the right. Maybe the differences show up more because all my monitors are calibrated, but, to me, I see brown in the picture on the left (and Photoshop confirms this) and black/gray on the right.

Yes, it is. It’s a picture (an image) of a dress that is shown (as an image) in a studio, that is displayed on my TV (an image).

EDIT: Oh, wait. You were talking about the viral marketing comment, right? Might not be. But for something totally irrelevant it’s getting a lot of buzz.


Sitting in the dark while reading a tablet (lots of white background), she leaned over with her phone and asked what color it was. I immediately said “white and gold”. She called me crazy and said it was “black and blue”.

I then sat up to pay more attention, and after scrolling down to read the content, I scrolled back up to a…black and blue dress. “Ha!” I thought.

Then I sat up to pay more attention than before, and started seeing both, depending on where I focused. The top (leftward) half of the dress is more white/gold, while the bottom brings out a blue/black. I see both when looking at it, but then it shifts to one dominantly, if I look away at something else before returning to it.

That photo is playing all sorts of games with my mental-mind.