How sensitive are our brains to time?

When I’m watching tv, and can also see the reflection of the same tv on a nearby window or mirror, I see the change in picture on the tv first. For example, I’m currently watching CNN on a tv, and I can also see its reflection on my laptop screen. When the background changes colors, I sense that the change occurred on the tv first, followed quickly by the same color change in the reflection. Does this mean that my brain is recognizing that the light from the tv is reaching me just a fraction of a second earlier than the light from the reflection (because of the speed of light)? I first noticed this in my old apartment, which had a window next to the tv. I would basically see the tv, and its reflection on the window, at the same time. But it looked like the reflection was just a bit behind. I acknowledge that I probably sound like a crazy person to most of you who just read all that.

I have to believe you’re imagining it. The difference in time would be vanishingly small over any sort of household distance.

This was certainly my first thought. But then I thought about the fact that our eyes have sensors that communicate a change in color to the brain. Isn’t it possible that the brain can detect which sensor was triggered first? After all, the brain has to work immediately on “translating” both pictures (the one from the tv as well as the one from the reflection). So while it may not be able to detect how much later the 2nd picture (from the reflection) arrives relative to the 1st (from the tv), it may still know that the 2nd came in after the 1st.

People vary a bit, but the fastest your brain can separate stuff is on the order of 10 to 50 milliseconds. Any closer together than that and our brain can’t separate them and treats them as simultaneous; sounds that start within that window of each other seem to start at the same time. TV pictures “move” at about 30 frames per second, which seems smooth to us because our brain can’t separate the images into individual frames. Video games don’t smooth the image at all like a video camera does, so video games with their harsh rendering can look a bit choppy up to maybe 100 frames per second or so (10 milliseconds per frame).
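The frame-time arithmetic above is easy to sketch; these are the post’s rough figures (30 fps TV, ~100 fps games), not precise perceptual thresholds:

```python
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

# 30 fps TV: ~33 ms per frame, comfortably inside the 10-50 ms window
print(f"TV at 30 fps:    {frame_time_ms(30):.1f} ms per frame")

# ~100 fps game: 10 ms per frame, right at the low end of the window
print(f"Game at 100 fps: {frame_time_ms(100):.1f} ms per frame")
```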

Light travels at about a foot per nanosecond. So if the light is bouncing about 20 feet or so (all the way across a 10-foot room, off the mirror, and back to your eye), that’s only about 20 nanoseconds. It’s roughly a thousand times too fast for your brain to recognize which one comes first.

Uh… make that a million. :smack:

Brain fart.
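The corrected arithmetic checks out. A quick sketch, using the posters’ rough numbers (1 ft/ns, a 20-foot bounce, and the 10 ms low end of the perceptual window):

```python
FEET_PER_NANOSECOND = 1.0          # light travels roughly 1 ft/ns
path_feet = 20                     # across a 10-ft room, off the mirror, back
delay_ns = path_feet / FEET_PER_NANOSECOND   # ~20 ns reflection delay

perceptual_window_ns = 10e6        # 10 ms, low end of the 10-50 ms range
ratio = perceptual_window_ns / delay_ns

print(f"Reflection delay: {delay_ns:.0f} ns")
print(f"Perceptual window is about {ratio:,.0f} times longer")
```

The ratio comes out around 500,000, i.e. closer to a million than a thousand.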

I’ve noticed a similar effect with my bike headlight. I usually set it to flashing, and a lot of signs have retroreflectors, which means that it looks like all of the signs are blinking. And it seems to me that they’re blinking just a touch later than the headlight itself is.

It’s clearly just an illusion, though. As to the cause, I suspect that different parts of the visual field in our eyes have different response times.

The central area of vision is good at detail and colour; the edges are good at detecting change and contrast (and each of these areas is weaker at the opposing skill).

This is why things that flicker more than about 30 or 40 Hz are more noticeably flickery when you’re not looking at them.

This appears to be the opposite of what the OP says, but I wonder if the effect is actually noticed when looking at the reflection in the window, and seeing the change on the TV in peripheral vision.

In any case, it’s very unlikely to be actual perception of a lightspeed delay. Our brains play all sorts of weird games in the representation of reality to themselves - what we see isn’t actually the world, it’s an elaborate representation of it, containing fictions, shortcuts and placeholders.

On the topic of how our brains perceive time, the answer is ‘not necessarily accurately’ - for example, if you glance at a clock, the first ‘tick’ of the second hand often seems considerably longer than subsequent ticks.

Have you tried watching the reflection instead of watching the tv? See if that alters your perception of which changes first.

The brain does a HUGE amount of processing in 50 milliseconds. Much smaller differences in excitation times are recognizable.

For example, sounds from your left arrive at your left ear less than a millisecond sooner than at your right ear. Yet direction can be determined to about 2°! Of course this uses very specialized brain circuitry, but still we’re speaking of phase differences of about 20 microseconds. (UIAM there are other brain functions where small phase differences play a role.)
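Those numbers hang together. A rough sketch using a simple spherical-head model, ITD ≈ (d / c) · sin(θ); the ~20 cm ear-to-ear distance is an assumed value, not something from the thread:

```python
import math

HEAD_WIDTH_M = 0.2        # assumed ear-to-ear distance, ~20 cm
SPEED_OF_SOUND = 343.0    # m/s in air

def itd_us(angle_deg):
    """Interaural time difference (microseconds) for a source this far off center."""
    return (HEAD_WIDTH_M / SPEED_OF_SOUND) * math.sin(math.radians(angle_deg)) * 1e6

# Source straight to the side: ~583 microseconds ("less than a millisecond")
print(f"90° off center: {itd_us(90):.0f} us")

# Source 2° off center: ~20 microseconds, matching the figure above
print(f" 2° off center: {itd_us(2):.0f} us")
```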

But that’s dependent on the speed of sound. I’ll agree that the speed of light is too fast. :slight_smile:

Thing is, you can’t see both the tv and the reflection at exactly the same time. Your eyes only take snapshots of the light, which is why the eyes always have to move (very slightly) to register a smooth image.

So you are only seeing either the tv or the reflection not both. It feels like both… but your brain also does a wonderful job of filling in the gaps around our field of vision. So when you consciously notice the tv, you think you can also see the reflection… but you can’t… either your eyes have shifted slightly to take in the reflection, or the brain fills in the darkness with the previous image the eyes looked at a split second earlier.

I don’t know if that makes any sense, but it does to me…

So you basically take a snapshot of one image, then quickly take a snapshot of the next… one before the other…

Plus there may be a slight confirmation bias going on. You expect the tv to change first, so that’s the narrative your brain is going with. You think you’re being unbiased, but your brain has other ideas.

Worse, you can’t even control your eye movements totally. You might be trying to, and can’t do it, because you are thinking about the other image; you might flick your eye back rather than forward, and then forward again…

I think Guitario has it.

I think that there is sensitivity to phase involved in localization of sounds, rather than simply time of arrival…

Apparently, I’m wrong here: you use a combination of cues, including interaural time difference, loudness, and frequency profile, which is adjusted by the shape of your external ear (the pinna or auricle, or simply the ear, in layman’s terms).