Video of backwards time

Some years ago, I saw a video about photography with exposure times so short that the universe supposedly cannot decide which direction time is traveling, creating some strange artifacts in the resulting image.

As I recall, the sample camera shots that the video discussed were of a lighted scene of water (though, I may be wrong).

I cannot find this video nor even anything to confirm this effect on Google. Can anyone confirm and, ideally, point me to this video?

Sounds like woo to me. The universe doesn’t ‘decide’ about anything; it just is, and time ticks forward however fast we film it.

No, it was a video from Smarter Every Day, the Royal Institution, TED, or something like that… It was a physics explanation, not random nonsense.

Obviously, I may be misremembering or may have misunderstood it; hence I'm seeking the video.

Since you mention ‘an illuminated scene of water’ and very short time intervals, I wonder if you’re referring to the work done by the MIT Media Lab to develop a trillion-frames-per-second imaging system? Their demonstration video showed a laser pulse travelling through a bottle of water. I don’t remember any discussion about the direction of time, however.

That got me to it.

This one, at about 9m in:

It would be a video of a situation where entropy is changing only very slowly. All of the cues we perceive to direction of time are based on entropy increasing, and so if it increases slowly enough that we can’t tell the difference, it’ll look equally valid forward or backward.

Time-reversed causality is an interesting concept :stuck_out_tongue: — but this ain’t it.

It is simply the distance from a reflected light pulse to the camera that “reverses” time in that example. Perhaps a diagram with some trig calculations would make this clear, but I’ll try an English explanation:
If pulses a and b are emitted, in that order, by some source, reflected off the floor, and then sent to the camera, pulse b may arrive at the camera before pulse a if its reflection point lies closer to the camera than pulse a's.
I wonder if this effect is slightly akin to “breaking the sound barrier”. :smack:
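A toy calculation makes Septimus's point concrete. The numbers below (distances, emission times) are made up for illustration; only the total path length from source to floor to camera matters:

```python
# Hypothetical geometry: a pulsed source, two reflection points on the
# floor, and a camera. Pulse b is emitted AFTER pulse a but bounces off
# a point much closer to the camera.
C = 0.3  # speed of light, metres per nanosecond

def arrival_time(emit_time_ns, source_to_point_m, point_to_camera_m):
    """Time at which the reflected pulse reaches the camera (ns)."""
    return emit_time_ns + (source_to_point_m + point_to_camera_m) / C

# Pulse a: emitted at t = 0, long round trip via a far corner of the scene
t_a = arrival_time(0.0, 3.0, 4.0)
# Pulse b: emitted 1 ns later, short round trip via a point near the camera
t_b = arrival_time(1.0, 1.0, 1.0)

print(t_a, t_b)
assert t_b < t_a  # b arrives first despite being emitted later
```

So at these exposure times the camera can record effect before cause purely because of path-length differences, with no trigonometry beyond adding up distances.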

That was a fascinating video. Thanks to hammos1 for the link and Sage Rat for asking the question that brought it up.

Regarding the question itself, it’s pretty much what Septimus said. Here’s part of the abstract from a paper about the femto-photography system:
Because cameras with this shutter speed do not exist, we re-purpose modern imaging hardware to record an ensemble average of repeatable events that are synchronized to a streak sensor, in which the time of arrival of light from the scene is coded in one of the sensor’s spatial dimensions. We introduce reconstruction methods that allow us to visualize the propagation of femtosecond light pulses through macroscopic scenes; at such fast resolution, we must consider the notion of time-unwarping between the camera’s and the world’s space-time coordinate systems to take into account effects associated with the finite speed of light.
https://dl.acm.org/citation.cfm?doid=2461912.2461928
To the best of my understanding, the issue here is simply that the finite speed of light along different path lengths, combined with the super-fast "exposure" times of the streak camera, can result in the following situation: light arrives at a point p1 and is reflected to a point p2, but the paths are so arranged that light from p2 arrives at the camera before the light from p1. As the paper technically puts it, "although objects closer to the source receive light earlier, they can still lie on a higher-valued (later-time) isochrone than farther ones".

This may create the appearance of effect-before-cause, but is really nothing more than an artifact of the extraordinarily short effective exposure times during which the light for each image frame is collected.
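The abstract's "time-unwarping" step can be sketched numerically. This is my own toy version of the idea, not the paper's actual reconstruction code: the camera records when each reflection *arrives*, and world time is recovered by subtracting the return flight from that point to the camera.

```python
# Toy "time-unwarping": convert camera arrival time back to world time.
# All distances and times below are invented for illustration.
C = 0.3  # speed of light, metres per nanosecond

def unwarp(camera_time_ns, point_to_camera_m):
    """World time at which light hit the point, given when the camera saw it."""
    return camera_time_ns - point_to_camera_m / C

# Two reflections as recorded by the camera:
# a near point (1 m away) seen at 10 ns, a far point (4 m away) seen at 11 ns
near_world = unwarp(10.0, 1.0)
far_world = unwarp(11.0, 4.0)

print(near_world, far_world)
# In world time the far point was lit FIRST, even though the camera saw
# it later - the apparent reversal is undone once the flight time is removed.
assert far_world < near_world
```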

And in case it wasn't already clear, the "movie" was made by processing and computationally assembling millions of images created by repeatedly pulsing the laser. The streak camera itself only creates a one-dimensional image along a horizontal scan line, so the many exposures required for each scan have to be repeated for hundreds of scan lines, then the whole process repeated for each frame, and then gigabytes of data computationally assembled.

Yes - the rig is set up to capture light travelling from left to right through the field of view, but other parts of the scene are at different distances from the camera - and that matters a lot in this context.

I believe there's an experimental video game somewhere which simulates the effects of light being slowed right down to walking pace and below - it creates some very counter-intuitive effects, for similar reasons.

The game that **Mangetout** refers to is 'A Slower Speed of Light' (coincidentally, another product of MIT) - more info here.

I thought of a way to visualise this problem (or at least something like it).

Imagine the speed of light is 10 feet per second. Two observers are standing 20 feet apart, facing each other. Between them, a plank of wood, 10 feet long, is suspended level with their heads, with each end of the plank pointing at one of the observers.

The plank is dropped and falls whilst remaining parallel to the ground - it falls flat. The two observers will disagree with this: each will perceive the end closest to them hitting the ground first, because the light from the far end of the plank (15 feet away) takes a second longer to reach them than the light from the near end (5 feet away).
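The arithmetic behind the thought experiment, using exactly the numbers above (light at 10 ft/s, observers 20 ft apart, a 10 ft plank centred between them):

```python
# Plank thought experiment: each observer is 5 ft from the near end of
# the plank and 15 ft from the far end.
C = 10.0  # toy speed of light, feet per second

def seen_at(event_time_s, distance_ft):
    """When an observer sees an event at the given distance."""
    return event_time_s + distance_ft / C

# Both ends of the plank hit the ground simultaneously at t = 0.
near_end = seen_at(0.0, 5.0)   # observer sees this 0.5 s after impact
far_end = seen_at(0.0, 15.0)   # observer sees this 1.5 s after impact

# Each observer watches the near end land a full second before the far
# end, even though the landing was simultaneous.
assert far_end - near_end == 1.0
```

By symmetry the other observer sees the mirror image, so the two accounts of which end landed first contradict each other - same mechanism, slowed down, as the femto-photography artifact.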