That was a fascinating video. Thanks to hammos1 for the link and Sage Rat for asking the question that brought it up.
Regarding the question itself, it’s pretty much what Septimus said. Here’s part of the abstract from a paper about the femto-photography system:
Because cameras with this shutter speed do not exist, we re-purpose modern imaging hardware to record an ensemble average of repeatable events that are synchronized to a streak sensor, in which the time of arrival of light from the scene is coded in one of the sensor’s spatial dimensions. We introduce reconstruction methods that allow us to visualize the propagation of femtosecond light pulses through macroscopic scenes; at such fast resolution, we must consider the notion of time-unwarping between the camera’s and the world’s space-time coordinate systems to take into account effects associated with the finite speed of light.
To the best of my understanding, the issue is simply that the finite speed of light along different path lengths, combined with the super-fast “exposure” times of the streak camera, can produce the following situation: light arrives at a point p1 and is reflected to a point p2, but the paths are so arranged that the light from p2 reaches the camera before the light from p1. As the paper technically puts it, “although objects closer to the source receive light earlier, they can still lie on a higher-valued (later-time) isochrone than farther ones”.
This may create the appearance of effect-before-cause, but is really nothing more than an artifact of the extraordinarily short effective exposure times during which the light for each image frame is collected.
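To make the geometry concrete, here’s a toy calculation (my own made-up numbers, not from the paper): a point p1 close to the laser gets lit first, yet the total path laser → p1 → camera is longer than laser → p2 → camera, so p2’s light reaches the camera earlier and lands on an earlier isochrone in the raw streak data.

```python
import math

C = 299_792_458  # speed of light, m/s

# Made-up 2-D scene geometry (meters), chosen to show the reversal.
laser  = (0.0, 0.0)
camera = (4.0, 0.0)
p1 = (1.0, 0.5)   # close to the laser
p2 = (3.8, 0.2)   # farther from the laser, but close to the camera

def arrival_ns(point):
    # Time in nanoseconds for light to travel laser -> point -> camera.
    path = math.dist(laser, point) + math.dist(point, camera)
    return path / C * 1e9

# p1 is illuminated before p2 (it is closer to the source)...
assert math.dist(laser, p1) < math.dist(laser, p2)
# ...yet p2's reflected light arrives at the camera first.
assert arrival_ns(p2) < arrival_ns(p1)

print(f"p1 arrives at {arrival_ns(p1):.2f} ns, p2 at {arrival_ns(p2):.2f} ns")
```

Nothing actually travels backward in time, of course; it’s purely a matter of which total path is shorter, which is exactly the “time-unwarping” the paper’s reconstruction has to correct for.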
And in case it wasn’t already clear, the “movie” was made by processing and computationally assembling millions of images created by repeatedly pulsing the laser. The streak camera itself captures only a one-dimensional image along a single horizontal scan line, so the many exposures required for each scan line have to be repeated for hundreds of scan lines, the whole process repeated for each frame, and then gigabytes of data computationally assembled.
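That capture loop can be sketched in toy form. Everything here is a hypothetical stand-in — `capture_row` just fakes an ensemble average with random numbers, and the sizes are tiny — but it shows the nesting: many pulses per scan line, many scan lines per frame, many frames per movie.

```python
import random

def capture_row(y, t, pulses=100, width=8):
    # Stand-in for the streak camera: one 1-D scan line (intensity vs. x)
    # for scan line y at time slice t, averaged over many repeated laser
    # pulses. The real system averages repeatable events; we fake it.
    shots = [[random.random() for _ in range(width)] for _ in range(pulses)]
    return [sum(col) / pulses for col in zip(*shots)]

def assemble_movie(num_frames=3, num_scan_lines=4):
    # Each frame is built one scan line at a time, and each scan line
    # itself requires a full burst of repeated pulses.
    movie = []
    for t in range(num_frames):
        frame = [capture_row(y, t) for y in range(num_scan_lines)]
        movie.append(frame)
    return movie

movie = assemble_movie()
print(len(movie), len(movie[0]), len(movie[0][0]))  # frames, scan lines, pixels
```

At real scale (hundreds of scan lines, hundreds of frames, millions of pulses) this is where the gigabytes of data come from.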