Not a thread about race. I was watching something on TV the other day, I don’t remember what, but it was recently made. It was clear from context that it was the sort of thing where an actor, and possibly the foreground, was filmed in front of a green screen and composited onto a separately filmed or generated background. Back in the day there were various technical reasons why green-screen compositing might not look right, but pretty much all of them have been solved, and for the most part it’s done convincingly nowadays. But…
In this example, although various other aspects of lighting and color may have been matched, the light levels were not. The background was both sharper, and its darkest black was darker, than the foreground. That’s a big, obvious mistake: because atmospheric scattering increases with distance, the eye expects things that are much farther away to be slightly blurrier or more visually noisy, and also to be lighter, or at least to have less contrast in luminosity. And this isn’t an isolated example. I see it happening a lot with composite images, and even with non-composite images. Everything goes through a computer now, and pretty much everything gets digitally color-corrected anyway.
So why aren’t editors properly adjusting luminosity levels? Instead of a black = black scheme, the darkest black in most scenes, even scenes that take place in the dark or at night, is only about 80-90% of true black.
Lest you be tempted to say, “well, TV pixels can’t actually produce a pure black,” I’ll point out that in presentations with a different aspect ratio, where there are black bars, the darkest black in the picture isn’t even as dark as those bars.
So what’s the deal? Why are TV editors afraid to adjust luminosity levels properly?
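If you want to sanity-check this yourself, here’s a quick sketch of how you could measure it (assuming Python with NumPy and Pillow installed, and a hypothetical screenshot saved as frame.png):

    # Quick check of the darkest pixel in a frame grab (assumes NumPy and
    # Pillow; "frame.png" is a hypothetical screenshot filename).
    import numpy as np
    from PIL import Image

    frame = np.asarray(Image.open("frame.png").convert("L"))  # 8-bit luma
    darkest = int(frame.min())
    print(f"Darkest value: {darkest}/255 ({1 - darkest / 255:.0%} of true black)")

On a night scene you’d expect the minimum to sit at or very near 0; a minimum up in the 25-50 range is the kind of lifted black I’m talking about.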
Various broadcasting authorities around the world have defined “broadcast safe” limits for things such as black levels, white levels, which portion of the scan lines you’re supposed to use for the actual picture, and so on. This was to account for the different capabilities and calibrations of consumer sets, so that viewers would see roughly the same thing on any set.
A lot of these limits are now obsolete in the age of digital video and pixel displays, but they linger on, and people who are accustomed to computer video, with its absolute blacks and whites, are often unfamiliar with them. For example, you’ll find skins for DVR software such as MythTV that use white values of 255/255/255, or fully saturated colours, which can look horrible on TVs, especially next to the usual black/white/saturation levels of TV broadcasts.
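To make the numbers concrete: in 8-bit digital video, the “broadcast safe” (studio swing) convention puts black at code 16 and white at 235, rather than the 0 and 255 a computer uses. A rough sketch of the mapping, purely illustrative and not any particular broadcaster’s spec:

    # Map between full-range (0-255) computer levels and the 16-235 "video"
    # range used for broadcast-safe 8-bit digital video. Illustrative sketch.
    def full_to_video(value: int) -> int:
        """Squeeze a full-range 0-255 value into the 16-235 range."""
        return round(16 + (value / 255) * (235 - 16))

    def video_to_full(value: int) -> int:
        """Stretch a 16-235 video value back out to 0-255, clipping stray input."""
        return max(0, min(255, round((value - 16) / (235 - 16) * 255)))

    print(full_to_video(255), full_to_video(0))   # 235 16
    print(video_to_full(235), video_to_full(16))  # 255 0

Which is why a skin built with 255/255/255 whites looks blown out next to broadcast material that tops out around 235.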
So I would guess that the mismatching black levels are due to different editors not conforming to the same standards.
One factor to take into account is that different media have different contrast ratios (‘contrast ratio’ here meaning the range from the darkest part of a scene to the lightest that a given medium can accommodate in a given frame). 35mm film has a greater contrast ratio than videotape.
Let’s suppose you have a movie scene involving a green screen or some other form of compositing, and care has been taken to make it look ‘right’, including matching the apparent black levels. When the movie gets transferred to video for showing on TV, the scene may no longer look as good because of the difference in contrast ratio.
Can this be corrected? Yes, to some extent. But often the problem is simply ignored, or the necessary work is not done (or done very badly) because it’s too difficult / time-consuming / expensive to be considered worthwhile.
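For what the correction actually amounts to, here’s a very rough sketch (the black points, white points, and gamma are made-up examples, not real film or video specs): you squeeze the source’s tonal range into the destination’s, sometimes with a gamma tweak so the midtones don’t go muddy.

    # Rough sketch of remapping a wide contrast ratio into a narrower one.
    # Assumes NumPy; all the numbers passed in are illustrative.
    import numpy as np

    def remap_contrast(frame, src_black, src_white, dst_black, dst_white, gamma=1.0):
        """frame: array of luma values in the source medium's range."""
        norm = np.clip((frame - src_black) / (src_white - src_black), 0.0, 1.0)
        norm = norm ** gamma                      # optional midtone adjustment
        return dst_black + norm * (dst_white - dst_black)

    # e.g. squeezing a 0.0-1.0 film scan into 8-bit video levels (16-235):
    # video_frame = remap_contrast(film_scan, 0.0, 1.0, 16.0, 235.0, gamma=0.9)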
Heck, I run into that on my CRT monitors. The one on this computer is horrible for video/photo editing because it has much better separation between white levels than between black levels. I can easily tell the difference between true white and the #f8f8f8 background of the Dope, for instance.
And, yes, my contrast setting on the monitor is at maximum, and this is the best I can get tweaking the card’s settings. And I can’t afford to replace the monitor–although I’d gladly accept a donation.
The worst, probably an artifact of “On Demand”, is three nested sets of aspect-ratio black bars, each lighter than the next. Which brings up another question: why do TV videos always come out lighter when they’re converted or incorporated into something else?
Another problem with compositing mismatches is that studio lighting can’t recreate natural lighting very reliably. Sometimes it just works, but most of the time there’s a subtle difference that manifests in imperfect colour matching.
Plus, a bad compositor is going to do a bad or rushed job occasionally. I know I struggle with it.
This, I think, is just a matter of the person doing the compositing not doing a very good job - if you had an example or screen cap we could probably be more specific…
Back in the standard-definition NTSC days, pure black was something to avoid in a TV signal. My experience is from the pre-HD era, so things may be different now:
The NTSC signal is divided up into 140 “IRE” units (heck if I remember what IRE stands for). The lower you go, the darker the image. The thing is, the signal uses 0 IRE (pretty damned black) as a reference point, and if the image in the visible part of the signal is also that black, it can cause problems with transmission or playback. So the standard is for the blackest signal in the visible picture to be set at 7.5 IRE, which is noticeably grayer than pure black.
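Here’s a rough sketch of how that 7.5 IRE setup translates into 8-bit digital code values, assuming the common convention of setup black landing at code 16 and 100 IRE white at code 235 (treat the exact numbers as illustrative):

    # NTSC "setup": analog black sits at 7.5 IRE rather than 0 IRE.
    # Mapping assumes setup black -> code 16 and 100 IRE white -> code 235.
    SETUP_IRE = 7.5
    WHITE_IRE = 100.0

    def ire_to_code(ire: float) -> int:
        """Linear, illustrative map from IRE units to an 8-bit luma code."""
        return round(16 + (ire - SETUP_IRE) / (WHITE_IRE - SETUP_IRE) * (235 - 16))

    print(ire_to_code(7.5))    # 16  -> legal black, visibly grayer than code 0
    print(ire_to_code(0.0))    # -2  -> blacker-than-black, out of spec
    print(ire_to_code(100.0))  # 235 -> peak white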
Giving a TV a signal outside of the specs can result in some serious picture issues - a really good example is if you’ve ever seen a tape copied from a copy-protected (Macrovision-encoded) tape. Macrovision (at least the early versions of it) takes advantage of this by encoding cycling black and white levels that exceed the NTSC spec in the parts of the signal you don’t see on your TV. This causes the recording machine’s automatic gain control to freak out, rendering the picture unwatchable.