Does hi def make special effects more difficult to create?

Hello Everyone,

We’re at a family member’s house and they have a new flat screen that has a much better picture than we have at home. The HD shows are incredible, but I’ve noticed that it’s easier to tell when they use a blue screen or computer graphics. Is it me, or does hi def present serious challenges to special effects designers?

EVERY advance in audio and video technology has made it more difficult to make non-real things seem to be real.

If you are interested in the progression of FX, there are lots of shows that talk directly about it.

One great example is Star Wars, because that series of films has had its special effects completely redone several times as technology improved.

It makes a difference for practical effects and set design too. When the revived Doctor Who switched to filming in HD, the trusty old Tardis prop looked like, well, what it really is - some painted wood - so they had to make a new one finished to a much higher standard.

I think you have to consider that along with the challenges that HD may have brought, other concurrent technological advances made things a lot easier in other ways. There are free apps you can get for your phone that do some pretty advanced computer animation on the fly!

Definitely.

I’m a special effects artist in the games industry. Working on effects for a platform like, say, the PSP is *so* much easier than doing similar effects for the PS4. The increase in visual fidelity means that PS4 textures have to be much larger and more detailed, for example. Lower-resolution systems are a lot more forgiving than high-res games.

Hi-res requires more processing speed, and more storage, depending on the imaging technology. With images produced for HD, larger and sharper screens make artifacts more apparent. Different kinds of technology are used for games and for video production. The software for video production is mostly scalable to any resolution now; video games, which have to generate images in real time, need more storage and faster processing, so the challenge there falls on the software and hardware designers more than on the graphic designers.
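To illustrate the storage side, here’s a back-of-the-envelope Python sketch of uncompressed texture memory. The texture sizes are illustrative assumptions, not figures from any actual game:

```python
# Rough sketch: uncompressed RGBA texture memory at various resolutions.
# Sizes are hypothetical examples, not taken from any real title.

BYTES_PER_PIXEL = 4  # 8-bit RGBA

def texture_mb(width, height, bytes_per_pixel=BYTES_PER_PIXEL):
    """Uncompressed size of one texture, in megabytes."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for side in (256, 512, 1024, 2048, 4096):
    print(f"{side}x{side}: {texture_mb(side, side):7.2f} MB")

# Each doubling of the texture's side length quadruples its memory
# footprint, which is why higher-fidelity platforms need so much more
# storage and bandwidth.
```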

Just for rendering CG animation and visual effects, going from standard definition (720x480) to HD (1920x1080) raised render times at least sixfold, since HD has about six times the pixels.

Obviously, going from 2K resolution to 4K quadrupled render times again, because pixel count grows with the square of the linear resolution.

So, all else being equal, oh yeah, higher definition always means more processing time or power.
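To put rough numbers on that, here’s a quick Python sketch. It assumes render time scales roughly linearly with pixel count, which is only a first-order approximation:

```python
# Pixel-count ratios between common formats. Render time for CG scales
# roughly with pixel count, so these ratios approximate the render-time hit.

formats = {
    "SD (NTSC)": (720, 480),
    "HD 1080":   (1920, 1080),
    "2K DCI":    (2048, 1080),
    "4K DCI":    (4096, 2160),
}

base = 720 * 480  # SD pixel count as the baseline
for name, (w, h) in formats.items():
    print(f"{name:10s} {w * h:>9,} px  ({w * h / base:4.1f}x SD)")

# SD -> HD is about a 6x jump in pixels; 2K -> 4K is another 4x,
# since doubling the linear resolution quadruples the pixel count.
```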

I remember when TV stations started going digital and hi-def, and anchorpeople everywhere had to learn new ways to apply their makeup, as the improved resolution revealed every flaw.

Star Trek: Deep Space Nine made a time-travel episode in which the crew interacted with the original Star Trek crew. They reused scenes from the Tribble episode with the new characters inserted. There was a “Making of…” special in which they described some of the difficulties, such as coffee stains on the shirts now being visible. And I’m pretty sure they weren’t even using HD, just better technology on the old systems.

TV special effects are much harder than radio. Even color vs black and white. They used some pretty crazy makeup in the black and white days. And did you know the set of The Munsters was mostly pink? 🙂

Then double that (or more) for high frame rate video (48fps, up to apparently 120fps).

For visual effects it’s a double whammy. By doubling the linear resolution of the output, not only are render times for a particular frame quadrupled (as CMYK explained), but 3D models and texture maps have to be higher resolution too, which increases not only rendering time but also the production time needed to create the finer detail. Particle effects, such as smoke and water, also have to be simulated at higher resolution, which increases simulation times; compositing has to be done with higher-resolution images too, which slows down the process, and so on and so forth.
Not all of those things translate to a doubling of the time needed to make them (and stuff like particle/fluid simulation can grow not by a factor of two, but by a factor of three), but it all adds up.
At 4K and 48fps, a lot of visual artists are quite ready to cry “uncle!” On top of that, creating stereoscopic 3D content doubles rendering times for obvious reasons: two images per frame.
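Putting those multipliers together, here’s a rough Python sketch. It assumes render cost scales linearly with pixel count, frame rate, and the number of eyes rendered; real pipelines (especially simulations) can scale worse than that:

```python
# Rough multiplier for total render cost relative to a 1080p/24fps/mono
# baseline. Assumes cost scales linearly with pixels, frames per second,
# and eyes rendered; a simplification, not a production cost model.

def cost_multiplier(width, height, fps, stereo=False,
                    base=(1920, 1080, 24)):
    bw, bh, bfps = base
    pixels = (width * height) / (bw * bh)
    frames = fps / bfps
    eyes = 2 if stereo else 1
    return pixels * frames * eyes

print(cost_multiplier(1920, 1080, 24))               # 1.0  (baseline)
print(cost_multiplier(3840, 2160, 48))               # 8.0  (4K UHD at 48fps)
print(cost_multiplier(3840, 2160, 48, stereo=True))  # 16.0 (add stereo 3D)
```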

I remember reading an article about the BBC soap EastEnders, which said that a lot of the sets had to be rebuilt, or at least heavily touched up, when filming switched over to HD.

The settings of the TV can make a difference in how the effects look as well. They can search for their TV model plus ‘settings’ (as well as their model and ‘soap opera effect’). Generally, sports look better with motion smoothing ‘on’ and movies with it ‘off’.

There are some stories that HD made or killed careers in the porn industry. No idea if they’re true. I know it’s not exactly what the OP is asking, but let’s face it, breast enhancement is a form of special effect in porn, and seeing a bunch of scars can be … off-putting.

Many people still don’t see Groucho Marx’s eyebrows and mustache for the stage makeup that they were. (It was the style at the time.)

Maybe, but I recall someone mentioning (maybe it was here, maybe somewhere else) that there are certain shot angles that simply aren’t used in porn anymore because they would show the actresses’ implant scars. Since it would seem that nearly all porn actresses have had breast augmentation of some sort, the cinematography is what had to adapt.