I think that’s somewhat true in some cases, but if sharpness and detail were detrimental then hi-def and 4K wouldn’t be as popular as they are (frankly I think 4K is overkill for all but the very biggest TVs and projection systems, but that’s a topic for another time). I used to think that videotaping was responsible for some of the “soap opera effect”, because the absence of intermediate film and film-to-video processing made the image noticeably sharper than normal, but clearly the real problem was crap production values, since far sharper 1080p movies don’t have that problem.
The author of the linked article basically agrees. Their problem seems to be with movies that have the high-rez everywhere rather than selectively, so that a viewer doesn’t have a visual cue about where to focus and instead has to process the whole screen.
Anyway, it’s nice to have the view of someone in the profession, although it is a decade out of date now.
Some TV shows are shot in 60 fps. For example, Supernatural was shot in digital UHD 4K at 60 fps. I’m pretty sure a whole whack of shows are now being filmed in 4K/60 or even 4K/120, which the ARRI Alexa (the camera Supernatural used) can do.
We’ve been rewatching it in 4K UHD, and it’s very strange. Even after watching 20 episodes or so, I can’t get used to it. It feels fake.
Why would they shoot soap operas in 60 fps? Is that even true? I can’t see any advantage to shooting 60 fps just to then display it in 30 (29.97) fps unless you wanted to slo-mo the entire thing.
Videotaping was meant for broadcast on 60 Hz devices, i.e. TVs (I’m fuzzy on exactly how 60 fps is more compatible with 60 Hz, but I’m told it is). Tape is cheaper than film and can be reused, so there wasn’t the cost or storage constraint on videotape that there was for film.
My understanding is that this is about old analog video. Soap operas were recorded on video, not film. And since TVs ran at 60 Hz (interlaced), so did video. The higher-quality shows (and movies) would use film, which ran at 24 fps.
I don’t know what soap operas in the HD era have done. But those are also watched much less often.
What I find interesting is that no one says that high frame rate movies look like YouTube videos. So many of those shoot and display at 60fps.
Personally, I find I can’t tell the difference with YouTube videos. I watch videos that run at 60fps, 30fps, and even 24fps. I only know this because I check the stats.
I think it may not be the frame rate exactly, but that 24 fps movie cameras are set up to mimic the look of film. You need the exposure times to look the same.
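For what it’s worth, here’s a rough back-of-the-envelope illustration of that exposure-time point, assuming the common 180-degree shutter rule of thumb (exposure is about half the frame interval); the numbers are the point, not the code:

```python
# Rough illustration of the 180-degree shutter rule of thumb:
# exposure time is roughly half the frame interval, so higher frame
# rates mean shorter exposures and less motion blur per frame.
for fps in (24, 30, 60):
    frame_interval = 1.0 / fps        # seconds per frame
    exposure = frame_interval / 2.0   # 180-degree shutter
    print(f"{fps} fps: ~1/{round(1 / exposure)} s exposure per frame")
# 24 fps -> ~1/48 s, 30 fps -> ~1/60 s, 60 fps -> ~1/120 s
```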
Actually, a lot of the younger generations since the PS2 and Pentium PCs should be used to 60-120 fps, since that’s what 90 percent of games are made at, and unless you’re an “indie darling” trying to be “artistic”, woe to the game that’s under 60 fps… they get eviscerated…
Even “chill” games like Harvest Moon/Story of Seasons are being redone at 60 fps at the very least.
If you ever want to see how a format change can make a difference in how something is filmed, watch some of the TV westerns that went from B&W to color. It showed how badly and cheaply the sets and costumes were made, although the actresses were happy, since it forced the shows to actually buy new costumes/wardrobes and redecorate the sets.
But TV is broadcast in 29.97 fps. So why would it be shot at 60 fps?
I don’t think the refresh rate has much to do with it. I’m not sure I believe soap operas are 60 fps. Does anyone have a cite showing that this actually happens?
Normally video is shot at high frame rates to allow for smooth slow-motion. Since soaps are not in slo-mo, I still can’t see why they would be shot at 60 fps.
I might be wrong about video being shot at 60 fps, but 30 fps is still visibly different from film’s 24 fps. A higher frame rate was just a consequence of the new medium (and some techno-interpolation with 60 Hz).
NTSC Video is 30i, or 60 fields per second, interlaced. Well, 29.97 x 2 (59.94) anyway. This is because each field is only half a frame, consisting of even or odd scan lines.
If you’ve ever converted NTSC interlaced to progressive scan, you can see the motion artifacts in fast motion because the two fields are not identical.
SD NTSC (USA)** video was shot at 59.94 fields per second. A field is 1/2 an interlaced frame, every other line on the screen, so you’d get the motion of a 60th(ish) of a second with the content of a 30th(ish) of a second. It’s hard to get your head around, but worth thinking about if you want to understand how you perceive(d) video of that type.
Basically every other line on the screen is offset in time by 1/2 a frame time (1/59.94) from the other lines.
29.97 video lacked the per-frame detail of true 60 fps, but it carried the motion of 59.94 fields per second.
**see PAL for Europe, 25fps @ 50 fields/sec. Also SECAM for France and its possessions, not all that different from PAL.
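If it helps, the exact numbers work out like this (just arithmetic, using the standard NTSC rates):

```python
# NTSC "29.97" and "59.94" are shorthand for these exact ratios.
FRAME_RATE = 30000 / 1001        # ~29.97 frames per second
FIELD_RATE = 60000 / 1001        # ~59.94 fields per second (2 fields per frame)

frame_interval = 1 / FRAME_RATE  # ~33.4 ms between full frames
field_interval = 1 / FIELD_RATE  # ~16.7 ms between fields

# Each field carries only the odd or the even scan lines, but the two fields
# of a "frame" are captured half a frame apart, so motion is sampled ~59.94
# times a second even though full-detail frames only arrive ~29.97 times a second.
print(f"frame rate: {FRAME_RATE:.3f} fps, field rate: {FIELD_RATE:.3f} fields/s")
print(f"offset between the two fields of one frame: {field_interval * 1000:.2f} ms")
```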
I’ve seen this, but although it may be more obvious when viewing on high-resolution progressive-scan devices, the same motion artifacts will naturally be present in the original interlaced version if the moving object is in different places in the two interlaced half-frames. The de-interlaced progressive video looks perfectly normal until there is fast motion, whereupon the image of the moving object breaks up into alternating scan lines. For some reason seeing this on de-interlaced progressive material is particularly annoying. There undoubtedly must be algorithms that make this less objectionable simply by blurring the images instead.
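For anyone curious, here’s a toy sketch of why that happens. “Weave” deinterlacing just interleaves the two fields, so anything that moved between them combs into alternating lines, while a bob/blend approach trades the combing for blur. This is only an illustration of the idea, not any particular product’s algorithm:

```python
import numpy as np

# Two fields of one interlaced frame: a bright vertical bar that moved two
# pixels to the right in the ~1/60 s between the field captures.
h, w = 8, 16
field_even = np.zeros((h // 2, w)); field_even[:, 5] = 1.0  # even lines, bar at x=5
field_odd  = np.zeros((h // 2, w)); field_odd[:, 7] = 1.0   # odd lines, bar at x=7

# "Weave": interleave the fields as-is -> combing on the moving bar.
weave = np.zeros((h, w))
weave[0::2] = field_even
weave[1::2] = field_odd

# "Bob/blend": line-double each field to a full frame, then average the two
# -> no combing, but the bar is smeared across both positions.
blend = (np.repeat(field_even, 2, axis=0) + np.repeat(field_odd, 2, axis=0)) / 2

print("weave (bar alternates between columns 5 and 7 on successive lines):")
print(weave.astype(int))
print("blend (motion averaged into a blur):")
print(blend)
```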
AIUI, in the early days of HDTV, channels and networks that broadcast a lot of sports chose the 720p progressive standard over 1080i precisely to avoid motion artifacts; 1080i was nominally higher resolution but wasn’t good for dynamic fast-moving imagery. 1080p was technically available but required so much compression to fit into the broadcast bandwidth that it wasn’t a useful solution. These conventions mostly remain for legacy reasons but new standards like ATSC 3.0 theoretically offer as much as 4K progressive @ 120 Hz with H.265 (HEVC) encoding, provided of course that everything involved end-to-end including the TV supports it.
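Some rough pixel-rate arithmetic shows why that tradeoff isn’t obvious from the numbers on the box (ignoring blanking intervals, chroma subsampling, and compression entirely):

```python
# Raw luma-sample throughput, ignoring blanking and compression.
p720 = 1280 * 720 * 60    # 720p60: full frames 60 times a second
i1080 = 1920 * 1080 * 30  # 1080i60: 60 fields/s, but full frames only 30 times a second

print(f"720p60 : {p720 / 1e6:.1f} Mpixels/s, 60 motion samples/s at full vertical resolution")
print(f"1080i60: {i1080 / 1e6:.1f} Mpixels/s, 60 motion samples/s at half vertical resolution each")
# 1080i pushes slightly more pixels (~62 vs ~55 Mpixels/s), but each motion
# sample is only 540 lines, which is why fast-moving sports favored 720p.
```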
Thank you but it’s not hard for me to get my head around. I’ve worked in TV and film production for over 25 years. It’s the only thing I really know at an experienced, professional level.
I’m quite familiar with progressive video, interlaced video, fields, frames, drop-frame, 3:2 pulldown, 24, 30, 60 (nominal) fps and all of that.
I’m not a broadcast engineer but otherwise I know my way around this stuff pretty well.
My questions here were not about how all that works because I’m already well-versed in it. I was responding to the assertions that soap operas are, and have for a long time, been shot at 60 fps.
I have never been involved in soap opera production, so I don’t have direct knowledge of production standards in that specific area. But the claim sounded odd to me based on all I do know about video production standards.
I was just asking for a cite or some evidence that soaps are shot in 60 or 59.94 fps because it makes no sense to me but who knows? Maybe they are for some weird reason and if so, I’d like to learn why.
Even after a lifetime career there’s always plenty more for me to learn.
SD soap operas and lower budget shows like “All in the Family” were shot on videotape. I explained above why SD video had 59.94 motion, but you’re a video professional so you already knew that.
Other television shows of that era were shot on film at 24 fps, then telecine’d to 29.97. But you’re a video professional so you already knew that.
I don’t know how HD soap operas are filmed, but you’re a video professional so go figure that out and let us know.
I don’t think the snark is necessary. I only brought up my experience because I couldn’t understand why you were explaining all this rudimentary stuff to me. I don’t think I ever gave any reason to think I needed it explained, and I had asked about something quite different from what you were going on about.
I guess it was something of a strawman (maybe accidental on your part) for you to tell me all about broadcast video having 59.94 fields per second, when that is not at all the same as having 59.94 frames per second.
I had asked clear questions of the people who asserted that soaps are shot at 60/59.94 frames per second. Why then did you start lecturing me about 59.94 fields per second, 2 fields per frame, and the basics of NTSC interlacing? What has that got to do with claims in this thread that soaps are shot at 59.94 frames per second? What did I post that led you to believe I needed help with that?
Frames and fields were of no concern at all; instead my question was about the claim that soaps are shot at 59.94/60* frames a second.
My best guess is that you thought that ‘fps’ stands for ‘fields per second’. That, at least, would begin to explain your confusion.
I admitted that maybe soaps ARE shot at that frame rate for some reason, but it doesn’t make any sense to me, so I’d like to see a cite. (Slo-mo is the only reason I can think of for a high frame rate, which costs more money and needs more lighting.)
Do you have anything to add about what I was actually asking about?
*I wrote 59.94/60 fps because I’m not sure if the soap-fps assertor conflated the two.
Comparing high frame rates to soap operas associates them with terrible writing, atrocious acting, brightly lit sets, and a 1980s videotape look.
But it doesn’t have to be that way. The frame rate really is independent of that. Eliminate the associative bias, and it’s not anywhere near as bad as everyone says - it’s different, and not what you’re used to, but that doesn’t automatically make it bad. I wish people could look at it more objectively, and give it a decent chance, but it seems we’ve been conditioned that it’s been decided already by influential movie snobs.
Ugh. The point was that a soap opera shot in SD on videotape has frames at 30000/1001 fps, but motion at 60000/1001 fps. It looks smoother than:
A TV show shot on film (say, Cheers) at 24fps is telecine’d from 24 to 29.97 for broadcast. The motion is jerkier, but it has a cool ‘film’ look.
So yes, SD soap operas are shot at a higher “frame” rate than other TV shows. The motion is essentially 60fps, even if the per frame detail is not. And it contrasts enormously with the telecine’d material that was very very broadly circulated in primetime television at that time.
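For reference, here’s a quick sketch of the 3:2 pulldown cadence that the telecine’d material gets; it’s just the standard field pattern, nothing specific to any one show:

```python
# 3:2 pulldown: 4 film frames (A, B, C, D) become 10 video fields = 5 video
# frames, turning 24 fps film into ~29.97 fps NTSC video. Each film frame is
# held for either 3 or 2 fields, and some video frames mix fields from two
# different film frames, which is where the slightly uneven "film" motion
# cadence comes from.
film_frames = ["A", "B", "C", "D"]
repeats = [3, 2, 3, 2]  # each film frame is held for 3 or 2 fields

fields = [f for frame, n in zip(film_frames, repeats) for f in [frame] * n]
video_frames = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]

print("fields      :", " ".join(fields))         # A A A B B C C C D D
print("video frames:", " ".join(video_frames))   # AA AB BC CC DD
```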
I have no idea what the point of this statement is. Old soap operas were shot at exactly the same frame rate as any other production that was videotaped. They looked (and sounded) the way they did because of cheap production values, and the sharp, hard look of videotape amplified that even further. So, as far as “60 fps” is concerned, whoever made that claim was conflating 60 interlaced half-frames per second with 60 fps. It was just standard video at 480i and 30 fps, just like any other.
Also, just FTR, not to start an argument …
Yes, but All in the Family was not done that way because it was “lower budget”; it was because Norman Lear explicitly wanted to create the immediacy of a live audience that shooting on videotape provided, and it also evoked the look of the British sitcom it was based on (Till Death Us Do Part). All of Lear’s later sitcoms of the era were also shot on videotape for the same reason.
It’s true that many TV shows of that era were shot on film but certainly not all, not even close. For instance, besides All in the Family, these sitcoms of that period were all videotaped:
Maude (1972-1978)
Good Times (early 1974-1979)
The Jeffersons (early 1975-1985)
One Day at a Time (late 1975-1984)
Diff’rent Strokes (1978-1986)
Sanford and Son (early 1972-77)
Welcome Back, Kotter (1975-79)
That’s My Mama! (1974-Dec. 1975)
Three’s a Crowd (1984-85)
WKRP in Cincinnati (1978-82)
Family Ties (1982-89, except for the S4 London episode, which was filmed)
What’s Happening!! (1976-79)
Silver Spoons (1982-87)
Who’s the Boss? (1984-92)
227 (1985-90)
Kate and Allie (March 1984-89)
Gimme a Break! (1981-87)
Amen (1986-91)
Webster (1983-89)
The Facts of Life (1979-1988)
The Cosby Show (1984-1992)
A Different World (1987-1993)
Married With Children (early 1987-1997)
The Steve Harvey Show (1996-2002)
Barney Miller (1975-82)
Alice (1976-85)
Soap (1977-81)
The Betty White Show (1977-78)
Carter Country (1977-79)
Bosom Buddies (1980-82)
Newhart (Season One 1982-83)
Nine to Five (1982-83, 1986-88; Season 1 was filmed)
The “soap opera effect” was being discussed upthread so I referenced soap operas as an example. Yes, any show shot on video tape had 30fps detail and 60 fps motion. And of course many shows were shot on tape: it was much, much cheaper than film, but film was still higher quality (better contrast ratio) and was also widely used.