I hate digital video, it looks horrible compared to film

It’s something I had been subtly noticing for a long time: how some newer films just look off somehow.

Then I saw Inland Empire. God, the digital video look makes it almost unwatchable; it really is hideous. Compare it to Mulholland Drive.

I hate to sound like one of those “they changed it and now it sucks” people, but man, film is clearly visually superior. I’m sure there are post-processing tricks you can do to DV to make it more closely resemble film.

There are shitty-looking chemical films and brilliant-looking digital ones. Shitty-looking digital is mostly due to a cinematographer who was denied the 35mm equipment he was planning to use because of budget, did a terrible job, and blamed the equipment. I’ve seen too many amazing-looking digitally shot features to believe that there is anything inherently superior about chemical photography. And, apart from doing an equally careful job with lighting, filters and color grading, all attempts to post-process digital to “make it look like film” are evil and should be outlawed.

I popped Inland Empire in recently and it does look rough. Maybe it’s the resolution.

The problem is history. In the past, cheap TV was made on video and expensive movies were made on film. Now that movies are being made on (extremely sophisticated, extensively developed and improved, high-resolution) video, it just reminds you of cheapness, rather than letting you appreciate it for what it is.

You will have to get used to it. But, perhaps more importantly, filmmakers have to get used to it, and should adjust their shooting style accordingly. If they do that, then soon enough it will be almost indistinguishable, and everyone will accept it as the new default without complaint.

There are many stages at which digital video can get degraded, from the initial capture device (i.e., the camera) all the way to the delivery system (DVD, Blu-ray, satellite, etc.).

And… most of them apply to “film” too, unless you happen to be projecting an original print in your house.

What, specifically, is wrong with it? Resolution? Focus? Motion? Color? Lighting? Give us a hint.

I also prefer film, but Inland Empire is not a good basis for comparison. It was shot on a “prosumer” digital camcorder with relatively minimal production lighting.

Lars Von Trier’s Antichrist was shot on digital and has a similar aesthetic to Mulholland Dr.

As people have said, it depends on the movie. I usually prefer film, but there are some amazing-looking films shot on digital. Michael Mann, David Fincher, and Steven Soderbergh usually do right by digital. Mann’s Collateral, in particular, is stunning.

On the other hand, I saw Gangster Squad the other weekend, and that was some ugly-as-shit digital. Overlit, artificial, and just plain cheap-looking.

Yeah. Fincher and Mann are 2 examples of directors who use digital video correctly. Lynch has already stated he isn’t using film anymore, but I suspect he will use higher quality cameras in the future.

Well, what I mean is that digital video doesn’t ‘degrade’ the simple way multiple generations of film prints (or analog video tapes) do: loss of sharpness, loss of color saturation, etc. It can ‘degrade’ in decidedly different and weird-looking ways: certain solid backgrounds (especially blacks) can suffer banding, fades can pixelate, movement can become ghostly and unnatural-looking, large quantities of small moving objects can pixelate *really* badly, etc. In other words, digital artifacts, while analogous to simple generation loss, are a whole different animal.
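
If it helps to picture the banding part, here’s a minimal sketch (my own toy example, nothing from a real codec) of why smooth dark backgrounds fall apart first: quantize a near-black gradient too coarsely (which is roughly what starving it of bits does) and the smooth ramp collapses into a handful of visible steps.

```python
# Toy illustration of banding: a smooth near-black gradient, quantized
# with plenty of levels vs. very few levels.
import numpy as np

# A smooth dark gradient, values in [0.0, 0.1] -- the near-black
# backgrounds where banding is most noticeable on screen.
gradient = np.linspace(0.0, 0.1, 1920)

def quantize(signal, levels):
    """Round a [0, 1] signal to a fixed number of evenly spaced levels."""
    return np.round(signal * (levels - 1)) / (levels - 1)

fine = quantize(gradient, 256)   # 8-bit: plenty of levels, ramp still looks smooth
coarse = quantize(gradient, 16)  # starved of levels, as aggressive compression effectively does

print("distinct values at 256 levels:", len(np.unique(fine)))    # dozens of tiny steps
print("distinct values at 16 levels: ", len(np.unique(coarse)))  # just a few big, visible bands
```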

It’s all just technique with a new medium.

The first things I noticed looking awful in digitally shot productions were the lighting and the makeup; they were terrible. The materials and techniques that were good for film didn’t work the same with digital. People in the industry (like me) are working hard to adjust their equipment, their skill sets and their techniques for the new medium. Eventually we’ll get it all figured out, but there are already vast improvements over the way things were done just a few years ago.

What you seem to have observed is not a digital copy (which is perfect), but a digital resampling or other processing. If 100 digital copies are made sequentially and each is verified after, the 100th will be exactly the same as the first; no color change, no banding, no nothing.

Resampling, however, will almost always cause some loss; different kinds and amounts of loss depending on the method used.

To sum up: Film cannot be reproduced without some loss per generation, accumulating for serial dupes. Digital can be perfect, with no generational loss at all.
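
To make that concrete, here’s a quick sketch (my own, using a synthetic test frame rather than any real footage): copying the raw bytes a hundred times leaves them bit-identical, while pushing the same frame through a hundred generations of lossy re-encoding (JPEG here, standing in for any lossy resampling) drifts measurably away from the original.

```python
# Contrast a verbatim digital copy (bit-perfect) with repeated lossy
# re-encoding (accumulates error relative to the original).
import hashlib
import io

import numpy as np
from PIL import Image

# A synthetic "frame": smooth color ramps, the kind of content lossy codecs struggle with.
frame = np.zeros((256, 256, 3), dtype=np.uint8)
frame[..., 0] = np.linspace(0, 255, 256, dtype=np.uint8)           # red ramp, left to right
frame[..., 1] = np.linspace(0, 255, 256, dtype=np.uint8)[:, None]  # green ramp, top to bottom
original_bytes = frame.tobytes()

# 1. Digital copy: 100 sequential byte-for-byte copies, each verifiable against the last.
copy = original_bytes
for _ in range(100):
    copy = bytes(bytearray(copy))  # force a fresh copy of the data
print("100th copy identical:",
      hashlib.sha256(copy).hexdigest() == hashlib.sha256(original_bytes).hexdigest())

# 2. Lossy re-encode: save as JPEG, decode, and repeat for 100 generations.
image = Image.fromarray(frame)
for _ in range(100):
    buffer = io.BytesIO()
    image.save(buffer, format="JPEG", quality=75)
    buffer.seek(0)
    image = Image.open(buffer).convert("RGB")

drift = np.abs(np.asarray(image).astype(int) - frame.astype(int)).mean()
print(f"mean per-pixel drift after 100 lossy generations: {drift:.2f}")
```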

No, film is only “superior” if you want the “film” look, just like vinyl LPs are only “superior” if you want the “vinyl” sound.

Personally, I care far too much about what I’m watching to get all bent out of shape over the medium on which it was recorded.

As others have said, it’s not the medium it’s the way it’s handled. I’ll echo that Collateral is one of the most beautifully shot films I’ve seen.

And what I meant is that unless you are actually bringing the film to your house, it’s probably being digitized and subjected to all the same sorts of treatment that something recorded with a digital camera is.

Cable and Blu-ray are digital technologies. Pretty much everything except the camera is a digital transmission these days anyway, so all those opportunities for your “digital video” to get “degraded” apply equally to everything you view at home.

I generally prefer the look of film, but that may be a nostalgia thing as most of my favorite movies are from decades past. I think Martin Scorsese’s “Hugo” looked quite beautiful for a digitally shot movie.

I’m not sure what you mean. TV and film are all shot directly onto some kind of digital medium (tape, hard disk, memory, etc.) with digital HD video cameras, not film cameras. And pretty much every step after that involves some kind of compression and/or decompression, with differing results. DirecTV uses much higher compression than FiOS, for example, with noticeably more digital artifacts in its picture.

About the only thing film stock is still used for is mass distribution to movie theaters that haven’t yet switched to digital projection.

We’re so used to film that without it a lot of things just seem “wrong”. It’s Vinyl vs Digital: Visual Edition. We take all sorts of flaws of film (such as grain) as “normal.” I’ve heard people who work in digital effects joke that back in the day they spent countless amounts of money and research trying to breed out film grain, and nowadays they spend countless amounts of money and research trying to create digital effects filters that look like film grain.
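
Just to underline the joke, a toy version of such a grain filter (my own caricature; real grain plugins model grain size, correlation and exposure response) is literally just noise added back onto a clean frame:

```python
# Caricature of a "film look" filter: add zero-mean noise to a clean digital frame.
import numpy as np

def add_fake_grain(frame: np.ndarray, strength: float = 8.0, seed: int = 0) -> np.ndarray:
    """Add Gaussian noise to an 8-bit frame as stand-in 'grain'.

    Real grain isn't Gaussian and varies with exposure, so this is
    strictly a sketch of the idea, not a production filter.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(loc=0.0, scale=strength, size=frame.shape)
    return np.clip(frame.astype(float) + noise, 0, 255).astype(np.uint8)

# A perfectly flat mid-gray frame: the sterile, "too clean" digital look.
clean = np.full((1080, 1920), 128, dtype=np.uint8)
grainy = add_fake_grain(clean)

print("clean frame std dev: ", clean.std())              # 0.0 -- no texture at all
print("grainy frame std dev:", round(grainy.std(), 2))   # ~strength -- instant "film" texture
```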

This isn’t to say it’s “wrong” to like film, or even that the little unrealistic touches can’t make a movie feel more charming; just that, in a lot of cases, the preference rests more on the audience’s expectations of what a film looks like than on the merits of the technology itself.

Right. And this thread isn’t about comparing a movie as viewed in the theatre to a movie as viewed at home, as far as I can understand it.

This thread is about comparing movies shot on film to movies shot digitally.

So all the digitization degradation stuff applies equally to both sides and is therefore irrelevant to the comparison, contrary to what you seemed to be asserting in your initial post.