...You need film for that

Alright, fair enough. But I do take issue with the words "it will always...". That's just not true. The technology is volatile enough that there's no way to know its characteristics in say ten years. Who knows what type of sensor they will have perfected by then.

An example would be my old professor, Terry. He grew up as a cinematographer and film user. In freshman year, I remember a rant he gave where he passionately explained to us students that "There's no way that digital will approach the quality of film" and "When a photographer wants good quality, he will always choose film over digital". Fast forward and just last year he bought his first DSLR, a 40D.

I don't mean this post to be a "Ha, so there!" kind of statement. My point is simply that absolute statements about the future capabilities of a technology are usually unwise.
 
At the very least it will never be the same. Unless, of course, sensors are someday somehow made by spreading somewhat irregularly sized pixels (and someone figures out how to read where they're placed), and displays use the same design, it will not and cannot be the same.
 
This is a minor point, but film smells good.

Not those Eastern European films. I got a bunch of rolls of Bergger 200 in 120 (actually made by Forte or someone), and when it was in the soup it stank to high heaven. And when it dried it curled like a spring. Stinky and curly; I hated that stuff.
 
How many silver halide crystals are there on a frame of 35mm film? The crystals would be the film equivalent of pixels, right (more or less)?
 
No matter what, it will always conform to a regular pattern of some sort, which film grain never will, and in my eyes that will always make film grain better.

I fully expect digital to be able to flawlessly mimic film grain once the source data achieves a high enough resolution and color depth.
 
I fully expect digital to be able to flawlessly mimic film grain once the source data achieves a high enough resolution and color depth.

If my understanding is correct (which might not be the case), digital would just need a certain element of randomness to achieve this, right?

If so - I think, given time, we'll see this in digital as well.
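
Just to make the randomness idea concrete, here's a rough Python/NumPy sketch I put together (the function and its parameters are my own invention, not anything from a real raw converter): draw pseudo-random noise on a coarse grid and blow it up into little clumps, so a flat frame picks up a different "grain" pattern on every call. Nowhere near a flawless film simulation, just the bare mechanism.

import numpy as np

def add_fake_grain(image, strength=0.05, cell_size=2, seed=None):
    # Overlay pseudo-random "grain" on a grayscale image with values in [0, 1].
    # strength  = noise standard deviation as a fraction of full scale
    # cell_size = coarseness of the grain clumps, in pixels
    # seed      = leave as None for a fresh pattern every call, like a new frame of film
    rng = np.random.default_rng(seed)
    h, w = image.shape
    # Draw noise on a coarser grid, then blow it up so the "grains" are little
    # clumps of pixels instead of single-pixel speckle.
    coarse = rng.normal(0.0, strength, (h // cell_size + 1, w // cell_size + 1))
    noise = np.kron(coarse, np.ones((cell_size, cell_size)))[:h, :w]
    return np.clip(image + noise, 0.0, 1.0)

# A flat gray frame picks up a different grain pattern on every call.
frame = np.full((480, 640), 0.5)
grainy_1 = add_fake_grain(frame)
grainy_2 = add_fake_grain(frame)   # different pattern from grainy_1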
 
I can think of exactly one way we could electronically produce something like film grain: we would have to use analog circuits.
 
How many silver halide crystals are there on a frame of 35mm film? The crystals would be the film equivalent of pixels, right (more or less)?

It depends on what you mean by equivalent. If you were going to add them up and see whether the biggest number equals better image quality, then no, pixels and grains aren't equal. I don't know how many silver halide crystals are on a frame of 35mm film, but it's got to be well over 15,000,000, and possibly over 100,000,000. With E6 and C41 the grains would actually be dye clouds overlapping in 3 separate layers. Bayer-pattern sensor pixels are different from Foveon sensor pixels. I'd say it's too mixed up to come to any generalization, but you can always compare large prints from 35mm Velvia or Tmax 100 with prints from the full-frame Canon 5D (a mere 12,800,000 pixels) with your own eyes. My walls don't have any 35mm prints, but there are a few shots from my Hasselblad hanging right next to similar-sized prints from my 5D. I know what my eyes see. I sold the Hasselblad.

Another apples and oranges sort of thing is that with film we like fine little grains (Velvia, Tech Pan, etc...) for pretty photos, and with digital we like big fat pixels (5D, D3, etc...) for pretty photos.
 
One thing digital absolutely can't do is be analog. I don't know what started the whole "everything must be done digitally" push (not talking about cameras alone here), but digital is inherently lossy: you lose all the data in between 0 and 1. It isn't noticeable to the eye per se, but when editing, that loss can become very noticeable. I haven't done much with color film, and I assume it has the same problem to a degree (because there aren't infinitely many grains on a piece of film).

That's an excellent point, and it rings true for many things digital. "Analogue" is how we humans see, hear, feel, touch and smell. Digital exists for convenience, not quality. However, if the fidelity of a digital medium is high enough, the loss is imperceptible and irrelevant in some cases.
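
That loss between the steps is easy to demonstrate. Here's a little NumPy sketch of my own (using a smooth 8-bit ramp as a stand-in for a sky gradient): push the levels down and back up while staying in 8 bits, and most of the distinct tones are gone afterwards, which is exactly the banding you see when editing 8-bit files.

import numpy as np

# A smooth 8-bit ramp, standing in for a cleanly exposed sky gradient.
original = np.arange(256, dtype=np.uint8)

# Pull the exposure way down and push it back up, staying in 8 bits the whole
# time -- roughly what happens when you shove levels around on an 8-bit file.
darkened = (original.astype(np.float64) * 0.25).round().clip(0, 255).astype(np.uint8)
restored = (darkened.astype(np.float64) * 4.0).round().clip(0, 255).astype(np.uint8)

print(len(np.unique(original)))   # 256 distinct tones before the round trip
print(len(np.unique(restored)))   # about 65 afterwards -- the rest became banding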

Did you know that, given the right player (it's actually possible to play a vinyl record using optics instead of a needle), a brand-new vinyl record will give you better sound quality than a CD? That's because vinyl records are an analogue medium, whereas a CD is a digital medium that throws away the parts of the signal judged to be beyond human hearing.

Then, as time passes, nostalgia kicks in and the value of the older stuff goes up.
And, right now, it's down, so I'm gobbling it up while I can ;)

Alright, fair enough. But I do take issue with the words "it will always...". That's just not true. The technology is volatile enough that there's no way to know its characteristics in say ten years. Who knows what type of sensor they will have perfected by then
No, I think you misunderstood what he was saying. The same sensor will always produce the same noise patterns. You'll get different grain patterns for each film shot you make because you're using a different frame for each shot.
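
A toy way to picture what he means (just seeded vs. unseeded random numbers in NumPy, not real sensor data): the sensor's pattern is baked into the hardware, so it comes out the same exposure after exposure, while each frame of film is a new roll of the dice.

import numpy as np

shape = (4, 6)   # a tiny pretend sensor / frame

# The sensor's noise pattern is fixed by the hardware, so every exposure
# repeats it -- modeled here by reusing the same seed.
shot_1 = np.random.default_rng(42).normal(0.0, 1.0, shape)
shot_2 = np.random.default_rng(42).normal(0.0, 1.0, shape)

# Each frame of film is fresh emulsion, so the grain is new every time --
# modeled by not fixing the seed at all.
frame_1 = np.random.default_rng().normal(0.0, 1.0, shape)
frame_2 = np.random.default_rng().normal(0.0, 1.0, shape)

print(np.array_equal(shot_1, shot_2))    # True  -- same "pattern" every shot
print(np.array_equal(frame_1, frame_2))  # False -- different "grain" each frame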
 
Did you know that, given the right player (it's actually possible to play a vinyl record using optics instead of a needle), a brand-new vinyl record will give you better sound quality than a CD? That's because vinyl records are an analogue medium, whereas a CD is a digital medium that throws away the parts of the signal judged to be beyond human hearing.

Vinyl is seeing a sort of comeback now because of this. Now that everything is "Hi-Def", if your system is 'good enough' you can actually hear the digital break-up, or so some friends of mine tell me - my system isn't anywhere near that good, so CDs & vinyl sound (pretty much) the same to me.
 
