I've seen it mentioned here several times: a properly exposed high-ISO image can have less noise than an improperly (dimly) exposed low-ISO image. Why would this be? I'm not saying it's wrong, just trying to understand why.

I'd like to look at as fair an example as I can think of. Let's start by properly exposing a scene at, say, ISO 3200. Then let's keep aperture and shutter speed identical and reduce the ISO two stops to 800. What should happen here?

The first thing that comes to mind is my experience with Canon DSLRs and Canon's post-processing software. In Digital Photo Professional, you can "push" your exposure by 2 stops, and it "magically" brings out those 2 stops of previously invisible data. So the data is there even though you couldn't see it before pushing. Yes, the pushed image is definitely noisier than the same image with no push, but is it noisier than the properly exposed image at ISO 3200?

Another observation concerns Digital Photo Professional's noise reduction settings. By default it applies very little noise reduction to an ISO 800 exposure, but a significant amount to one at ISO 3200. So the comparison isn't fair until you bump up the noise reduction on the pushed image to match that of the ISO 3200 image.

The way I understand digital ISO, the signal from the sensor is amplified (multiplied) more strongly at higher ISOs. Brightening an image in post-processing does the same thing, just in the digital realm (multiplying the values). Assuming I'm right about how digital ISO works, why would the analog gain in the camera be less noisy than digital gain in post-processing?
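To make my question concrete, here's a toy simulation of the two workflows. It assumes (and this is my assumption, not something I know about any specific camera) that some noise is added *after* the analog amplifier, e.g. in the readout/ADC stage; all the numbers are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sensor model -- all values are invented for illustration (electrons/ADU)
signal_e = 100.0           # photoelectrons collected (same aperture + shutter both ways)
pre_gain_noise_e = 2.0     # noise added before the amplifier (e.g. dark current)
post_gain_noise_adu = 8.0  # noise added after the amplifier (e.g. ADC read noise)

def capture(iso_gain, n=1_000_000):
    """Simulate many pixels at a given analog gain (ADU per electron)."""
    e = rng.poisson(signal_e, n).astype(float)       # photon shot noise
    e += rng.normal(0.0, pre_gain_noise_e, n)        # pre-amplifier noise
    adu = e * iso_gain                               # analog amplification (ISO)
    adu += rng.normal(0.0, post_gain_noise_adu, n)   # post-amplifier read/ADC noise
    return adu

# ISO 3200: 4x the analog gain of ISO 800 (2 stops)
iso3200 = capture(iso_gain=4.0)
# ISO 800 pushed +2 stops in software: multiply AFTER the read noise is baked in
iso800_pushed = capture(iso_gain=1.0) * 4.0

# Same mean brightness, but the push also multiplied the post-amplifier
# read noise by 4, so the pushed image has a larger standard deviation.
print(f"ISO 3200:       mean={iso3200.mean():.0f}  std={iso3200.std():.1f}")
print(f"ISO 800 pushed: mean={iso800_pushed.mean():.0f}  std={iso800_pushed.std():.1f}")
```

If this model is right, the noise added before the amplifier gets multiplied either way, and only the post-amplifier noise distinguishes the two cases. Is that the actual explanation?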