Dynamic Range

Now this may sound like a stupid question, but I would like to find out what dynamic range really is in digital photography. How does one know how much "dynamic range" is in a picture, or whether there is any to begin with?

I tried to do the right thing and search previous similar threads, but there were too many, so I thought it would be easier to just create a new one. :blushing:
 
Dynamic range refers to the range of lights and darks that still show detail. You are somewhat stuck with digital because it can only capture about six zones of that range, while film can capture around ten. This is my layman's explanation; Google "dynamic range digital photography" and "the zone system" to get more info. HDR images combine three or more exposures to reproduce a larger dynamic range, i.e. detailed darks and detailed lights.

This is why you have to decide, when taking digital pictures, what is more important: detailed shadows with blown highlights, or detailed highlights with solid black shadows. The problem is worst when shooting scenes that include a bright sky and brightly sunlit subjects.
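To put a number on that "range of lights and darks": photographers usually count it in stops, where each stop is a doubling of light. A minimal sketch, with purely illustrative luminance values rather than measurements of any real camera or film:

```python
import math

def dynamic_range_stops(brightest, darkest):
    """Dynamic range in stops (factors of two) between the brightest and
    darkest luminance values that still show usable detail."""
    return math.log2(brightest / darkest)

# Illustrative only: highlights ~1000x brighter than the deepest detailed
# shadows works out to roughly 10 stops of scene dynamic range.
print(dynamic_range_stops(1000.0, 1.0))   # ~9.97
```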
 
HDR images combine three or more exposures to reproduce a larger dynamic range, i.e. detailed darks and detailed lights.

I've seen people creating "HDR" with their pictures. I'm sure the experts will correct me on this and say that's not HDR, that's just tone mapping. So then, what's the difference between HDR pictures and tone-mapped pictures?
 
No, that would be an HDR image; it's just that the final format isn't. It's tone-mapped so that it can be displayed on the electronic and print media we have at our disposal. This may (hopefully) change with time.

What really isn't HDR is taking a single image, making, say, three RAW conversions of the same file at different exposure settings in post, and then putting those through an HDR program. That's just straight-up tone mapping, similar to what Topaz Adjust does to an image; it's also known as exposure blending, among a myriad of other terms used to describe the process.
 
Got this from the Wikipedia article on "Dynamic Range":

Photography

Photographers use "dynamic range" for the luminance range of a scene being photographed; or the limits of luminance range that a given digital camera or film can capture; [18] or the opacity range of developed film images; or the reflectance range of images on photographic papers.
Graduated neutral density filters are used to increase the dynamic range of scene luminance that can be captured on photographic film (or on the image sensor of a digital camera). The filter is positioned in front of the lens at the time the exposure is made; the top half is dark and the bottom half is clear. The dark area is placed over a scene's high-intensity region; usually the sky. The result is more even exposure in the focal plane, with increased detail in the shadows and low-light areas. Though this doesn't increase the fixed dynamic range available at the film or sensor, it stretches usable dynamic range in practice.[19]
The dynamic range of sensors used in digital photography is many times less than that of the eye, and generally not as wide as that of chemical photographic media. In the domain of digital imaging, algorithms have been developed to map the image differently in shadow and in highlight to better distribute the lighting range across the image. These techniques are known as high dynamic range imaging. This may involve shooting digital images at different exposures and combining them selectively to retain detail in light and dark areas despite the limited dynamic range of the sensor array. The same approach has been used in chemical photography to capture an extremely-wide dynamic range. A three-layer film with each underlying layer at 1/100 the sensitivity of the next higher one has, for example, been used to record nuclear-weapons tests[20].
The most severe dynamic-range limitation in photography may not be in encoding, it may be with reproduction to, say, a paper print or computer screen. Then dynamic range adjustment, not only HDR encoding, can be effective to reveal detail in light and dark areas. The principle is the same as that of dodging and burning (using different lengths of exposures in different areas when making a photographic print) in the chemical darkroom. The principle is also similar to gain riding or automatic level control in audio work, which serves to keep a signal audible in a noisy listening environment and to avoid peak levels which overload the reproducing equipment, or which are unnaturally or uncomfortably loud.
 
Another way to think of dynamic range is in terms of numbers, which is easier for me to understand: Most consumer cameras are either 12-bit or 14-bit detectors, meaning that they can record 2^12 or 2^14 different intensities of light, so between 0 and 4095 or 0 and 16383. Images saved as JPEG are necessarily 8-bit images, so each color only has intensity ranges between 0 and 255. If you're using RAW, and/or you export as a 16-bit TIFF, then you have between 0 and 65535 gradations.
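If it helps to see those counts computed rather than quoted, here is a tiny sketch (plain Python arithmetic, nothing camera-specific assumed):

```python
# Number of tonal levels per channel for each bit depth mentioned above.
for label, bits in [("12-bit RAW", 12),
                    ("14-bit RAW", 14),
                    ("8-bit JPEG (per channel)", 8),
                    ("16-bit TIFF (per channel)", 16)]:
    levels = 2 ** bits
    print(f"{label}: {levels} levels, values 0 to {levels - 1}")
```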

Depending on your age, you may remember when CRT monitors could be set to "hundreds of colors," "thousands of colors," or "millions of colors." Those settings correspond to 8-bit, 16-bit, and 24-bit (often labelled 32-bit) displays.

Creating an HDR image can be thought of like this: say you take a 1-second exposure, a 2-second exposure, and a 4-second exposure. Each of them has a 16-bit range of 0-65535. When you put them into HDR software, it scales them onto a common exposure: roughly speaking, it multiplies the 1-second image's values by 4 and the 2-second image's values by 2 so they line up with the 4-second image, and then combines all three. The merged values can now run well past 0-65535, giving you up to roughly 0-262,143 possible tones in each RGB channel.

This is no longer a 16-bit image; it has to be stored as a 32-bit image to hold that range. When you convert it back down to a 16-bit or 8-bit image, the software can do it linearly (so 0 stays 0, but 1, 2, and 3 also become 0, then 4, 5, 6, and 7 become 1, and so on), or you can tone-map, which is a non-linear way of compressing the range.
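Here is a minimal sketch of that whole merge-then-convert idea, assuming three already-aligned 16-bit frames held as NumPy arrays and exposure times of 1, 2, and 4 seconds. Real HDR software weights pixels by how well exposed they are, so treat this as an illustration, not a recipe:

```python
import numpy as np

def merge_exposures(frames, times):
    """Scale every frame to the longest exposure, then average them.
    The result is float32, so values can exceed the 0-65535 input range."""
    longest = max(times)
    scaled = [f.astype(np.float32) * (longest / t) for f, t in zip(frames, times)]
    return np.mean(scaled, axis=0)  # crude merge; real tools weight each pixel

def to_8bit_linear(hdr):
    """Linear down-conversion: whole runs of dark values collapse to 0 or 1."""
    return np.clip(hdr / hdr.max() * 255.0, 0, 255).astype(np.uint8)

def to_8bit_tonemapped(hdr, gamma=2.2):
    """A very simple non-linear tone map (gamma curve) that keeps more shadow detail."""
    norm = np.clip(hdr / hdr.max(), 0.0, 1.0)
    return (norm ** (1.0 / gamma) * 255.0).astype(np.uint8)
```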
 
What really isn't HDR is taking a single image, making, say, three RAW conversions of the same file at different exposure settings in post, and then putting those through an HDR program. That's just straight-up tone mapping, similar to what Topaz Adjust does to an image; it's also known as exposure blending, among a myriad of other terms used to describe the process.

It is HDR in the sense that you are expanding the dynamic range beyond what can fit into the JPEG or TIFF you are exporting. That said, it's a much wiser idea in every case to do manual exposure blending using masks in Photoshop rather than relying on an HDR program. Manual blending is, in effect, a digital way to approximate dodging and burning a print from a negative in the darkroom: if the JPEG is the print and the RAW is the negative, you have to figure out how to fit something with maybe 8 stops of range onto something with, say, 4 (these are just illustrative numbers).
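As a rough illustration of that manual blend (hypothetical frame names, float images scaled 0-1, already aligned; Photoshop would do this with layer masks rather than code):

```python
import numpy as np

def blend_exposures(dark_frame, bright_frame):
    """dark_frame is the underexposed shot that holds the highlights;
    bright_frame is the overexposed shot that holds the shadows."""
    # Use the dark frame's luminance as the mask: where the scene is bright,
    # take the dark frame; where it is dim, take the bright frame.
    luminance = dark_frame.mean(axis=-1, keepdims=True)
    mask = np.clip(luminance, 0.0, 1.0)
    # A real blend would blur or hand-paint this mask to avoid hard edges and halos.
    return mask * dark_frame + (1.0 - mask) * bright_frame
```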
 
