spacediver
TPF Noob!
Hi all,
First post here. I know next to nothing about photography, and the only camera I own is the one on my smartphone. However, I do have an interest in displays, and am considering investing in a camera in the near future.
While it is true that I am not even a beginner when it comes to photography, I have chosen to post this question in this forum instead of the beginners forum, as it deals with some rather technical issues.
I have an interest in display calibration, and have some experience calibrating Sony Trinitron CRTs (if anyone is interested, check out my white point balance calibration guide here).
One of the challenges I, and a few others, are facing is characterizing the luminance response in the near-black region. The CRTs I work with can produce black levels well below the measurable range of any of the instruments I own (an i1 Pro, an i1 Display Pro, and a few DTP-94s). The lowest reading I can achieve is with the i1 Display Pro, at about 0.002 cd/m2. A fellow Trinitron user and I have been brainstorming ways to overcome this limitation. One idea involved a parabolic reflector that would focus light from the display. Another, more practical suggestion was to use a DSLR in long-exposure mode to sum luminance information over time.
The idea would run something like this:
1: Display a full field test pattern at a luminance measurable by our instruments, say 10 cd/m2.
2: In a dark room, set up the DSLR so that the frame encompasses a large region of the display. Set the exposure time (perhaps 5 minutes), and start capturing.
3: Examine the RAW file, and calculate the average pixel value (this could be done on a single channel basis, for example, the green channel only).
4: Calculate the relationship between the average pixel value (at the chosen exposure) and the actual luminance. For example, suppose the average green value turns out to be 500 (out of a possible 1023 in a 10-bit file). The scaling factor would then be 10/500 = 1/50 cd/m2 per count.
5: Re-measure at a few other measurable luminances, and ensure that this scaling factor is consistent across different luminances. I'm assuming RAW encodes luminance linearly, so there shouldn't be a need for gamma correction.
6: Measure test patterns that are too dark to be directly measured, and use the scaling factor to infer actual luminance.
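To make steps 3-6 concrete, here's a rough Python sketch of the arithmetic I have in mind. The function names are my own, and in practice the Bayer data would come from a RAW decoder (something like rawpy); I'm also assuming an RGGB mosaic layout, which varies by camera:

```python
import numpy as np

def green_average(bayer):
    """Average the green samples of an RGGB Bayer mosaic (step 3).

    In an RGGB layout, green sits at positions (0, 1) and (1, 0)
    of every 2x2 cell. Other layouts would need different offsets.
    """
    g1 = bayer[0::2, 1::2]
    g2 = bayer[1::2, 0::2]
    return float(np.concatenate([g1.ravel(), g2.ravel()]).mean())

def scale_factor(avg_counts, known_luminance):
    """cd/m2 per raw count at fixed exposure/ISO/aperture (step 4)."""
    return known_luminance / avg_counts

# Step 4 example from above: a 10 cd/m2 patch averaging 500 counts
k = scale_factor(500.0, 10.0)   # 10/500 = 0.02 cd/m2 per count

# Step 6: infer the luminance of a patch too dark to measure directly,
# say one averaging 0.4 counts at the same exposure settings
inferred = 0.4 * k              # 0.008 cd/m2
```

Step 5 would just be re-running the first half at a few other measurable luminances and checking that `k` comes out the same each time.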
That's the basic idea, and here are the questions:
a) Is this general approach completely misguided? Or is there a chance it could work?
b) At what point should I be concerned about dynamic range? In steps 1-5, I need to ensure that the sensor isn't saturated, or my calculations will be meaningless; yet I need an exposure time long enough to capture usable information when measuring the very low-luminance patterns.
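To make (b) concrete, this is the kind of sanity check I imagine running on each capture before trusting it (a sketch; `white_level` would need to be the camera's actual RAW clipping point, which isn't always the full bit-depth maximum):

```python
import numpy as np

def exposure_headroom(bayer, white_level=1023):
    """Return (fraction of clipped samples, fractional headroom to white).

    Any clipping means the average no longer scales linearly with
    luminance, so the capture should be redone with a shorter exposure.
    """
    clipped = float(np.mean(bayer >= white_level))
    headroom = 1.0 - float(bayer.max()) / white_level
    return clipped, headroom
```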
c) Are there any considerations around ISO and aperture I should take into account here? My goal is an image whose values have a reliable relationship to the luminance of the display, yet retain good sensitivity at low light. I don't care about spatial resolution, and I imagine a certain amount of noise can be tolerated, since the calculations average over millions of pixels (and many images can be acquired for each luminance level).
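On the noise-tolerance point in (c): my intuition is that averaging N independent samples shrinks the standard error of the mean by sqrt(N), so per-pixel read noise should mostly wash out. A quick simulation of that intuition (synthetic numbers, not real sensor data):

```python
import numpy as np

rng = np.random.default_rng(0)
true_signal = 0.4    # raw counts for a very dark patch
read_noise = 5.0     # per-pixel noise, in counts

# 16 frames of 500x500 pixels, each pixel = signal + Gaussian noise
frames = true_signal + rng.normal(0.0, read_noise, size=(16, 500, 500))

per_frame_error = np.abs(frames.mean(axis=(1, 2)) - true_signal)
stacked_error = abs(frames.mean() - true_signal)

# Averaging 250,000 pixels already beats the per-pixel noise by ~500x,
# and stacking 16 frames improves on a single frame by another ~4x.
```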
I'd really appreciate any suggestions and guidance!