Could use some white balance advice

Please explain how the camera was able to accurately measure the color temperature of the light source.
Uh, I already did, in the post you quoted right above that photo. It could do it in the same way your brain does:

1) The camera could have knowledge of what color certain objects in the world are supposed to be. For example, if there's a Coca-Cola bottle, it could identify that it's a Coca-Cola bottle, look up the shade of red it expects from such a bottle in white light, and then adjust to make it so. If it's a photo of a camera like that one, most of which would usually have neutral silver paint, it could know that, locate a section that seems to have no major reflections, and adjust it to neutral gray. Cameras wouldn't necessarily have to store this information inside their bodies: they could use Wi-Fi and look it up on company servers, once Wi-Fi gains wider coverage in the future. (It wouldn't have to do this before the shot; it could work on it after the shot with the RAW data and store the profile for you once it finds the answer, or just store the shot and do all the calculation later at home, although that slows processing time.)

2) The camera could have another tiny wide-angle camera that looks out of the top and tries to identify the light source directly. Long tubes? Assume fluorescent. Pear-shaped bulbs? Assume incandescent. Sky? Use the built-in GPS and internet weather services to determine atmospheric conditions and light color for your location in real time, and use that. AND/OR measure the color of the pixels directly covering the brightest light source itself and calculate the color from that.

3) If #1 and #2 fail, just fall back to averaging the scene and other stuff it does currently.



It could potentially even decide whether to try to make it look like it was white light (which is what you normally want) or to keep a cast (like sunsets, which you usually don't want to correct), by being programmed to recognize typical photographer desires and scenes and apply them, etc.
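To make idea #1 concrete, here's a minimal sketch (Python/numpy; the patch coordinates and expected color are stand-ins for whatever the recognizer and database would supply, not anything a real camera does today):

```python
import numpy as np

def wb_from_known_object(image, patch_box, expected_rgb):
    """White-balance an image from one recognized object of known color.

    image        -- linear RGB array, shape (H, W, 3), values in [0, 1]
    patch_box    -- (y0, y1, x0, x1) region of the recognized object,
                    chosen to be free of strong reflections
    expected_rgb -- the color the object should have under white light,
                    looked up in some hypothetical object database
    """
    y0, y1, x0, x1 = patch_box
    observed = image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)

    # Von Kries-style diagonal correction: the per-channel ratio of
    # expected to observed color estimates the illuminant's tint.
    gains = np.asarray(expected_rgb, dtype=float) / np.maximum(observed, 1e-6)
    return np.clip(image * gains, 0.0, 1.0)
```

The same three-number correction is what a gray card gives you; the only new part is letting recognition pick the "card."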
 
Or: what if we used three filters that transmit light in three different wavelength ranges, then took the ratio of the intensity of the light that gets through, say, the shortest-wavelength filter to the intensity that gets through the medium-wavelength filter?

Or what if we split the light through a prism and looked at the spectral absorption lines? Those occur at fixed positions (unless the source is moving toward or away from you very fast), and comparative analysis of the light across the spectrum would let us compute the overall effect.

Or a spectrometer?
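For the filter-ratio idea, the camera's own R, G, and B channels are already three such filters. A rough sketch of turning a direct reading of the light into a color temperature (standard sRGB-to-XYZ matrix plus McCamy's 1992 approximation; treat it as an illustration, not metering firmware):

```python
import numpy as np

# Linear sRGB -> CIE XYZ (D65); standard matrix values.
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])

def cct_from_rgb(rgb):
    """Estimate correlated color temperature (Kelvin) from a linear RGB
    reading of the light itself, treating the three channels as the
    three filters and applying McCamy's 1992 approximation."""
    X, Y, Z = RGB_TO_XYZ @ np.asarray(rgb, dtype=float)
    x = X / (X + Y + Z)
    y = Y / (X + Y + Z)
    n = (x - 0.3320) / (0.1858 - y)   # distance from McCamy's epicenter
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# A warm, reddish reading comes out low, around 4200 K:
print(round(cct_from_rgb([1.0, 0.7, 0.4])))
```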
 
Please explain how the camera was able to accurately measure the color temperature of the light source.
Uh, I already did, in the post you quoted right above that photo. It could do it in the same way your brain does:

No you did not. It's a simple question. The photo presented is a camera generated JPEG. You can read the EXIF data just like anyone else can; it's fully intact.

In order to set an accurate white balance the color temperature of the light source has to be measured. In the photo presented how did the camera accurately measure the color temperature of the light source?

I'm not asking about future cameras or what some camera you're imagining might or could do. Simple question -- a simple answer is all that's required. This photo:

$camera.jpg

How did the camera accurately measure the color temperature of the light source?

Joe

1) The camera could have knowledge of what color certain objects in the world are supposed to be. For example, if there's a Coca-Cola bottle, it could identify that it's a Coca-Cola bottle, look up the shade of red it expects from such a bottle in white light, and then adjust to make it so. If it's a photo of a camera like that one, most of which would usually have neutral silver paint, it could know that, locate a section that seems to have no major reflections, and adjust it to neutral gray. Cameras wouldn't necessarily have to store this information inside their bodies: they could use Wi-Fi and look it up on company servers, once Wi-Fi gains wider coverage in the future. (It wouldn't have to do this before the shot; it could work on it after the shot with the RAW data and store the profile for you once it finds the answer, or just store the shot and do all the calculation later at home, although that slows processing time.)

2) The camera could have another tiny wide-angle camera that looks out of the top and tries to identify the light source directly. Long tubes? Assume fluorescent. Pear-shaped bulbs? Assume incandescent. Sky? Use the built-in GPS and internet weather services to determine atmospheric conditions and light color for your location in real time, and use that. AND/OR measure the color of the pixels directly covering the brightest light source itself and calculate the color from that.

3) If #1 and #2 fail, just fall back to averaging the scene and other stuff it does currently.



It could potentially even decide whether to try to make it look like it was white light (which is what you normally want) or to keep a cast (like sunsets, which you usually don't want to correct), by being programmed to recognize typical photographer desires and scenes and apply them, etc.
 
For accurate WB it's necessary to know the color temperature of the light source.

Here's a photo taken with the camera set to AWB:

View attachment 68494

Please explain how the camera was able to accurately measure the color temperature of the light source.

Joe
How do you think measurement of any color temperature is possible? How do you think we measure the color of stars?

For that matter, how is any white balance the "right one" or the "wrong one" if the only way to tell is to look at a known white source which, itself, is given an arbitrary "correct temperature"?

No. There's clearly a flaw in your factual claims somewhere.

Clearly it must be there somewhere; when you find it you'll let me know? ;)

Joe
 
Or: what if we used three filters that transmit light in three different wavelength ranges, then took the ratio of the intensity of the light that gets through, say, the shortest-wavelength filter to the intensity that gets through the medium-wavelength filter?

Or what if we split the light through a prism and looked at the spectral absorption lines? Those occur at fixed positions (unless the source is moving toward or away from you very fast), and comparative analysis of the light across the spectrum would let us compute the overall effect.

Or a spectrometer?

You mean a spectrophotometer? I have one of those. They work really well to measure color temperature. That's not the question. How can a camera do it when the camera is measuring the light after it has already bounced off the subject?

Joe
 
How did the camera accurately measure the color temperature of the light source?
I'm not sure I'm following you. I don't think it DID accurately measure at all. Looks way too yellow to me, and maybe a skosh green, too.
And the reason I think that is for all of the things I was just describing, happening in my brain.
Show me a lump of gray/black abstract plastic on a yellow plastic table in the exact same tones and I'd be fine with it. But wood and cameras don't feel like they should look quite like this in white light, from my life experience.

(What I'm led to understand it does is roughly look for the brightest spot and adjust it to white, unless nothing stands out, in which case it takes the photo average and makes it gray. But it seems unsuccessful here.)



Obviously cameras aren't wildly guessing right now. They use heuristics that are based on realistic and usually-reasonable assumptions. But the heuristics are crude compared to what we ourselves use on a daily basis.
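Something like this, as a toy version (Python/numpy; the 0.8 threshold and 99th-percentile cutoff are made-up knobs, not any manufacturer's actual values):

```python
import numpy as np

def awb_heuristic(image, highlight_quantile=0.99):
    """Crude AWB in the spirit described above: if something bright
    stands out, assume it's white (white-patch); otherwise force the
    scene average to gray (gray-world). image: linear RGB in [0, 1]."""
    flat = image.reshape(-1, 3)
    luma = flat.mean(axis=1)

    bright = flat[luma >= np.quantile(luma, highlight_quantile)]
    if bright.mean() > 0.8:                  # a bright region stands out
        reference = bright.mean(axis=0)      # assume it's white
    else:
        reference = flat.mean(axis=0)        # fall back to gray-world

    gains = reference.mean() / np.maximum(reference, 1e-6)
    return np.clip(image * gains, 0.0, 1.0)
```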
 
I'm not asking about future cameras or what some camera you're imagining might or could do.
Also ysarex, what you said was "AWB in theory can't work"

That means ever. Which includes the future. So yes, you are talking about future cameras. Which could do anything our brains can do, and much more. They could even have active sensors, "pinging" the scene with additional light to gauge the reflected spectrum, or any number of things that would solve the problem.

If you'd like to change that to simply "AWB doesn't work now" then we are in agreement. But the above statement is simply incorrect in most situations, where there are oodles of solid sources of data that could, with an advanced system, work as well as a gray card from just direct scene observation. Indeed they could work even BETTER (in the case of multiply-lit scenes or distant light sources that you can't get on your card, etc.)
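The "pinging" idea isn't exotic, either. A rough sketch of how a flash/no-flash pair could reveal the ambient light (assuming a flash of known, neutral color and a static scene; this illustrates the principle, it isn't any camera's actual feature):

```python
import numpy as np

def illuminant_from_flash_pair(ambient, flashed):
    """Estimate the ambient illuminant color from two aligned frames,
    one with and one without a neutral flash. The difference image is
    lit only by the flash, so it acts as a reflectance estimate;
    dividing the ambient frame by it cancels the reflectance and
    leaves the ambient light's color. Inputs: linear RGB arrays."""
    flash_only = np.maximum(flashed - ambient, 1e-6)   # scene lit by flash alone
    ratio = ambient / flash_only                       # per-pixel illuminant estimate
    illum = np.median(ratio.reshape(-1, 3), axis=0)    # robust across the scene
    return illum / illum.max()                         # normalized illuminant RGB
```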
 
I'm not asking about future cameras or what some camera you're imagining might or could do.
Also ysarex, what you said was "AWB in theory can't work"

That means ever. Which includes the future. So yes, you are talking about future cameras. Which could do anything our brains can do, and much more. They could even have active sensors, "pinging" the scene with additional light to gauge the reflected spectrum, or any number of things that would solve the problem.

If you'd like to change that to simply "AWB doesn't work now" then we are in agreement. But the above statement is simply incorrect in most situations, where there are oodles of solid sources of data that could, with an advanced system, work as well as a gray card from just direct scene observation. Indeed they could work even BETTER (in the case of multiply-lit scenes or distant light sources that you can't get on your card, etc.)

I am talking about auto white balance in current practice and current theory as implemented in our cameras right now. However, I'm inclined to stick with my original statement; let's save that for an addendum.

So in that photo of the old Retina, the new digital camera that took the photo was set to AWB, and yes, as you noted, it did not accurately measure the color temperature. The white balance in that photo is off.

To achieve accurate white balance it's required that we know the color temperature of the light source. No argument there, right?

Definition of AWB: The camera takes a photograph of a subject and uses various algorithms to set the white balance from an analysis of the photograph taken. No argument there, right?

Set to AWB the camera never gets direct access to the light source and simply cannot measure the light source's color temperature. The camera can only see the light after it has reflected back from the subject. The spectral reflectance properties of the subject then alter the light "seen" by the camera. In order to measure the color temperature of the light source in this manner it's essential that the subject contain a target with a known spectral reflectance that can be accounted for in the measurement.

Theory: AWB sets the white balance by analyzing the photo taken. That can't produce an accurate result without a prior measurement of the spectral reflectance properties of the subject. There's no way the camera can know the spectral reflectance properties of the subject. Therefore AWB in theory can't work. In practice we know it certainly doesn't work.

In the photo of the old Retina, the camera taking the photo doesn't "know" anything about the subject. It does not have and cannot acquire spectral reflectance measurements of the subject (this requires a known light source). AWB in theory can't work.

lh1h.jpg


I captured a raw file for this photo, and just before taking the photo I placed a target with known spectral reflectance (a ColorChecker Passport) in the light source and photographed that target. That gave me the ability to measure the light source. AWB can't do that because it can't tell the difference between my old camera and my color checker. The camera is not sentient.
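The arithmetic of that indirect measurement is trivial once the known target is in the shot. For example, from the raw RGB of a neutral patch (a minimal sketch; the sample numbers are made up):

```python
import numpy as np

def multipliers_from_neutral_patch(raw_patch_rgb):
    """A neutral patch reflects all wavelengths equally, so whatever
    tint it shows in the raw file IS the tint of the light source.
    Returns the per-channel gains that neutralize it."""
    patch = np.asarray(raw_patch_rgb, dtype=float)
    return patch.max() / patch

# A patch reading reddish under warm light:
print(multipliers_from_neutral_patch([0.80, 0.55, 0.35]))
# -> approximately [1.00, 1.45, 2.29]
```

Note the whole method hinges on the patch's reflectance being known in advance, which is exactly what AWB lacks.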

Addendum: After Skynet the computers in our cameras may be able to do a better job of AWB. Until then I'm sticking to my initial assertion. AWB is better than nothing and has gotten better. The algorithms that they use have some success because it is possible to make an informed guess. But that's all AWB is -- an informed guess. It can't be accurate without a method to directly measure the light source or indirectly measure the light source using a known target. In the science fiction future who knows.

Joe
 
I am talking about auto white balance in current practice and current theory as implemented in our cameras right now.
Okay, well then you weren't using "in theory" in the normal sense, and thus the miscommunication.

After Skynet the computers in our cameras
1) Nothing has to be in the camera. It could just take the picture in RAW and let your home desktop do the heavy lifting.
2) All that is required is some object recognition and a big chart of reflectances of objects (or their luminance, in the case of recognizing objects that are light sources). Because if you know the object from previous experience, then you have everything you need to extrapolate the color the light must be. That is on the level of something a fruit fly brain can do, and the technology has existed for a while. It's not "Skynet" stuff. At this point it's pretty much a matter of commercializing it and building up a usefully large database.
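A toy version of what I mean (Python; the database entries are invented placeholders, and a real system would need vastly more of them):

```python
import numpy as np

# Hypothetical database: expected linear RGB under white light.
OBJECT_DB = {
    "coca_cola_label":     (0.78, 0.08, 0.10),
    "silver_camera_paint": (0.60, 0.60, 0.60),
    "printer_paper":       (0.92, 0.92, 0.92),
}

def illuminant_from_recognized_objects(detections):
    """detections: (object_name, observed_rgb) pairs from a recognizer.
    Each known object votes for an illuminant estimate (observed over
    expected); the median vote discards the odd outlier."""
    votes = [np.asarray(rgb, dtype=float) / np.asarray(OBJECT_DB[name])
             for name, rgb in detections if name in OBJECT_DB]
    illum = np.median(votes, axis=0)
    return illum / illum.max()
```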
 
I am talking about auto white balance in current practice and current theory as implemented in our cameras right now.
Okay, well then you weren't using "in theory" in the normal sense, and thus the miscommunication.


And I just said I'll stick by it as I originally said it.

After Skynet the computers in our cameras
1) Nothing has to be in the camera. It could just take the picture in RAW and let your home desktop do the heavy lifting.
2) All that is required is some object recognition and a big chart of reflectances of objects (or their luminance, in the case of recognizing objects that are light sources). Because if you know the object from previous experience, then you have everything you need to extrapolate the color the light must be. That is on the level of something a fruit fly brain can do, and the technology has existed for a while. It's not "Skynet" stuff. At this point it's pretty much a matter of commercializing it and building up a usefully large database.

Object recognition and a chart of all the objects the world contains isn't good enough. That still won't be accurate -- it still won't work. None of the generic objects in the world grow or are manufactured under tightly controlled conditions, and they all change over time as they age. Their spectral reflectance can't be derived from a database -- specific measurement is required. All your database would do is make AWB's guess a better guess. There are very few objects that can wind up in your database that don't either change as they grow, fade as they age, or are created with too much variability in the first place.

Joe
 
Their spectral reflectance can't be derived from a database
When you and I both said "The image of the camera on the table doesn't seem correctly white balanced," what exactly do you think we were doing?

We were looking up a database in our heads of what cameras and common wood stains and finishes tend to look like in white light, and noticing a disparity, indicating non-white light, allowing us to adjust the WB.

I.e., Exactly what I just described. It works just fine.


Aging: stuff that ages doesn't just change color. It gets rust and cracks and things. Thus, the shape and texture matching can fit it to its own separate database slot, with its own older object color.

But even that's not super critical, for a couple reasons:
1) It's rare that everything in a scene will be aging all together and changing color in the same way, so usually you can just go with what the majority of objects are telling you and ignore the outliers.
2) Slight imperfections from very slight yellowing of paper or whatever don't necessarily NEED to be corrected, because we don't NEED perfect white balance. We only need as good as our brains can judge, and they are using similarly imperfect methods. As long as it's about as good as we do, it's fine, even if it isn't perfect, because photos are for people to hang on their walls, not for spectrophotometers to hang on their robot walls.
 
"Theory: AWB sets the white balance by analyzing the photo taken. That can't produce an accurate result without a prior measurement of the spectral reflectance properties of the subject. There's no way the camera can know the spectral reflectance properties of the subject. Therefore AWB in theory can't work. In practice we know it certainly doesn't work."

Somebody SERIOUSLY needs to check into the distance-aware, color-aware, reflectance-aware metering and white balance systems Nikon has been using for years now. You seem to be unaware of the newer technologies involved in "modern" cameras. It is no longer 1969.

The D2 series featured an external white balance sensor, a small, white-colored sensor located on the front of the pentaprism, which took "external" color temperature readings and compared those against internal color temperature sensing. As it turned out, Nikon dropped that external sensor in the D3 series (because it was not needed!).

Dude...it's pretty damned simple for a modern d-slr to measure the color temperature of ambient light. Minolta was doing it way back in the Reagan years, over 30 years ago. Today's modern d-slr cameras have more computing ability than a collection of the crude computers that sent the first Apollo mission ship to the Moon's surface.

And just an added tidbit, related to Gav's comments above: modern d-slrs know the city the camera is in, as well as the time of day, the exact day of the year, and the lat/long coordinates. Along with 100,000+ actual photos, it's pretty easy for a model to be constructed that measures ACTUAL RGB values, and reflectance values, and distances, and it becomes pretty easy for a camera's computer to measure and analyze scene recognition data (what Nikon calls the Scene Recognition System, or SRS), compare the SRS data against actual, measured data, and arrive at a damned close rendering of the white balance.

Let's cut the bullsh!+, okay: if Auto White Balance didn't work, then ACTUAL photos made would have WILDLY inaccurate and wildly varying white balance over 100 frames. But that is NOT the case, and AUTO WB is remarkably consistent under almost all natural, outdoor lighting scenarios, and quite good under most indoor lighting scenarios.

And, at least in Nikon's case, there is analysis of 420, to 1,005, to literally thousands of objects and their spectral reflectance for every single metering and WB assignment the camera does. After 15 years of trying, Canon has finally managed to get around Nikon's intellectual property in this area and has now come up with its own "four-hue" method of measuring and metering [Canon calls this iFCL metering]. Again: if Auto WB could not work, then there would never be any successes with it, but pretty simply, it works incredibly well, and has incredible consistency and repeatability. So, somehow, somebody didn't get your "memo".
 
I don't think any cameras USE that GPS type of stuff just yet, Derrel. But I'm sure it's not far away (or maybe a couple of them do, dunno). There was a guy a few months ago on this forum who said he was doing his dissertation on that: weather-pattern databases combined with camera clocks and GPS to determine WB for outdoor shots. So it sounds like it's being worked on, but not quite commercialized yet.
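Roughly, the pieces would look like this (a sketch only: the solar-position math is the standard textbook approximation, but the elevation-to-Kelvin table is invented purely for illustration):

```python
import math

def solar_elevation(lat_deg, lon_deg, utc_hour, day_of_year):
    """Approximate solar elevation in degrees from GPS position and
    UTC time, using the standard declination approximation."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    solar_time = utc_hour + lon_deg / 15.0       # crude: ignores equation of time
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    lat, dec = math.radians(lat_deg), math.radians(decl)
    return math.degrees(math.asin(
        math.sin(lat) * math.sin(dec) +
        math.cos(lat) * math.cos(dec) * math.cos(hour_angle)))

def daylight_cct_guess(elevation_deg):
    """Invented, illustrative mapping from sun height to a WB preset."""
    if elevation_deg < 0:
        return 7500   # sun below horizon: blue skylight dominates
    if elevation_deg < 10:
        return 3500   # golden hour: warm, low sun
    if elevation_deg < 30:
        return 5000
    return 5500       # midday sun
```

A real system would fold in the weather feed (overcast pushes daylight bluer) before picking the preset.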
 
Sometimes, regardless of what the camera can or cannot do with AWB, the results come out screwy no matter what. I've grown accustomed to the AWB capabilities of my 5D Mark III, even without applying the firmware updates yet (too lazy, I guess). But two weeks ago at a church missions event, it did OK, but not its usual GREAT AWB. There were multi-colored lights throughout the front 'stage' area of the temporarily decorated gymnasium, and when the speaker moved even 2-3 feet from the podium, the colors were all different. The small spotlight perhaps 3 feet in front and 6 feet above, shining on his head, added to the fun during post-processing. Note that I shoot JPG + RAW and use the JPGs only for deciding which pictures to edit the RAWs with in LR5 and PSE10.

Here's the before and after of two situations (SOOC and after LR+PSE)... these were shot almost as fast as I could press the shutter button. The projected screen at the right of each image was constant. All we humans saw was the black and white image. Unfortunately, the camera shutter speed and the refresh rate of the projector were on different planets. 135mm f/2L at f/2, 1/640, ISO 2500 on the first one, 1/800 on the second.

$Missions-Conference-62-SOOC.jpg $Missions-Conference-062.jpg $Missions-Conference-63-SOOC.jpg $Missions-Conference-063.jpg

Dealing with different-colored lights from each side was an even bigger challenge. Fortunately, I could 'push process' the underexposure... 135mm f/2L at f/2, 1/800, ISO 4000
$Missions-Conference-176-SOO.jpg $Missions-Conference-176.jpg
 
Their spectral reflectance can't be derived from a database
When you and I both said "The image of the camera on the table doesn't seem correctly white balanced," what exactly do you think we were doing?

We were looking up a database in our heads of what cameras and common wood stains and finishes tend to look like in white light, and noticing a disparity, indicating non-white light, allowing us to adjust the WB.

I.e., Exactly what I just described. It works just fine.

Sorry I had to leave you yesterday -- had to go to campus. (Not making this up.) And one of the things I had to do was help Courtney learn to better set a custom WB on her Nikon D5100, since the AWB was causing her too much grief.

All your database can do is provide an average. That's not accurate enough, especially when accuracy is available and easily accomplished. If you get sick the doctor assesses your symptoms and initially responds by checking a symptom database and prescribing an "auto" treatment. The minute it gets serious your doctor orders a culture. Before I left for campus yesterday I went to the kitchen sink to rinse out my coffee cup. Out the window I saw two male cardinals in the euonymus hedge. One was quite a bit oranger than the other. Your database would have to store an average value for "male cardinal." Which of the two I saw would match the value in your database? Would either of them match your database average?

You're accepting an average. If all we could do is apply an average that would be OK. But with a minuscule fraction of the effort it takes you to create and calculate your average I can be accurate.

An average works for you. I'm requiring accuracy in my expectation for "works," since it is easily available.

As for how well the database in your head works -- you assessed the camera photo as too yellow and a bit green when in fact the AWB error was too blue/cyan. Let's hope they don't get the cameras working that well.

Aging: stuff that ages doesn't just change color. It gets rust and cracks and things. Thus, the shape and texture matching can fit it to its own separate database slot, with its own older object color.

But even that's not super critical, for a couple reasons:
1) It's rare that everything in a scene will be aging all together and changing color in the same way, so usually you can just go with what the majority of objects are telling you and ignore the outliers.
2) Slight imperfections from very slight yellowing of paper or whatever don't necessarily NEED to be corrected, because we don't NEED perfect white balance. We only need as good as our brains can judge, and they are using similarly imperfect methods. As long as it's about as good as we do, it's fine, even if it isn't perfect, because photos are for people to hang on their walls, not for spectrophotometers to hang on their robot walls.

Oh no, none of this shifting-the-focus BS: "well, we don't NEED accurate WB anyway." I'm not talking about that and haven't been talking about that. What you're happy with or can/can't see, or the general public is happy with or can/can't see, or the army of fauxtogs is happy with or can/can't see isn't the topic. We have simple and straightforward methods for achieving accurate WB by measuring the color of the light source either directly or indirectly with a controlled spectral reflectance target. AWB does not and cannot do that and isn't nearly as accurate. Improving AWB by building a database of averages isn't going to make it as accurate. I'm not talking about what reduced level of accuracy meets your or the general public's NEEDS.

Joe
 
