RAW histogram

The raw file has more latitude than the JPEG, so if a JPEG derived from the raw shows no blown highlights, you can safely assume the raw is not blown out either.

The only reason you would need a raw histogram is if you intended to push the image far away from a normally white-balanced rendering. If you just want a normal photo, I can't think of any reason it would be necessary.

I can't completely agree. I shoot a lot of flower macros - think bright yellows and reds. While the JPG histogram looks OK, even a slight shift in WB may well blow out the red channel and lead to the loss of fine detail. In that sense it is not just the 'more latitude' of RAW that matters. Conversely (and less destructively), as a result of white balancing (set aside contrast and saturation), your JPG histogram may show that you have overexposed one or more channels while you are still far from it.
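To make the white-balance point concrete, here is a toy sketch. The raw values and the 1.4x red multiplier below are invented for illustration only - they are not taken from any real camera's pipeline:

```python
import numpy as np

# Hypothetical raw red-channel values for a few pixels, normalized so that
# 1.0 is the sensor's saturation point. None of them are clipped in the raw.
raw_r = np.array([0.55, 0.70, 0.85, 0.95])

# Daylight white balance typically boosts red relative to green. The exact
# multiplier varies per camera; 1.4 here is purely an illustrative value.
wb_red_multiplier = 1.4
balanced_r = raw_r * wb_red_multiplier

# After white balancing, anything above 1.0 clips in the rendered JPEG,
# even though the underlying raw data never reached sensor saturation.
clipped_in_jpeg = balanced_r > 1.0
print(clipped_in_jpeg.tolist())   # the brightest raw values now clip
print(bool((raw_r > 1.0).any()))  # but the raw data itself never did
```

The same arithmetic runs in reverse, too: a channel the JPEG histogram shows as clipped may still be comfortably below saturation in the raw file.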


edit: This http://www.rawdigger.com/houtouse/beware-histogram explains what I mean pretty precisely.
 
Capture One 7 Pro will tether your Canon camera and immediately transfer raw files into the application, but you're going to have to be cabled to a laptop.

Just a note for clarification: even raw conversion software like LR/ACR or C1 doesn't show you a "raw" histogram. You're shown an RGB histogram derived from the converted raw file -- better than the histogram derived from the camera processed JPEG, but still a software interpretation.

Joe

edit: I sympathize with your desire for increased accuracy. I've have never been able to completely shake the uneasy feeling I have since moving from film to digital. When I shot film I had a densitometer that allowed me to get hard data about my exposure and processing. I still long for that hard data. If you're not familiar with it, check out the software app Raw Digger.

Thanks Joe for the advice. Yes, in fact raw converters will all display a calculated (and white balanced) histogram from the converted RAW file. It is exactly the Raw Digger kind of functionality that I am after.
Now it would be great if that could be integrated into a capturing app. I use Lightroom but I am more and more being pushed towards C1.
I have also tried UniWB but I just can't get used to the awful green previews.

I don't believe what we want exists. UniWB is a cludge, I have likewise rejected that option. Because of the engineered-in biases of the various raw converters I have C1, ACR/LR, PhotoNinja and a few others installed on my system. I am not going to drag a laptop around with me in the field -- that's as nuts as UniWB.

So I rely heavily on Raw Digger. I test my equipment methodically and examine the results in Raw Digger and have otherwise learned to ignore the histogram displays on the back of my cameras as misleading at best. I would also love to have in-the-field (in-the-camera) "real" and accurate feedback on exposure. Best I can do right now is keep the cameras set to spot meter and continue verification testing with Raw Digger. Oh yeah, and there's still good old bracketing.

Joe
 
I believe that the histogram displayed by every single DSLR camera ever made uses the embedded JPEG file that's inside the RAW capture. Yes, a RAW image can be transferred by many software types and many programs and hardware apps, both wired and wireless... but the histogram is based on the JPEG image's data. As overread mentioned, as long as the camera's capture parameters are fairly moderate, not "maxed out", the histogram from the JPEG file is going to be pretty useful.

Of course, the RAW file will indeed have MORE highlight recovery headroom than any JPEG image will. The Red channel is more likely to saturate than the other two.

the OP wrote: "on a couple occasion I have had my share of overexposed images even though the histogram seemed ok (and no overexpo blinking either). It would be great to overcome that by having a straight away RAW based histogram."

That seems very odd. Very odd indeed. As KmH stated, YES, specular highlights will have the overexposure "blinkies" flashing, as they SHOULD...they are to be rendered as basically detail-free highlight values...

If you set the Tone Curve to normal, or to low, and keep the Saturation set to Normal or Low, the histogram from the JPEG images inside the RAW captures will be fine, and if you are exposing to the right, you will most likely actually WANT TO HAVE "the blinkies" flashing, because the RAW data will have at the very least one full EV of highlight recovery room available in the raw converter, which you would use to "pull back" the highlights.

But again, I do not think there are ANY cameras, or any software tethering apps that base a histogram on RAW data....I mean, the data has to be demosaiced and made into an actual image, and the embedded JPEG is the standard place to go to create the histogram...
 
Derrel, depending on what you shoot and under favorable circumstances, you are right that JPEG histograms can be useful. I do not intend to convince you (or anyone else) that you absolutely need to dig into RAW data to get a decent image. I guess many (if not all) of my favorite shots from a multitude of photographers were captured without differences between RAW and JPEG histograms in mind. That said, and I may not have been specific enough, I am looking for a tool that gives me a physical, pre-WB, non-contrast-adjusted RAW-based histogram that would indicate whether any channels have been overexposed. I see justification for this, or more precisely, I am willing to experiment with having such control over my exposure - and see whether the gain is worth the pain.

Your suggestion that "the RAW data will have at the very least one full EV of highlight recovery room available in the raw converter, which you would use to 'pull back' the highlights" is not completely in line with my experience. Clipping may already start while the camera or any JPEG-based histogram shows no blinkies, even with contrast set to low. And this may impact more than one channel, although typically only the red channel gets saturated. In that case you may have very limited highlight recovery headroom.

Again, I do understand that what I want may seem like a lot of effort for very modest benefits, and the benefits may in fact be modest. Without trying it, I can't tell.
 
Actually, the more I think about it, the less this makes sense.

What would the difference be between a jpeg and a RAW histogram? Only the bit depth. One would be 8 bits (256 values across the histogram) and the other would be 14 or 16 or whatever, depending on your camera.

Let's say your RAW format is 14 bits. That means that your histogram would have 16,384 bars' worth of data in it.


Is your monitor 16,384 pixels wide? I didn't think so. Which means that in order to actually view the full detail of the RAW histogram, you'd have to scroll like, 8 screens over side to side.

Most people's screens are 1920 pixels wide at most, which means that without scrolling you can only show about 10 to 11 bits anyway. Only a smidgeon more information than the jpeg histogram shows! And your LCD camera screen is pretty much physically incapable of showing more histogram data than 8 bits.

So if you were planning on viewing your histogram at normal sizes (like a small window on your monitor), then it would literally be identical to the one from the jpeg...
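To make the rebinning argument concrete, here is a quick sketch with synthetic data (not from any real camera): a 14-bit histogram collapsed into the 256 bins a JPEG-style display uses. The gross shape, and in particular the clipped pile-up at the right edge, survive the rebinning; only fine between-value detail is lost:

```python
import numpy as np

# Synthetic "raw" values: a bright normal distribution, hard-clipped into
# the 14-bit range [0, 16383] so some pixels pile up at the saturation point.
rng = np.random.default_rng(0)
values = rng.normal(loc=12000, scale=2000, size=100_000)
values = np.clip(values, 0, 16383).astype(int)

# Full-resolution 14-bit histogram: 16,384 bins.
raw_hist = np.bincount(values, minlength=16384)

# Rebin to 256 display bins (64 raw bins per display bin).
display_hist = raw_hist.reshape(256, 64).sum(axis=1)

# All the clipped pixels still land in the top display bin.
print(raw_hist[16383], int(display_hist[255]))
```

This is the sense in which a 14-bit histogram viewed at screen width is "the same" histogram: nothing about the right-edge pile-up is hidden by the coarser bins.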




Also, there's no reason that the RAW histogram would give you any particularly better information about blowouts. If a big region of the image is blown out in one, it will also be in the other, 95% of the time. A RAW histogram gives you more detail between the values, not significantly more data on the ends.
 
Clipping may start already while the camera or any jpeg based histogram shows no blinkies, with contrast set to low.

Well there's your problem dude... You have your camera set to low contrast!

Don't do that. If you want the jpeg histogram to fairly represent the RAW, then set EVERYTHING to "neutral" / "0 contrast" / "0 saturation" / etc. etc.



All of those settings only apply to the jpeg, so it won't matter for your RAW whatever you set them to, and setting them all to neutral will make your histogram spot on reliable with relation to the RAW. I was assuming you had already done this, and perhaps Derrel was too.
 

Please do not take this as offensive (I do appreciate your intention to help), but I know quite a lot about how RAW vs. JPG works, and how contrast, saturation, etc. apply to the JPG and not to the RAW. Setting the contrast to low rather than zero works pretty well (and more consistently) for neutral colors. For saturated colors, zero may be the best choice. But this is rather stochastic, and can't easily be applied in practice unless you shoot a very uniformly colored image. You could argue that keeping it at zero may be best for average circumstances. I would then argue that I want to handle non-average circumstances - that's why I started looking into this topic at all. Additionally, since the contrast setting you need can't be defined by a rule of thumb, relying on it is just relying on random factors. There is simply no optimal in-camera contrast and saturation setting that works as a one best way.

I can't agree that "setting them all to neutral will make your histogram spot on reliable with relation to the RAW". Not just not spot on, but often pretty far from it. Not each and every camera calculates histograms the same way (e.g. normalization - what level do the edges of the histogram represent?) Also, think about how output for the EXACT same (in Kelvins) WB setting varies from one camera to another. Routines to apply contrast may also differ. Plus what does +1 unit in contrast mean?

As to your statements about the size of the histogram... well, what I am speaking about has not much to do with the size of the histogram or the amount of data. And no, my monitor is not 16,384 pixels wide, although I do not see why I would even need that. Why would I want to see the whole histogram at a detail of one column per pixel? RAW histograms are practically never identical to the one from the JPEG. And a histogram in a RAW converter also differs from any of those. Plus, most likely, any two RAW converters would also produce different histograms.

The portion of the histogram that matters is the far right. Shadows not so much if you ETTR. It is not that the RAW histogram could give more data at the ends. It would just be more accurate. The reasons behind the difference: white balancing, contrast, saturation (and a great deal of RAW converters' black magic).
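If I were to sketch the check I actually want, it would be nothing more than a per-channel count of saturated pixels in the raw data. The values below are made up, and 16383 is just the hypothetical 14-bit saturation point:

```python
import numpy as np

CLIP = 16383  # hypothetical 14-bit sensor saturation point

# Invented raw data: the red channel is pushed hot, green is comfortable.
rng = np.random.default_rng(1)
red = np.clip(rng.normal(15000, 1500, 50_000), 0, CLIP)
green = np.clip(rng.normal(9000, 1500, 50_000), 0, CLIP)

def clipped_fraction(channel, clip=CLIP):
    """Fraction of pixels sitting at the saturation point."""
    return float(np.mean(channel >= clip))

# Only the far right matters for ETTR: report how much of each channel
# is pinned at saturation.
print(f"red clipped:   {clipped_fraction(red):.3%}")
print(f"green clipped: {clipped_fraction(green):.3%}")
```

That is the whole point: not more bins, just a count taken from the raw numbers before any white balance or tone curve touches them.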

As I see, most of your suggestions are contradicted by practical findings. If you do not see how, please look at the article that I have linked above (here again: Beware the Histogram | RawDigger). This would also give you ideas what the difference between a RAW histogram and a JPG histogram is.
 
Is your monitor 16,384 pixels wide? I didn't think so. Which means that in order to actually view the full detail of the RAW histogram, you'd have to scroll like, 8 screens over side to side.

Why would it need to be? Do you really think I care if a given pixel is 8,459 or 8,458? Would I need to know how many pixels are at 8,459 and how many are at 8,458?

Compress the histo down to 8-bit, view it just as a normal JPEG histo, and watch the ends.
 
I just went and tried it on my own camera. I set all the saturation and such to zero/neutral, took a raw+jpeg, and compared the histograms. I was unable to produce any situation where the jpeg said it was not blown out, but the RAW said it was. And I was TRYING to push the colors as if taking extreme flower photos: e.g. taking photos of my bright orange bed sheets with a leather jacket and a red book on top of it in tungsten light; a blue seashell and a bright green radio under UV light; and a normal scene.

I wasn't even able to produce a situation where the histograms looked very different. No, they were not identical in every little detail (I don't expect them to be, since the jpeg is compressing, not just flatly sampling), but the overall gross shape was almost exactly the same, and there was no noticeable pattern of the edges acting differently. Occasionally shifted a few points in brightness or something, nothing more. WELL within the powers of RAW latitude to correct for with sliders, and those extra 6-8 bits of data to shield it from any sort of visible posterization.

Here is an example from the scene with all the orange/red stuff:
[attached screenshot: histogram comparison with slider settings]
The arrows point to the only two sliders I had to change. Brightness by just a few points and blacks from 5 down to 1. Neither of those changes was even all that visible to my eye, but I did them anyway just to match the histograms as closely as possible.

Why would it need to be? Do you really think I care if a given pixel is 8,459 or 8,458? Would I need to know how many pixels are at 8,459 and how many are at 8,458?
You don't need to. Which is why you don't really need a RAW histogram, because that's all it would really tell you that was much different from a neutral jpeg histo.

Okay, fine, that's not ALL it tells you. It also tells you very very slight shape differences that come from the manipulation of the image's pixels during the compression process. But nothing about jpeg compression should be extreme enough to concern the photographer at all in terms of worrying about their exposure being misrepresented or anything like that. The change is too slight--probably slighter than 1/3 of a stop, such that you couldn't even adjust your camera to account for it if you wanted to, because your controls are too coarse.

Compress the histo down to 8-bit, view it just as a normal JPEG histo, and watch the ends.
Absolutely nothing happened in any of my samples. I could see a couple of the histo bars move like, maybe 1 pixel up or down (presumably as a few pixels in the image right near to edges of the zones got re-rounded). That's it. Nor do I see why anything SHOULD have changed. What are you getting at?
 
Just one thing; what you have labeled there as a raw histogram isn't a raw histogram. It's an RGB histogram software interpreted by ACR from a raw file, and that's a very different thing.

Joe
 
I use the RawDigger histogram when setting exposure in the studio with a tethered camera. It adds one extra step, albeit a simple and quick one. I find it very useful for optimizing the exposure, in particular the log scale option.
 
Just one thing; what you have labeled there as a raw histogram isn't a raw histogram. It's an RGB histogram software interpreted by ACR from a raw file, and that's a very different thing.

Joe

So? I read the article linked above. It pretty clearly showed the exact same histogram shape on the LCD screen (with settings all neutral and custom WB) as in rawdigger. The only obvious difference was that the horizontal range of the histo was truncated.

This makes it inconvenient to judge the true data from the LCD in very contrasty situations IF you really want to squeeze out every tiny bit of data in ETTR without going over at all, but by no means so inconvenient that a tethered computer is necessary. I mean if you're already in a controlled studio, go for it. But in the field, there are probably better ways. E.g.:

1) Figure out roughly how many stops it is from the right side of the LCD histo to the right side of the rawdigger-produced histogram, and then in the field, shoot that many stops of overexposure above whatever your LCD says you should shoot for ETTR, to get a pretty reliably ETTR'd set of data in your RAW.
2) You could also try setting your jpeg settings in camera to -4 contrast or however much lower it will allow you, and then check to see how much closer the horizontal range of the LCD histo is to the rawdigger histo. Hopefully, the in camera compensation will be enough to make the scales appear to be much closer to one another, so you can be a little more confident in whatever remaining small corrections you want to do (see #1)
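Workaround #1 is just arithmetic once you've measured the offset. Here's a sketch; the two clip levels below are invented placeholders, and you would measure your own camera's values (e.g. in RawDigger) instead:

```python
import math

# Invented example numbers, NOT measurements from any real camera:
jpeg_clip_level = 8000   # raw value at which the LCD histogram hits the wall
raw_clip_level = 15500   # actual sensor saturation, measured once empirically

# Headroom hidden past the LCD histogram's right edge, in stops (EV).
headroom_stops = math.log2(raw_clip_level / jpeg_clip_level)
print(f"~{headroom_stops:.2f} stops of hidden headroom")

# Round down to the camera's 1/3-stop exposure increments so the fixed
# offset never overshoots the raw clipping point.
third_stops = math.floor(headroom_stops * 3) / 3
print(f"dial in about +{third_stops:.2f} EV past the LCD's clipping point")
```

Because the offset is a property of the camera's JPEG pipeline rather than of the scene, you measure it once and then apply it blindly in the field.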
 
So, it doesn't support your position -- you've laid a faulty foundation.

You need to collect a lot of real-world data in this case before making an assessment. Have you taken scores (or more) of raw exposures and examined the results in software like Raw Digger, and compared the information you get from Raw Digger with the camera JPEG along with the output from four or more (default-setting) raw converters? You need to do that before you tell the OP what he's asking for isn't really needed or appropriate. I would also like to have what the OP wants. It seems to me a reasonable expectation.

When you stepped into this thread earlier you said, "Actually, the more I think about it, the less this makes sense. What would the difference be between a jpeg and a RAW histogram?" Given that you thought the RGB histogram in ACR was a raw histogram your befuddlement is understandable. The camera processing software applies a tone curve to the demosaiced raw data and the histogram it displays reflects that applied tone curve. Every raw converter does the same even when you set it to clean defaults. Raw converters won't show you a "really raw" histogram. They're all software interpretations of the data and the OP would like to see a histogram of that raw data before it gets software interpreted. So would I. Having made those scores and more real world tests I have learned to be very suspicious of any software interpretation of my data especially the software in the camera. And that's the difference. Software can't see. I can, and having read a few posts from the OP, I suspect he can also see.

So what the OP wants doesn't exist right now and you're right there's no real in-the-field solution that isn't ridiculously over-the-top. As I noted earlier I'm not about to drag a laptop around with me to take photos. But equally ridiculous in my mind is paying attention to misleading info on the camera LCD. Because what the OP wants doesn't exist right now doesn't mean he should be happy with the current situation or with a work-around.

I use a work-around as I noted. Mine is to test my camera meters carefully, every once in a while break out the hand meter, and otherwise bracket. You could develop a work-around that involved examining the camera histogram and adjusting based on past experience of how your camera's software is probably going to mangle your data. They're work-arounds. And given that what the OP wants is well within the ability of the camera engineers to give us, they're burdensome work-arounds. What the OP wants is reasonable. I hope he gets it and that it eventually shows up in cameras.

Joe
 
Hi all - when starting this thread I did not intend to start a fiery debate about the value of the JPG histogram vs. real RAW data, although I find this discussion very useful in the sense that it may help in understanding how the camera-displayed histogram can mislead by suggesting either false underexposure or false overexposure. That said, if you are fine without a RAW histogram, or you do not see a difference between exposures that looked proper based on the camera histogram vs. a RAW histogram, then I guess there would be no gain for you in making all the extra effort.

As Ysarex mentioned earlier, in the good old film days a densitometer could be used to assess exposure (post capture, of course). Now with a DSLR, checking exposure at the RAW level could be made available immediately after shooting. RawDigger is an option (I use it, and I am pretty happy with it), although I was looking for something that integrates into my workflow. Since I shoot tethered a lot, a tethering tool that provides instant feedback would be nice. A great advantage of digital over film would be that you do not have to wait until your film is developed (= time + costs, plus possibly the lost chance to capture a great shot) before you can check exposure.

Why am I so eager to have this? Well, if I fork out 5-10k bucks for a trip to a remote part of the world, I am not likely to go back the next week - I want to do everything right if I can. Right there, right then. Also, under controlled circumstances, the length of the total workflow translates to cash after all. Trial and error is usually part of the process, and I am fine with that, but speeding up the trial cycles would be nice.
 
A digitally displayed histogram by definition includes a software interpretation, so I'm not sure what you're really talking about when you say "a histogram prior to software interpretation." Rawdigger doesn't give you such a thing either. It's also a fairly well defined, standard sort of thing to do in software, in or out of the camera world. Bin sizes can change, scale can change, but other than that, a histogram is pretty much a histogram. Gross shapes will be the same.

I was wrong about the scale being similar. I have no idea why they aren't, and for some dumb reason yes they seem to cut it off horizontally. But that's not a major fundamental difference or obstacle. It's just a different linear display parameter for what is still virtually an identical histogram. All of the examples on the rawdigger page (which were I'm sure cherrypicked to be particularly instructive about huge differences and situations where it is most useful) still show the same histogram shape, just without the horizontal cutoff.

Which means you should be able to perfectly effectively compensate for the cutoff by empirically figuring out how much your camera cuts off, and then exposing appropriately higher than what the LCD histo indicates. Sure, it's a "workaround," but it takes almost zero mental effort and should work in pretty much every situation. This is even less burdensome than changing your exposure compensation when taking a snowy photo, for instance. At least that requires you to do some thinking differently for the content of each photo, whereas the histogram workaround can be applied more or less blindly as a constant rule (i.e. you could be blind to the content, only need to look at the histo, and could cover up the rest of the photo with an index card and still be successful).



Also, by the way, if you really insist on not using any compensation at all, then depending on the brand, non-cutoff histos are probably a feature that people like those who code Magic Lantern or similar projects could provide to you. Maybe it already provides that (I haven't looked through all the 80 million options in that firmware yet, and a lot of them do seem to be graphing options). Another option.
 
