HDR images

See what I'm saying? How is the tone-mapped JPEG any kind of representation of the HDRI? The non-tone-mapped JPEG looks more like the HDRI. HDR files just give you lots more exposure latitude - it's nothing to do with squashing multiple exposure levels into a single display gamut.

The problem with what you are saying is this:

There are now two completely different meanings to HDR.

They share part of the processing but the photographic HDR does more.

Unless people such as yourself and Helen B start creating a problem where none exists there is no confusion between the two.

The HDR that you are talking about (let's call it 'scientific HDR') is completely useless to photographers, as your far-right shot above demonstrates.

The term that photographers (and the makers of HDR software) refer to applies to a necessarily more complex process that enables the display of an image with a greater range of luminosity than the output device can handle by effectively 'cheating'. That is to say, what you see is not an accurate rendition of the scene as it was lit. But that's hardly anything new for photography, is it?
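For anyone curious what that 'cheating' looks like in practice, here is a minimal sketch of one well-known global tone-mapping curve, Reinhard's x/(1+x). It is only an illustration of the general idea of squeezing a wide luminance range into 0..1 for a monitor, not the algorithm any particular HDR package actually uses, and the function name is just made up for the example.

```python
import numpy as np

def reinhard_tonemap(radiance, exposure=1.0):
    """Compress 32-bit linear radiance (0..infinity) into displayable 0..1.

    Bright values roll off smoothly towards 1.0 instead of clipping, which
    is the basic trick behind fitting a wide luminance range onto a monitor.
    """
    x = radiance.astype(np.float32) * exposure
    mapped = x / (1.0 + x)                           # global Reinhard curve
    return np.clip(mapped, 0.0, 1.0) ** (1.0 / 2.2)  # rough display gamma
```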

My piece about HDR from a photographer's perspective explains what is happening by using a scaled up manual example.
 
All I was saying is that stating that a JPEG cannot be HDR is confusing... yes, a JPEG cannot be a finished HDR file (or format), but it can be called an HDR...

Yeah, I know the issue and you're of course correct that many people (usually only photographers) call it an HDR so they can and do.



Now see, this is where I start scratching my head. It never ever can be - ever. Unless it's not tone-mapped at all. Here's a grab from the other thread.

The first one is tone-mapped (a little extreme, but not too different from what many people do).

The second one is a camera JPG.

The third one is a true HDRI.

Compare_Three.jpg

See what I'm saying? How is the tone-mapped JPEG any kind of representation of the HDRI? The non-tone-mapped JPEG looks more like the HDRI. HDR files just give you lots more exposure latitude - it's nothing to do with squashing multiple exposure levels into a single display gamut.

Anyway I should probably clarify that I think this whole issue isn't all that important. It's kinda fun to hash out tho. ;) It gives us something (I think is) interesting to talk about. :thumbup:

If anything the second one looks better, IMO anyway.:lol:
BTW Bifurcator, do you know any Japanese? We need some help and you came to mind, I'm not sure if you know any though? Click Here. :)
 
Yeah, the 2nd one was sharpened in-camera. I should have turned that off for a comparison like this but I was too lazy. :D
 
Ok, so I have a question in regards to bracketing for this purpose. When I set my camera to +1/-1 for bracketing, what in effect is actually happening to those images?? What settings (if any) are actually being applied inside my camera to change the exposure of those 2 images??
 
Ok, so I have a question in regards to bracketing for this purpose. When I set my camera to +1/-1 for bracketing, what in effect is actually happening to those images?? What settings (if any) are actually being applied inside my camera to change the exposure of those 2 images??

The camera will actually change the exposure.

It isn't something done in post capture processing.

This is why people say that making an HDR image from one shot 'developed' differently is not HDR. (I don't necessarily agree with them - it's a bit more complicated than that).

So if you have your camera set on AP (say, f/11) and it deems the correct exposure is 1/250, when you take the series of shots you'll end up with three shots: 1/125 at f/11, 1/250 at f/11, and 1/500 at f/11.
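To put numbers on that, here is a minimal sketch, assuming an aperture-priority bracket where only the shutter moves. The function name and the EV offsets are just illustrative, not anything a camera actually exposes.

```python
from fractions import Fraction

def bracketed_shutters(base_shutter, ev_offsets):
    """Return shutter times for a bracketed series.

    base_shutter: metered shutter time in seconds (e.g. 1/250).
    ev_offsets:   exposure-compensation stops, e.g. [-1, 0, +1].
    Each +1 EV doubles the exposure time; each -1 EV halves it.
    """
    return [base_shutter * Fraction(2) ** ev for ev in ev_offsets]

# Aperture-priority example from the post: f/11, metered at 1/250 s, +1/-1 bracket.
base = Fraction(1, 250)
for ev, t in zip([-1, 0, +1], bracketed_shutters(base, [-1, 0, +1])):
    print(f"{ev:+d} EV -> 1/{t.denominator} s at f/11")
# prints 1/500, 1/250 and 1/125 s
```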
 
Ok, so I have a question in regards to bracketing for this purpose. When I set my camera to +1/-1 for bracketing, what in effect is actually happening to those images??

Well, unlike 12-bit and 14-bit camera files, or even 16-bit working files, the 32-bit HDR format actually increases the min./max. range values. So you can maybe imagine placing four standard histograms end to end to form one long (32-bit) one. When you assemble an HDR from multiple exposures, it shifts the values of each frame per-pixel, based on its brightness value plus the exposure bracket value, and distributes it into its new place on the expanded HDR histogram.

HDR_Histogram.png

HDR Histogram

There's actually more to it than that, and I'm paraphrasing for simplicity, but that should give you the gist of it. If you're having trouble visualizing it in relation to 8-, 12-, 14-, and 16-bit files, thinking of digital sound files may help. Most of us know what sound waves look like in an editor. 8-, 12-, 14-, 16-, and 24-bit sounds are just chopped up (digitized) into finer slices, but the min./max. dynamic range (volume) remains the same (0~255) across all formats. Imagine if there were an HDR format for sound (and there actually is, btw, but...) where the actual dynamic range would increase in such a way that very loud pops or normally over-modulated sounds would not clip off at the top and bottom but would have many times the vertical range. Such it is with image files. Dynamic range is a different beast than bit depth (per-pixel colour). Bit depth is the number of steps. Dynamic range is the height of the staircase. :D
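As a rough sketch of that 'shifting into an expanded range' idea, the snippet below merges bracketed 8-bit frames into a 32-bit float image by undoing each frame's exposure offset and averaging with a simple weighting. It is a deliberate simplification (it assumes linear frames and ignores camera response curves), not what any particular HDR package actually does.

```python
import numpy as np

def merge_to_hdr(frames, ev_offsets):
    """Merge bracketed 8-bit frames into a 32-bit float radiance estimate.

    frames:     list of uint8 arrays of the same shape, one per exposure.
    ev_offsets: exposure offset of each frame in stops, e.g. [-1, 0, +1].
    Assumes the frames are linear (no gamma) for simplicity.
    """
    radiance = np.zeros(frames[0].shape, dtype=np.float32)
    weight_sum = np.zeros_like(radiance)
    for frame, ev in zip(frames, ev_offsets):
        x = frame.astype(np.float32) / 255.0      # normalise to 0..1
        w = 1.0 - np.abs(x - 0.5) * 2.0           # hat weight: trust mid-tones most
        radiance += w * x * (2.0 ** -ev)          # undo the bracket offset
        weight_sum += w
    # Result is 32-bit float and can exceed 1.0 - the "expanded histogram".
    return radiance / np.maximum(weight_sum, 1e-6)
```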


What settings (if any) are actually being applied inside my camera to change the exposure of those 2 images??

What settings change inside your camera depends on what mode your camera is in for bracketing. Basically it's just increasing or decreasing any of ISO, shutter, and/or aperture, depending - I think. Others will know more about this for your specific model and bracketing type.
 
Well unlike 12-bit and 14-bit camera files or even 16-bit working files, the 32-bit HDR format actually increases the min./max. range values.

No.

True, a 32 bit HDR format will increase the range but that is just like moving from a 12 bit to a 14 bit sensor.

The 12 bit sensor will allow a maximum of 4,096 distinct values and a 14 bit one will allow a maximum of 16,384.

This compares with a maximum of 256 displayable values for a monitor, somewhat less for an inkjet printer.

Imagine if there were an HDR format for sound (and there actually is, btw, but...) where the actual dynamic range would increase in such a way that very loud pops or normally over-modulated sounds would not clip off at the top and bottom but would have many times the vertical range. Such it is with image files. Dynamic range is a different beast than bit depth (per-pixel colour). Bit depth is the number of steps. Dynamic range is the height of the staircase.

Bizarrely, you seem to be making exactly the same mistake as Helen B when you say range is a separate thing to bit depth.

It may actually be easier to stick with the sound example to demonstrate why what you say above is incorrect.

For simplicity's sake, let's assume the electronics make no noise themselves - this is never the case, but it simplifies the explanation.

The actual range of sound loudness depends on how powerful your system is and how loud you've got the volume control. Obviously this is no help in defining dynamic range, so what we use is the difference in sound level between the quietest sound that can be reproduced and the loudest.

So if you have an 8 bit system (which sounds pretty abysmal) the loudest sound is 256 times louder than the quietest sound.

If you have a CD system - which is 16 bits - the loudest sound is 65,536 times louder than the quietest sound.

If you have 24 bit audio, the loudest sound is 16,777,216 times louder than the quietest sound.

But none of this has any actual bearing on the sound level at which you listen to the audio (although you wouldn't want to listen to 8 bit at very high volume :mrgreen:). It is nonetheless what determines the dynamic range - see Here under examples of usage, Audio.

For dynamic range, the exact same logic applies to light as does to sound.
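For anyone who wants to see those ratios worked out, here is a small sketch of the arithmetic described above - ratio = 2^bits, plus the usual conversion to decibels. It only illustrates the numbers quoted in this post, nothing more.

```python
import math

def dynamic_range(bits):
    """Ratio of loudest to quietest representable level, and the same in dB."""
    ratio = 2 ** bits
    db = 20 * math.log10(ratio)   # roughly 6.02 dB per bit
    return ratio, db

for bits in (8, 12, 14, 16, 24):
    ratio, db = dynamic_range(bits)
    print(f"{bits:2d} bits: {ratio:>12,} : 1  (~{db:.0f} dB)")
# 16 bits -> 65,536 : 1 (~96 dB); 24 bits -> 16,777,216 : 1 (~144 dB)
```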
 
I'm so glad I can't read the above message. The ignore list is a wonderful invention. I'm told however that it's full of accusations, inaccuracies, and false information - so let the reader beware. ;)
 
I'm so glad I can't read the above message. The ignore list is a wonderful invention. I'm told however that it's full of accusations, inaccuracies, and false information - so let the reader beware. ;)
If there are any inaccuracies or false information I'm sure they will be pointed out in due course.

As things stand you have made certain statements of fact and logic with which I disagree.

I have given detailed reasons why I disagree with those statements (including a citation that directly contradicts one of your assertions).

This moves the debate forward.

Your rather puerile head in the sand approach does not.
 
Yet another HDR thread going slowly down the drain...:(

I apologise for that but there is little I can do about it.

At first glance you may take the view that it is just pointless squabbling.

However, my contribution in post 22 was a genuine attempt at debating, addressing points that I genuinely believe that Bifurcator has got wrong.

His response (to which I have alerted the moderators) could most charitably be described as heckling.

You may wish to ask yourself why, if someone has discovered that my post was full of 'inaccuracies and false information', they did not post pointing them out, rather than allegedly PMing Bifurcator.
 
You are both obviously very informed on this subject, regardless of one's thoughts about the other. Unfortunately, I'm still very green in this area, but the copious amount of reading I do every day on the subject is helping me to try and understand what it is you are explaining to me, to us beginners.

I no longer use the bracketing; instead I shoot 3 or 4 images and change the shutter, purely because I could not see how changing something in Photoshop would be any different to exposure compensation in the camera. Why you would have it change the aperture or ISO, I don't know. Having images with different DOF and possibly different noise makes no sense to me if you are going to layer them over each other.

Above, Moglex mentions some values. It's the first time I've seen someone mention them. I'm only bringing it up because earlier today I did some research into the RAW format. Why? Because people said it was lossless but failed to mention what exactly is kicked to the curb during processing in your camera. When I saw those figures, I knew I had seen them mentioned in the article I read. It helped me understand and I hope it helps others. So to avoid a public flogging for raising an old debate, I'll just post the link and be on my way.

http://www.luminous-landscape.com/tutorials/understanding-series/u-raw-files.shtml

Thank you both for addressing my questions.
 
His response (to which I have alerted the moderators) could most charitably be described as heckling.

Seems rather a statement of fact to me; however, this statement, "Your rather puerile head in the sand approach does not...", is downright insulting and something for which an apology should be offered.
 
See what a load of nonsense all this HDR stuff brings?

Single exposures are the way to go.
 
