Challenge: Noticeable differences between RAW and a jpeg edited in 16 bit mode?

Jpg artifacts aren't too much of a problem if you save a high-quality jpg.

Editing alone can cause the problem, long before you save it.

Jpg artifacts are only created during the jpg saving process. Editing a picture which already has jpg artifacts in it can enhance them, but editing will never create them.
 
Further:

In-camera noise reduction is theoretically better than offboard, and in practice it has been shown to be better at least some of the time. This is a case where having the camera produce JPEG (since, in general, you can't get the in-camera noise reduction without converting to JPEG) is a win. Point JPEG.

The OP's original point appears to be that, if you set the camera up with sufficient precision, and expose with sufficient precision, the JPEG output will frequently be indistinguishable from the RAW. This is probably true, most of the time. I take issue with the idea that the camera can be persuaded to correctly push the entire dynamic range into the JPEG 100 percent of the time, even with the most precise setting of controls, but anyway it's irrelevant.

The point is 'if you set the camera up with sufficient precision, and expose with sufficient precision'. Why do I want to do this? Why is this a good thing? I can fiddle around in menus and poke at tiny little buttons fussing around this way and that, just to get a result that -- as far as we can tell -- is darn near indistinguishable from the one I can have the easy way? Why on earth would I work harder to get something that's practically almost all the time mostly just as good?

There is an excellent case to be made for RAW+JPEG for noise reduction, at least for some models of cameras. There is an excellent case that JPEG (without a lot of fiddly effort to make it perfect) is perfectly adequate for a lot of pictures.

To try to make some blanket statement that JPEG is just as good in all cases, though, is just silly.
 
...........Jpg artifacts are only created during the jpg saving process........

Huh?

So I load a JPEG, choose the Red channel and crank the brightness up to 100%.

[attached image: jpegeditingartifact.jpg]


I haven't saved the image yet... just loaded and edited it. This is how it looks on my monitor, I just took a screen shot.

You're telling me there are no editing artifacts here? It only becomes an artifact when I save it?
 
A few more thoughts to add:

1) White balance. RAW doesn't fix the white balance and lets the photographer set it themselves; this makes adjusting the white balance of a shot very easy in RAW editing: it's just a matter of adjusting two sliders. If you take a shot in JPEG mode and don't have time to set a custom white balance, or something happens and the white balance is off, then suddenly you're in a world of pain: you have to spend far longer editing the JPEG to make the white balance of the shot look proper, whilst in RAW it would take a few moments to adjust.
This also leaves you more creative options with RAW since you can easily adjust the white balance in editing for the best visual appearance you want for the shot.

2) Faithful in the camera is still applying a level of editing to the final JPEG. The image data that the camera captures needs a starting point when these settings are applied. RAW processing software gives a starting point and lets you effortlessly change the final value; JPEG, on the other hand, selects a point for you. Faithful is as neutral a value as the designers could typically achieve through testing, but it's not "perfect" nor fully "faithful" to the scene as such. You can see this for yourself by comparing different RAW processing software options (or even different editions of the same software): different starting points are selected on nearly all the parameters, such as noise reduction, sharpening, contrast, etc.

3) In-camera noise reduction and sharpening is global: it hits the whole photo. Sometimes you'll have a shot that just doesn't need that over the whole photo; you want it in specific areas, so having a RAW lets you take control (although of course, cameras do apply some noise reduction during the capture process and bundle it into the RAW itself, before JPEG processing).
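The white-balance point in 1) is easy to see in code. A minimal sketch, assuming a deliberately simplified linear model (real converters also apply a camera-specific colour matrix, which this ignores): white balance is just per-channel gain on linear values, which is why redoing it from RAW is lossless and easy.

```python
# Toy model (an assumption, not any converter's actual code): white balance
# as per-channel gain on linear sensor data, with green as the reference.

def apply_wb(pixel, r_gain, b_gain):
    """Scale the R and B channels; clamp at the sensor ceiling of 1.0."""
    r, g, b = pixel
    return (min(r * r_gain, 1.0), g, min(b * b_gain, 1.0))

# A tungsten-lit grey patch comes off the sensor too orange:
raw_grey = (0.8, 0.5, 0.3)

# Gains chosen so the grey patch becomes neutral (all channels equal):
corrected = apply_wb(raw_grey, 0.5 / 0.8, 0.5 / 0.3)
print(corrected)  # all three channels now approximately 0.5: neutral grey
```

On tone-mapped 8-bit JPEG data the same multiplication can clip and shift hues, which is the "world of pain" described above.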

In the end, if there were no real gain with RAW, people would not advocate and use it so widely.
 
...........Jpg artifacts are only created during the jpg saving process........

Huh?

So I load a JPEG, choose the Red channel and crank the brightness up to 100%.

[attached image: jpegeditingartifact.jpg]


I haven't saved the image yet... just loaded and edited it. This is how it looks on my monitor, I just took a screen shot.

You're telling me there are no editing artifacts here? It only becomes an artifact when I save it?

No no, you didn't read my post did ya. :p
I said editing can indeed enhance jpg artifacts, it can not however create them. ;)
 
No no, you didn't read my post did ya. :p
I said editing can indeed enhance jpg artifacts, it can not however create them. ;)

There were no artifacts before the edit.
 
Well if you created artifacts through editing they sure as heck aren't jpg artifacts!
Jpg artifacts are compression artifacts which are made during the saving process to decrease file size. As long as you're editing the file there is no compression running; that process only runs when you save the file.
Sure you can probably create artifacts through editing, but they are not really jpg artifacts. ^^
 
What IS creation, anyways?
 
Well if you created artifacts through editing they sure as heck aren't jpg artifacts!
Jpg artifacts are compression artifacts which are made during the saving process to decrease filesize. As long as you're editing the file there is no compression process running, that process only runs when you save the file.
Sure you can probably create artifacts through editing, but they are not really jpg artifacts. ^^

They certainly ain't raw artifacts. :er:

Can't be TIF or BMP or PNG or NEF or DNG artifacts.

They don't look like 1800 BCE Mesopotamian artifacts.

I doubt they're grilled ham and cheese sandwich artifacts.
 
I was having a similar discussion today. Each file type has its own uses, I get that. BUT can anyone tell me why a wedding photographer would shoot solely in JPEG?
 
Let's start with a perfectly exposed image of the Great Carina Nebula. Perfectly exposed in that there's no clipping on black (noise) and only very slight clipping on one channel (24 pixels on the blue channel). I.e. I can't shoot any brighter without clipping data.
This example is irrelevant unless you set your in-camera settings to be extremely anti-contrast. For example, on Canon, which I shoot, there is an option for custom "picture modes" and you can set contrast to negative if you like. What this will do is sample a ton of data from the RAW on the dark and light ends of the histogram, and relatively little from the middle.

So the RAW is like this:

LLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

A jpeg made with the default settings would be sampled like this:
LLMMMMMHH

A jpeg made with neutral settings would be sampled like this:
LLLMMMHHH

A jpeg made with inverse contrast settings would be sampled like this:
LLLLMHHHH

And the eye can only see something like
XXXXXX
(something like 3/4 the dynamic range of a jpeg... I mean specifically within the boundaries of sRGB yeah yeah.. see further down)

The result is that you actually end up with much more than 8-bit data for shadows and highlights, even though your format is 8 bit, because it is essentially commandeering the data space that would have been used by midtones in order to store more shadow and highlight data (it will look extremely gray out of camera).
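That commandeering effect can be simulated. A toy sketch (the two curves are my own stand-ins, not any camera's actual tone curves), counting how many of the 256 JPEG output codes a flat versus a contrasty curve spends on the darkest 1/16 of a 12-bit capture:

```python
# Hypothetical tone curves: a contrasty one that crushes shadows, and a
# flat/anti-contrast one that lifts them. We count the distinct 8-bit
# codes each assigns to the deep-shadow portion of a 12-bit input.

def to_8bit(value_12bit, curve):
    x = value_12bit / 4095.0          # normalize 12-bit input to 0..1
    return round(curve(x) * 255)      # tone-map, then quantize to 8 bits

contrasty = lambda x: x ** 2.2        # steep in highlights, flat in shadows
flat      = lambda x: x ** 0.45       # shadow-lifting, anti-contrast look

shadows = range(0, 256)               # darkest 1/16 of the 12-bit range
codes_contrasty = {to_8bit(v, contrasty) for v in shadows}
codes_flat      = {to_8bit(v, flat) for v in shadows}

# The flat curve spends far more of the 256 output codes on these shadows,
# so the shadow detail survives the 8-bit quantization much better.
print(len(codes_contrasty), len(codes_flat))
```

The exact counts depend on the curves chosen; the point is only the large ratio between them.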

This is exactly the setting you would use to shoot astrophotography in jpeg. If you used that setting, then you wouldn't have to do any anti-contrast edits like you did (or only very minor ones), so you wouldn't be stretching the jpeg much out of its original state, and you wouldn't get noticeable posterization.

RAW is only necessary here if you didn't set your camera for astrophotography before taking astro photographs. Sure this is in some cases more convenient (to not have to think ahead), but ultimately, thinking ahead about lighting is something you should always be doing anyway, and setting your custom picture mode would just be a part of that process.

The result being that you don't have to fill up your entire memory card in 5 seconds or spend hours in post to get what you want.

ALSO: if you're already on a tripod and taking 68 photographs of the same scene, etc., then it would be pretty easy to simply do exposure bracketing for your jpegs. A 3x-bracketed exposure will give even higher dynamic range than RAW, and will take up half the memory card space.
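The merge itself doesn't need to be fancy. A rough sketch of the principle (my own simplification, not a real HDR algorithm): for each pixel, take the brightest frame that hasn't clipped, scaled back to base-exposure units, and you recover values well beyond the 0..255 range of any single jpeg.

```python
# Hypothetical bracketing merge: each frame is tagged with its exposure
# offset in stops relative to the base frame.

def merge_bracket(exposures):
    """exposures: list of (stops_relative_to_base, 8-bit pixel value)."""
    for stops, value in sorted(exposures, reverse=True):  # brightest first
        if value < 255:                    # not clipped in this frame
            return value / (2 ** stops)    # rescale to base-exposure units
    return 255.0                           # clipped even in the darkest frame

# A specular highlight: clipped at +2 and 0 stops, usable at -2 stops.
print(merge_bracket([(2, 255), (0, 255), (-2, 180)]))  # 720.0, beyond 8-bit

# A shadow pixel: read from the +2-stop frame for the best signal.
print(merge_bracket([(2, 96), (0, 24), (-2, 6)]))      # 24.0
```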

I'm concerned with the accuracy of your statement. For starters, unless you have access to both the internal chip-level microcode and firmware within the camera, there is no basis upon which to state this as a fact. Having written internal chip-level logic, I can tell you that there may be an incredible level of 'unseen' processing taking place between the RAW and JPG images produced, regardless of user-settable options.
No matter what the user settings, it has to make sampling decisions in order to go from 12 bit or whatever to 8 bit. And it has to do jpeg compression, and various other things. For sure.

But what's relevant to the photographer is not boring algorithm details about converting to a specific format. What's relevant is what you can see, and the settings are sufficiently powerful that you never see anything bad if you know what you're doing.

I can't claim to have years of experience or anything, since I just discovered this like a week ago (hence a "challenge" thread), but every stress test I've done so far turns out fine.

Raw is often used to increase the dynamic range above what is represented in the JPEG. As Judobreaker says, there is more (in many cases much more) luminance information in a raw file. You are probably going to say that an in-camera JPEG conversion could be selected that would include the entire usable dynamic range of the sensor
No, not the entire range. That's impossible. What you do is retain the portion of the dynamic range that you care about for that photo, such that you don't need to dramatically change the contrast in post. And as long as you don't have to do that, you will not see banding. Photos never look banded right out of the camera at print size; there's a reason for that. It's only after you stretch them in editing. And in-camera settings can prevent the need for stretching in post.

For example, in the case of astrophotography above, where an extreme negative contrast sampling should be used (very much the opposite of the default camera settings).
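The banding-after-stretching claim is simple to demonstrate numerically. A toy sketch: a gradient that occupies only a narrow slice of the 8-bit range, stretched in "post" to the full range, ends up with 8-9-level gaps between adjacent tones, which is exactly what visible banding is.

```python
# A dark gradient occupying only levels 20..50 of an 8-bit image
# (the kind of underexposed region you'd be tempted to rescue in post):
dark = list(range(20, 51))

# Stretch it to fill 0..255, i.e. a contrast boost of 255/30 = 8.5x:
stretched = [round((v - 20) * 255 / 30) for v in dark]

# Measure the spacing between consecutive output levels:
gaps = [b - a for a, b in zip(stretched, stretched[1:])]
print(min(gaps), max(gaps))  # adjacent tones are now 8-9 levels apart
```

Had the same stretch been done on 12-bit or 14-bit RAW data (or never been needed, thanks to an in-camera curve), the steps would stay at or below one output level.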

1) The human eye can't even see all of the colors possible with an 8-bit jpeg.
That isn't even close to being correct if you are referring to sRGB or Adobe RGB.

Okay, fine. A more responsible way of putting it: Within the range covered by sRGB (and the internet and any but the most expensive prints), the resolution is plenty for your eye to work with, and you will not see any obvious color posterization.

The important point here is not the exact number of colors you can see versus print, etc. It's that using a RAW isn't going to make your color space any more vibrant. If you're going to print your photo anywhere other than a Sotheby's catalogue, or something, you're just going to convert it back ALSO to sRGB later anyway, because that is the standard of almost everything, and it is the closest you will usually get to print color accuracy.

So if it's going to end up in sRGB anyway, the ONLY advantage to RAW for color editing is therefore flexibility in choosing which bit of the color space is going to be sampled to sRGB. And if you are thinking about color and light in the field (which you should be), then you shouldn't need very much flexibility. The out of camera jpeg should already be pretty much where you want it, and whatever minor (10-20%) corrections you may want to do will not be sufficient to cause visible color posterization.

There are tons and tons of settings in your camera for setting up a proper white balance. There's custom compensation, there are presets for different standard lighting conditions, and there are options to calibrate using cards in the field. You have all the tools you need, most of which only need to be set once for an entire shoot.

What about JPG artefacts and such? sure you're using 16 bit but it's still a lossy format and if you're editing a JPG that'll become an issue won't it?
JPEGs out of your camera are not compressed very much, especially if you shoot Canon. Nikon, by my understanding, has historically been more aggressive about compressing busy images to conform to a smaller file size, whereas Canon tends to let the file be as large as it needs to be to keep compression at what they deem an acceptable level for non-visible artifacts.

(This may no longer be the case nowadays. I can't afford to buy new bodies all the time, so I'm not up on latest developments)

16 bits might not be enough either. Internal calculations benefit from practically unlimited dynamic range: the trouble is that as you apply processing steps, you might push a pixel value outside the available range in one step and bring it back in another.
That's true. If Photoshop had a 32-bit mode that still had all the tools available, then I would use that. But it doesn't. Although, since RAW is only 12 or 14 bit, any discussion of hypothetical edits done with a currently nonexistent 32-bit-or-higher commercial photo editor is not really on topic (both JPEG and RAW would benefit equally from the extra processing range).
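The range round-trip problem is easy to show in miniature, assuming a crude clamp-on-overflow model of an 8-bit editor: push values out of range in one step and a later step can't bring the detail back, while a float pipeline keeps it.

```python
# Hypothetical 8-bit editing step: scale by a number of stops, clamping
# at the 255 ceiling the way an integer pipeline must.

def brighten_8bit(v, stops):
    return min(255, round(v * 2 ** stops))

highlights = [200, 220, 240]   # three distinct highlight tones

# 8-bit pipeline: +1 stop clips all three to 255, and -1 stop can't undo it.
clipped = [brighten_8bit(brighten_8bit(v, 1), -1) for v in highlights]
print(clipped)   # [128, 128, 128] -- the highlight detail is gone

# Float pipeline: the same two steps round-trip losslessly.
floats = [(v * 2.0) / 2.0 for v in highlights]
print(floats)    # [200.0, 220.0, 240.0]
```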


If even 1 out of 500 images could benefit from raw, losing that one to jpg is too many.
The idea is that RAW slows down your workflow and your burst speed.

So for those 500 images, editing them in RAW might take 2-3 extra hours compared to editing them in 16 bit photoshop from jpeg. During those 2-3 hours, you could have been out in the field taking another 100 photographs, of which many more than 1 would have been keepers.

So by spending unnecessary time in post, you are in fact losing would-be images by not being out shooting as much. Just like time is money, time is keepers.

(Also, smaller buffer size means smaller bursts, which also will cause many sports photographers, etc. to miss shots)

If there was REALLY no benefit to RAW editing, I doubt it would be so prevalent.
The system I am describing will not work if you leave your camera at default settings. It also will not work if you do not consider the light and color before every shooting situation.

This means RAW is often more convenient. But IMO, knowing how your camera works and considering light and color constantly are things one should do anyway. And if you do, then it is fairly easy to compensate and not require major editing in post that causes all the problems. But people may legitimately disagree with me on the virtues of thinking carefully about light and color each time. *shrug*

The point is 'if you set the camera up with sufficient precision, and expose with sufficient precision'. Why do I want to do this? Why is this a good thing? I can fiddle around in menus and poke at tiny little buttons fussing around this way and that
In my Canon Rebel T2i, even as a low end dSLR, the menu for this is not hidden at all. There's actually a dedicated button (down button on the directional pad) which immediately brings you to the selection of your picture mode. i.e. telling the camera what portion of the RAW to sample to jpeg.

There are also, I think, at least 3-4 custom slots available that you can set and then switch to within 1-2 seconds. And this is not something you would do before every shot. It's something you would do before an entire shoot or a major change of light.

E.g. "I'm shooting the night sky now, lemme switch to my night sky preset" or "I'm shooting broad daylight, lemme switch to my neutral mode" etc.

Is this an extremely minor inconvenience? Yes. But the benefit = faster burst, 5-6x as many photos in your memory card, and much faster workflow when editing. I think that's majorly worth it.

So I load a JPEG, choose the Red channel and crank the brightness up to 100%.
You're not following my instructions, which include using your camera settings to get reasonably within the ballpark prior to editing. Cranking brightness up to 100% implies that you thought your original image was something like 6 stops underexposed... you should never be doing that.
 
You should use your camera in whatever way best suits your needs.

Me, I shoot both ways. I use RAW a lot when I can't be buggered to worry much about the exposure a priori, which is quite often.
 
JPEG is silly. RAW is serious.

The only thing silly is that statement. JPEG has many uses.

Let's start with a perfectly exposed image of the Great Carina Nebula. Perfectly exposed as there's no clipping on black (noise) and only very slight clipping on one channel (24pixels on blue channel). I.e I can't shoot any brighter without clipping data.
This example is irrelevant unless you set your in-camera settings to be extremely anti-contrast.

Why is the example irrelevant? Just because it doesn't fit your idea of a normal photograph? The first photo I posted was exactly what I saw through my viewfinder; I just chose to enhance the shadow detail. Granted, I enhanced it more than I normally would, but that doesn't suddenly render my photo invalid.

But now you're arguing semantics. The camera gives you a choice of picture modes, does it? So suddenly we're supposed to do some processing in the camera at a high bit depth just to get around your argument that there's no reason to use a high bit depth in post-processing? Then why shouldn't I just edit in RAW? More importantly, I directly challenge you to get the same effect in 8 bit, and you may start with as much in-camera processing as you can achieve.

That's true. If photoshop had a 32 bit mode that still had all the tools available, then I would use that. But it doesn't. Although since RAW is only 12 or 14 bit, any discussion of hypothetical edits done with a currently nonexistent 32 bit or higher commercial photo editor is not really on topic (both jpeg and RAW would benefit equally from the extra processing range).

That's why I use PixInsight: though the interface is arcane, it provides native 64-bit processing in a linear colour space.

The idea is that RAW slows down your workflow and your burst speed.

So for those 500 images, editing them in RAW might take 2-3 extra hours compared to editing them in 16 bit photoshop from jpeg. During those 2-3 hours, you could have been out in the field taking another 100 photographs, of which many more than 1 would have been keepers.

False and false. Nothing about RAW slows down your workflow. You can preset your import settings for RAW just like you can set your in-camera JPEG settings. Heck, if you use the manufacturer's RAW converter it will actually read the camera picture settings out of each individual file and apply them on import without any loss of data. Also, 2-3 hours in the field taking photos of nothing because I'm missing photo opportunities while messing with camera menus does not sound like time well spent to me.

Oh, and RAW doesn't slow down burst mode. It only slows down continuous shooting speeds after the buffer is full, which rarely happens and was never a problem even when I was firing off a ludicrous number of frames at a motorsport event.

This means RAW is often more convenient. But IMO, knowing how your camera works and considering light and color constantly are things one should do anyway. And if you do, then it is fairly easy to compensate and not require major editing in post that causes all the problems. But people may legitimately disagree with me on the virtues of thinking carefully about light and color each time. *shrug*

As mentioned earlier, it's one thing to know how your camera works and quite another to have time to set it to the "perfect" settings. In a studio I would agree with you wholeheartedly (and still shoot RAW), but in the real world, where photo opportunities are fleeting moments and the lighting or conditions can change at any moment, the camera stays at the default settings. Hell, I'm not even a proponent of shooting manual, for the same reason (though I would in a studio).
 
jpegs out of your camera are not compressed very much. Especially if you shoot Canon. Nikon, by my understanding, has historically been more aggressive about compressing busy images to conform to a smaller file size, whereas Canon tends to just let the file be as large as it needs to be to have what they deem an acceptable level of compression for non-visible artifacts.

That's not the same as uncompressed, though. Saying the artefacts are 'non-visible' is disingenuous: depending on what you do they might remain invisible, but with heavier editing they might not. If you're claiming JPG is a practical alternative that can equal raw, then surely this is a problem.
 