
RAW Vs. JPEG

RAW or JPEG

  • RAW

    Votes: 53 93.0%
  • JPEG

    Votes: 4 7.0%

  • Total voters: 57
  • Poll closed.
Today's digital cameras capture 12-bit depth (4,096 discrete values per color channel, as defined by colorimetric interpretation of the Bayer array) or 14-bit depth (16,384 discrete values per color channel).
JPEG is limited to 8-bit depth, or 256 discrete values per color channel. What happens to the other 3,840 or 16,128 discrete values?
Banding and posterization can appear in image gradients because JPEG lacks enough tonal levels to render a gradient without visible steps.
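As a rough sketch of that banding effect (my own toy example, not from any post in this thread): quantize a smooth dark gradient at 8-bit and 12-bit depth, then count how many distinct levels survive a +2-stop brightening in post. The quarter-range gradient and sample count are arbitrary choices for illustration.

```python
# Toy banding demo: store a dark gradient at a given bit depth, brighten
# it two stops in post, and count the distinct tonal levels that remain.
# Fewer surviving levels means coarser steps, i.e. visible banding.

def distinct_levels_after_push(bits, stops=2, samples=10_000):
    levels = (1 << bits) - 1
    seen = set()
    for i in range(samples):
        v = 0.25 * i / (samples - 1)          # smooth quarter-range gradient
        q = round(v * levels) / levels        # quantize to this bit depth
        seen.add(min(q * (2 ** stops), 1.0))  # push exposure up, clip at white
    return len(seen)

print(distinct_levels_after_push(8))    # on the order of 64 levels
print(distinct_levels_after_push(12))   # on the order of 1,024 levels
```

The 8-bit file ends up with only a few dozen distinct levels across the pushed gradient, which is exactly where the visible steps come from.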
[snip]... You cannot fix underexposed photos in post-processing even if you use RAW. For a small amount of exposure correction (±1 stop) JPEG will do just as well as RAW.

It's not about "fixing" photos; it's about processing to an intended outcome when, photographing under natural light, it's not possible to alter the lighting. That's why they invented the Zone System back in the good old days. Today we have 14-bit depth captures, and you sure can pull up a lot of data from the low end -- a lot more than will be in the camera (sucky algorithm) JPEG.

Here's an example. You'll note from the EXIF data that I dialed in a -.3 EC when I took this photo. I didn't want to blow the highlights in the sky. In the camera JPEG the sucky algorithm sure enough has clipped the red channel and blocked the shadows. This was processed by me from the raw file, where I had lots of editing headroom:

[attachment: wetland.webp]

Here's the camera JPEG full-res if you'd like to demonstrate that it can be adjusted to match.

wetland

I knew what I was doing when I took that photo and I knew what I intended as a final result. I knew I could get it from a raw capture just as surely as I knew that no sucky camera software could do anything with this scene other than crash and burn.

Joe
 
Ok, I have always been under the impression that I can only shoot RAW or JPEG with my T2i, but I have seen people in this thread mention that you can shoot both. Can anyone briefly tell me how I would go about doing both?

It's available from your camera menu -- RAW plus JPEG. In which case you'll save two files on the card for each photo taken.

Joe
Thanks a bunch, I'm gonna stock up on 32s on Black Friday.
 
A heavily underexposed RAW image contains 14 bits of trash and nothing will help it. I will repeat that RAW is useful if you know you want to use your own conversion algorithm; that is its main application. You cannot fix underexposed photos in post-processing even if you use RAW. For a small amount of exposure correction (±1 stop) JPEG will do just as well as RAW.

My good friend, you need to learn something about how digital images are processed. Firstly, you need to stop with this idea of a "conversion algorithm". Just use the word settings. If I shoot a photo on my Nikon camera in JPEG and in RAW, and then adjust the brightness of the RAW, I am still using the same "algorithm" in the conversion but very different settings, and depending on those settings the RAW file may end up looking nothing like the JPEG.

Secondly, the idea of gamma adds very little to the discussion. Gamma can be adjusted in post-processing at any time. Lightroom, for instance, doesn't apply the corrective gamma curve until after it has done all other modifications to the image. Personally I work with linear data when editing astronomy photos; it makes it easier to separate luminance and colour channels, but it ultimately changes nothing about how the final image will look. The only time a gamma correction has any negative effect on data is when two different values before correction result in the same value afterwards, and that is the direct result of a low bit depth.
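A minimal sketch of that last point (my own example, assuming a simple 1/2.2 power-law gamma rather than any particular camera's curve): gamma encoding by itself is invertible, and information is lost only when two different input values land on the same output code, which is a bit-depth problem, not a gamma problem.

```python
# Two nearby linear values collide at 8 bits but stay distinct at 14 bits,
# even though both pass through the same (invertible) gamma function.

GAMMA = 1 / 2.2  # simple power-law gamma, assumed for illustration only

def encode(linear, bits):
    return round((linear ** GAMMA) * ((1 << bits) - 1))

a, b = 0.8000, 0.8005                  # two nearby highlight values
print(encode(a, 8) == encode(b, 8))    # True: they collapse to one 8-bit code
print(encode(a, 14) == encode(b, 14))  # False: 14 bits keeps them apart
```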

Enter JPEG.

You need to let go of the idea that there's such a thing as 14 bits of trash. There's no such thing as trash bits. If a bit represents a photon that has hit the sensor, then that bit contains data; even if that data is below the noise floor it is still data, and thus still useful in post-processing. The only useless data is all zeros or all ones. Because of your aforementioned gamma correction, a picture can look extremely dark yet still contain an enormous amount of data above the zero point... if you have the required bit depth to retain that data. JPEG can't. It's 8-bit. Your issue is not one of RAW vs JPEG; it's one of 14-bit data vs 8-bit data. That is 16,384 discrete values per channel vs 256, and that's a massive difference, as people have already shown by example.

If you're making a tiny adjustment to an image, will you notice a difference between the two? Unlikely. But then life doesn't always provide you with perfect exposures allowing tiny adjustments, does it? We live in the world of HDR and extreme-dynamic-range images. The notion that we purchase a $1000+ camera only to throw some of this away is absurd. Oh, and by the way, part of your "conversion algorithm" in the camera typically trashes some of the highlights and shadows in an attempt to make a pleasing, generically contrasty image. The result is that highlight recovery is impossible at that point. If you've slightly blown a highlight, then the difference between RAW and JPEG, even with a -0.1 exposure adjustment, can be night and day.
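To sketch that highlight-recovery point with made-up numbers (the scene values and the two-stop headroom are assumptions, not measurements): once an 8-bit pipeline clips a channel at code 255, every brighter scene value collapses into that single code, while a 14-bit raw encoded with headroom still tells them apart and can be pulled down later.

```python
# Toy highlight-clipping demo. headroom_stops models a raw converter
# encoding the scene a couple of stops below clipping, which an in-camera
# JPEG pipeline tuned for a contrasty result does not do.

def clip_to_bits(scene_values, bits, headroom_stops=0):
    top = (1 << bits) - 1
    scale = top / (2 ** headroom_stops)
    return [min(round(v * scale), top) for v in scene_values]

sky = [1.0, 1.5, 2.0, 3.0]   # luminances at and above nominal "white"
print(clip_to_bits(sky, 8))  # [255, 255, 255, 255] -- all sky detail gone
print(clip_to_bits(sky, 14, headroom_stops=2))  # four distinct values remain
```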
 
My good friend, you need to learn something about how digital images are processed. Firstly, you need to stop with this idea of a "conversion algorithm". Just use the word settings.
If you use Nikon software to convert your RAW files, yes, it comes down to choosing "settings". But there are myriad other ways to produce a final JPEG from raw data, and gamma correction is just one of them. Different tone-mapping algorithms are an actively developing topic.

There's no such thing as trash bits. If a bit represents a photon that has hit the sensor, then that bit contains data; even if that data is below the noise floor it is still data, and thus still useful in post-processing. The only useless data is all zeros or all ones.
If you've underexposed your image by 4 stops, you've produced approximately 4 bits of trash per channel. Not every bit recorded in a raw file is useful. Actually, the low-light areas of an image do not need that many bits to record the number of photons that hit the pixel, and this fact is used in Nikon's "lossy" NEF format.
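The "4 stops under is roughly 4 bits lost" arithmetic can be checked like this (my back-of-the-envelope sketch, not the poster's code): halving exposure halves every linear sensor value, so each stop of underexposure leaves one more top bit permanently unset.

```python
# Each stop of underexposure halves the brightest recorded linear value,
# so the top bits of the file simply never get used.

def effective_bits(bits, stops_under):
    max_value = ((1 << bits) - 1) >> stops_under  # brightest recordable value
    return max_value.bit_length()                 # bits that can carry data

print(effective_bits(14, 0))  # 14: the full range is available
print(effective_bits(14, 4))  # 10: the top 4 bits go unused, as claimed
```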

So, back to the 14 bits of RAW vs 8 bits argument. I will repeat once more: comparing these bits is meaningless, because JPEG bits are gamma-corrected values and RAW bits are linear values. Gamma correction helps JPEG retain the valuable information packed into fewer bits. That's all I wanted to say.
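One way to see the "gamma retains the valuable information" claim numerically (my illustration, assuming a plain 1/2.2 power-law gamma rather than the exact sRGB curve): count how many of the 256 available 8-bit codes cover a single dark stop of linear values.

```python
# Count distinct 8-bit codes covering one dark stop (linear 1/64 to 1/32)
# under a linear encoding vs a gamma-corrected one.

def codes_in_stop(encode, lo=1/64, hi=1/32, samples=5_000):
    return len({round(encode(lo + (hi - lo) * i / samples) * 255)
                for i in range(samples)})

print(codes_in_stop(lambda v: v))               # linear: a handful of codes
print(codes_in_stop(lambda v: v ** (1 / 2.2)))  # gamma: several times as many
```

The gamma encoding assigns several times more codes to the same dark stop, which is exactly the "range compression" being described.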

To make it clear, I am not 'for JPEG' or 'against RAW'. I believe RAW is useful for critical work and JPEG is more convenient where IQ is not that critical (family pics, etc.).

A bit of my background: I happen to work on photo software as an image-processing algorithm engineer, but I'm not a photography professional (not selling pictures), just an enthusiast. At home I don't even own Photoshop or Lightroom/Aperture. I bought my first digital camera in 2001. Using mostly Nikon, currently a D700.
 
At home I don't even own Photoshop or Lightroom/Aperture. I bought my first digital camera in 2001. Using mostly Nikon, currently a D700.

It might be worth downloading an evaluation copy, and then comparing the difference between 'exposure' adjustments to JPEG and raw files.
 
I used to shoot JPEG when I had my first digital camera.

But then I tried RAW. I thought it would be a lot of work to edit and get the settings right, but it's not as time-consuming as I feared.

Without getting technical, because I'm quite a simple average user, I noticed that I can do LOTS more with RAW than I could with JPEG.
Certainly adjusting the white balance and exposure.
I'm not going back to JPEG ever...


Maybe people who believe in JPEG aren't that detail-oriented, so they have no need for RAW? Or maybe they haven't experienced RAW enough to get a good understanding of imaging?
 
At home I don't even own Photoshop or Lightroom/Aperture. I bought my first digital camera in 2001. Using mostly Nikon, currently a D700.

It might be worth downloading an evaluation copy, and then comparing the difference between 'exposure' adjustments to JPEG and raw files.

Believe me, I get enough of this (image processing) at the day job. I do not underestimate the value of raw files; my point was that it's pointless to compare bit counts between linear and gamma-corrected ranges.

But I'm thinking about buying Aperture as my photo collection grows too fast :)
 
Here's an example. You'll note from the EXIF data that I dialed in a -.3 EC when I took this photo. I didn't want to blow the highlights in the sky. In the camera JPEG the sucky algorithm sure enough has clipped the red channel and blocked the shadows.

I knew what I was doing when I took that photo and I knew what I intended as a final result. I knew I could get it from a raw capture just as surely as I knew that no sucky camera software could do anything with this scene other than crash and burn.

Joe

Hi Joe, surely nothing will recover a poorly exposed JPEG with clipped channels. But I did not say that! My points were: 1) 14 linear bits are not comparable to 8 gamma-corrected bits, 2) poorly exposed raws are not recoverable, 3) properly exposed JPEGs have some headroom for small fixes.
 
There are two issues with JPEG.

Any camera or algorithm will probably do a good job with easy exposures -- well-exposed shots that don't require much dynamic range. JPEG struggles when shots do require a lot of dynamic range. For example, JPEG is a terrible format if you are shooting landscapes with 5+ stops of dynamic range in the image.

If we think of the ranges as "#1 Very dark", "#2 Dark", "#3 Mid-tones", "#4 Light", & "#5 Very light" then you can expect JPEG to do a great job with ranges 2 through 4. It'll struggle with #1 and #5.

JPEG tends to compress detail in shadows and highlights because the algorithm assumes human vision wouldn't notice it anyway. If the image didn't require adjustment, the algorithm would probably be right... we wouldn't notice. This falls on its face when the image DOES require adjustment and you discover the detail is missing; the same detail that was definitely there with you and your camera when you captured the image.

Bit depth is another problem. JPEG has only 8 bits per color channel. In theory, JPEG can represent 8 stops since it has 8 bits. The problem is that the lowest bits have almost no tonality... you definitely won't be happy with the lowest 2 bits (which can only describe 4 possible values), and when you add the third bit you get 8 possible values... which is also pretty bad. It isn't until you get to the 4th bit, which can describe 16 possible values, that you have enough tonality within the stop to avoid looking posterized. That leaves you with only about 5 "usable" stops of dynamic range in a JPEG.

Now that you've got 5 "usable" stops (which is all most cameras are good for anyway), you've got the problem of JPEG compression trying to eliminate detail your eye probably wouldn't notice in order to save disk space. So now you're down to about 4 "usable" stops.

RAW, on the other hand, has 14 bits per channel. To keep things fair... we decided that the lowest 3 bits weren't really useful as stops, so we'll discard them. That leaves us with 11 stops' worth of usable dynamic range that would actually look decent.
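The stops-and-bits arithmetic above can be sanity-checked in a few lines (a sketch under the same simplifying assumption the post makes, i.e. a linear encoding where each bit corresponds to one stop):

```python
# In a linear encoding, the stop just below clipping uses half of all codes,
# the next stop a quarter, and so on -- the deep shadows get almost none.

def levels_in_stop(bits, stops_below_top):
    return (1 << bits) // (2 ** (stops_below_top + 1))

for stop in range(8):
    print(f"stop {stop + 1} below clipping: {levels_in_stop(8, stop)} levels")
```

For 8 bits the bottom stops hold only a few levels each, which matches the point that roughly the lowest 3 bits aren't usable as stops.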
 
Here's an example. You'll note from the EXIF data that I dialed in a -.3 EC when I took this photo. I didn't want to blow the highlights in the sky. In the camera JPEG the sucky algorithm sure enough has clipped the red channel and blocked the shadows.

I knew what I was doing when I took that photo and I knew what I intended as a final result. I knew I could get it from a raw capture just as surely as I knew that no sucky camera software could do anything with this scene other than crash and burn.

Joe

Hi Joe, surely nothing will recover a poorly exposed JPEG with clipped channels. But I did not say that!

I didn't say you did.

My points were: 1) 14 linear bits are not comparable to 8 gamma-corrected bits, 2) poorly exposed raws are not recoverable, 3) properly exposed JPEGs have some headroom for small fixes.

I've been reacting to these statements of yours: "The only reason to shoot raw if you want to do gamma correction using your algorithm (software), not one in camera. If raw have more "editing headroom" then the in camera algorithm sucks, but usually it is not the case."

And, "It was my point that 8-bit gamma corrected values store almost all the information actually useful from RAW file (in case if jpeg codec in camera works properly)."

You were reacting to Keith's post, where he noted that we have a lot more "editing headroom" with a raw capture than we do with a JPEG -- that when the JPEG is created by the in-camera algorithm, that "headroom" is discarded and any further editing access is severely limited.

My point is Keith is correct. There's a huge amount of additional editing headroom in a raw file. You're saying that's not the case as long as the camera's JPEG codec works properly -- as long as the in-camera algorithm doesn't suck, which you say it usually doesn't. And I'm saying the in-camera algorithm usually does suck. Under ideal lighting conditions the in-camera algorithms produce mediocre results acceptable to the general public. But ideal lighting occurs infrequently when working with natural light, and as soon as the lighting is no longer ideal the in-camera algorithms quickly degrade from mediocre to sucky. The JPEG that results is typically damaged beyond repair (as in my example). But with the huge amount of editing headroom available in the raw capture, the photo can be processed successfully and the damage that would be done by the sucky camera algorithm can be avoided.

You were suggesting that it's not really that advantageous to shoot raw as long as the in-camera algorithms don't suck, which you say they usually don't. I'm saying they usually do suck, and the damage they do can be avoided by processing manually and taking advantage of all that extra editing headroom available in the raw file.

Joe
 
At home I don't even own Photoshop or Lightroom/Aperture. I bought my first digital camera in 2001. Using mostly Nikon, currently a D700.

It might be worth downloading an evaluation copy, and then comparing the difference between 'exposure' adjustments to JPEG and raw files.

Believe me, I get enough of this (image processing) at the day job. I do not underestimate the value of raw files; my point was that it's pointless to compare bit counts between linear and gamma-corrected ranges.

But I'm thinking about buying Aperture as my photo collection grows too fast

My apologies; your previous comments seemed strange for someone who was used to processing raw files. They still do seem strange, and they contradict experience.

It may not be straightforward to compare bit depth between a linear-encoded file and a gamma-corrected one, but that doesn't make it pointless. It is perfectly possible, and meaningful.
 
Ok, ok, guys and gals. My initial reaction was to the statement that '80% of color information is thrown away'. For properly exposed photos that is not correct; most of the information is preserved (partially due to range compression by gamma correction). Raw definitely has more headroom for editing than JPEG; that's out of the question. The question is how much more? JPEG is OK for processing if it's properly exposed (at least not clipped) in the first place and the in-camera codec is good (which is not the case in some cameras, but that's now rare). It is far easier to get the best possible results with raw. But it's not impossible with JPEG either; it just requires much more skill. That's it. For example, for my image-processing app (if you're curious, it's called PanoEdit), the number-one priority was support for raw files. Without it, it was just not useful for too many people.
 
Ok, ok, guys and gals. My initial reaction was to the statement that '80% of color information is thrown away'. For properly exposed photos that is not correct; most of the information is preserved (partially due to range compression by gamma correction). Raw definitely has more headroom for editing than JPEG; that's out of the question. The question is how much more? JPEG is OK for processing if it's properly exposed (at least not clipped) in the first place and the in-camera codec is good (which is not the case in some cameras, but that's now rare). It is far easier to get the best possible results with raw. But it's not impossible with JPEG either; it just requires much more skill. That's it. For example, for my image-processing app (if you're curious, it's called PanoEdit), the number-one priority was support for raw files. Without it, it was just not useful for too many people.

I understand your reaction to the 80% figure, and I further understand what you're trying to say about applying the gamma curve to the raw data. The applied curve compresses the tonal information and as such makes use of a lot of the recorded data. But I'm still going to have to say no to this assertion: "JPEG is ok for processing if properly exposed (at least not clipped) in the first place and in-camera codec is good (which is not the case in some cameras, but now it's rare)."

This assertion requires further qualification. In a studio it's possible to control the lighting contrast. One could in fact learn to adjust the lighting contrast to take best advantage of a camera's JPEG-processing software. That ability to control the lighting becomes far more difficult, or impossible, in natural light. Your assertion assumes proper exposure. So let's do a for-instance:

The photo I posted earlier was backlit. What would have been a proper exposure? The camera JPEG processing software clipped the red channel in that photo and it also clipped all three channels on the low end. So I didn't have a proper exposure. I needed to do what? Increase the exposure so the shadows wouldn't be blocked. Then I'd have a proper exposure right? Wait a minute! The highlights in the red channel were already clipped. If I increased the exposure so the shadows wouldn't block up then the highlights would clip even worse. How do I get a proper exposure then? Forced to rely on the software in the camera, all possible exposures of that scene would produce failure -- "properly exposed" wasn't an option for the sucky software in the camera. But that was a proper exposure for me as I processed the raw data.

Your assertion has to read: JPEG is ok for processing if an ideally lit scene is properly exposed (at least not clipped) in the first place and in-camera codec is good (which is not the case in some cameras, but now it's rare). I'll reluctantly let that go, but....

I don't allow my students to refer to adjusting and/or correcting camera-generated JPEGs as editing or processing. I let them do it; in fact I teach them how to do it, but I insist they use the term repair. The processes of assigning a color space, reducing to 8 bits, and compressing the data are finishing steps -- you do them when you're finished editing. If you go back after you're finished, it can only mean you need to fix a screw-up. Once the book has gone to press there's no more editing.

Joe
 
For a small amount of exposure correction (±1 stop) JPEG will do just as well as RAW.

I agree, however some people are working with raw captures to overcome shortcomings in the JPEG processing of certain cameras, which reduce detail.
 
