Color Depth

Stevepwns

What do you use and why?

I have been using sRGB. I have read and watched videos on the debate over Adobe RGB as compared to sRGB. I would be interested in your thoughts and why you choose to use what you use. Pros and cons, does it even make a difference, etc.
 
Consider what you plan on doing with your pictures. If you change over to Adobe RGB and none of the software or devices you want to use can handle it, then what have you accomplished?
 
Most of the time I shoot RAW, so it doesn't matter.
Raw image files have no color space, so the camera color space settings have no effect on a Raw file.
When I shoot JPEG or TIFF I make sure the camera is set to Adobe RGB, because Adobe RGB has a wider color gamut than sRGB.

sRGB and Adobe RGB are color spaces. What you need to be concerned with is the gamut of colors each can represent.
Which color space you use depends on the gamut display capabilities of the device(s) used to view the image.
The most common TN computer displays can only display a portion of the sRGB color space, and even less of the Adobe RGB color space.
$1000+ wide-gamut IPS displays can usually display all of the sRGB color space and 95% or more of the Adobe RGB color space.

Color depth refers to how many bits (binary digits) define the colors.
8 bits can define 256 levels per color channel. The binary number 11111111 = 255 and defines 256 levels of tone: 0 - 255 (0 = pure black, 255 = pure white).
12 bits can define 4096 levels per color channel. The binary number 111111111111 = 4095 and defines 4096 levels of tone: 0 - 4095 (0 = pure black, 4095 = pure white).
14 bits can define 16,384 levels per color channel. The binary number 11111111111111 = 16,383 and defines 16,384 levels of tone: 0 - 16,383 (0 = pure black, 16,383 = pure white).

Both sRGB and Adobe RGB can be represented at a color depth of 8, 12, or 14 bits.
The more bits used, the more subtle the gradations of tone either color space can show.
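
If it helps to see where those numbers come from, here is a quick Python check (the levels are simply 2 raised to the bit depth):

```python
# Levels per channel are just 2 ** bits, numbered 0 .. 2**bits - 1.
for bits in (8, 12, 14):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: {levels:>6,} levels per channel "
          f"(0 = pure black, {levels - 1:,} = pure white)")
```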

JPEG is limited to an 8-bit color depth, but a JPEG can be in sRGB, Adobe RGB, or any other RGB color space.
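
A quick way to see which color space a particular JPEG is tagged with is to read its embedded ICC profile; here is a rough sketch using Pillow (the filename is just an example):

```python
import io
from PIL import Image, ImageCms

im = Image.open("photo.jpg")                       # hypothetical filename
icc = im.info.get("icc_profile")                   # embedded ICC profile bytes, if any
if icc:
    profile = ImageCms.ImageCmsProfile(io.BytesIO(icc))
    # Prints something like "sRGB IEC61966-2.1" or "Adobe RGB (1998)"
    print(ImageCms.getProfileDescription(profile))
else:
    print("No embedded profile -- most software will assume sRGB.")
```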

Editing experts recommend editing in the broadest color gamut available to you, which is usually ProPhoto RGB.
Set the output color space and embed the color profile that is appropriate for the photo's use.
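
As a rough sketch of that output step using Pillow's ImageCms module -- the ProPhoto profile path is hypothetical, so point it at whatever ICC profile your editor actually installed, and note that Pillow's CMS transform expects 8-bit RGB data:

```python
from PIL import Image, ImageCms

PROPHOTO_ICC = "ProPhoto.icm"                  # hypothetical path to your working-space profile

im = Image.open("edited.tif")                  # 8-bit RGB copy of the edited image

src = ImageCms.getOpenProfile(PROPHOTO_ICC)    # assumed working space: ProPhoto RGB
dst = ImageCms.createProfile("sRGB")           # output space for web / typical displays

# Convert the pixel values to the output space, then embed the matching profile on save.
out = ImageCms.profileToProfile(im, src, dst, outputMode="RGB")
out.save("for_web.jpg", quality=92,
         icc_profile=ImageCms.ImageCmsProfile(dst).tobytes())
```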

For some commercial uses the additive RGB color model, and its many color spaces, is not appropriate.
Instead the subtractive CMYK color model is used.
Note: CMYK is not a color space; it is a color model and has no color spaces.
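
For illustration only, here is the naive, non-color-managed RGB-to-CMYK formula often used to demonstrate the subtractive model; real prepress conversion goes through ICC press profiles instead:

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB -> CMYK conversion (inputs 0-255, outputs 0.0-1.0).
    Real prepress work uses ICC press profiles, not this formula."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    k = 1.0 - max(r, g, b)          # black ink replaces the component common to C, M and Y
    if k == 1.0:                    # pure black: avoid dividing by zero
        return 0.0, 0.0, 0.0, 1.0
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k

print(rgb_to_cmyk(255, 128, 0))     # an orange: roughly (0.0, 0.5, 1.0, 0.0)
```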

RGB color model - Wikipedia, the free encyclopedia
CMYK color model - Wikipedia, the free encyclopedia
CcMmYK color model - Wikipedia, the free encyclopedia

If you're not faint of heart or math-challenged, the definitive 'color science bible' (which, by the way, has no color illustrations in it at all) is:
Color Science: Concepts and Methods, Quantitative Data and Formulae.

For a less technical treatise on color management I recommend - The Digital Print: Preparing Images in Lightroom and Photoshop for Printing
 
By default, sRGB is used by all the digital cameras, computer post-processing programs, printers, and websites I have used in the past 10-12 years. Why screw with something that works well enough for the overwhelming majority of users out there?

In my opinion, as an amateur, unless supplying photos for a specific (usually paid) purpose such as a magazine that specifically requires Adobe RGB, I don't see a definite need to use Adobe RGB.
 
Trying to wrap my head around this. Do camera color filters attempt to approximate the standard observer functions?

Not quite sure what you're asking -- "standard observer functions"?

A digital camera's raw file is not yet interpolated (demosaiced) into a three channel RGB photo. Digital color spaces like sRGB, Adobe RGB, Bruce RGB, etc. apply in practice to RGB images.

Joe
 
Trying to wrap my head around this. Do camera color filters attempt to approximate the standard observer functions?
The image sensors in digital cameras cannot record color.

With the exception of the Foveon sensor, a color filter array (a Bayer array) has to be placed in front of the sensor so the colors in a scene can be interpolated by a Raw converter. Raw files get converted outside the camera; JPEG or TIFF files made in the camera get processed by a Raw converter on the image-processor chip in the camera.

Bayer filter - Wikipedia, the free encyclopedia
Foveon X3 sensor - Wikipedia, the free encyclopedia
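
If you're curious what that interpolation looks like mechanically, here is a toy bilinear demosaic of an RGGB mosaic using NumPy/SciPy; real Raw converters use far more sophisticated, usually proprietary, algorithms:

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(mosaic):
    """Toy bilinear demosaic of an RGGB Bayer mosaic (2-D array) into an H x W x 3 RGB image."""
    mosaic = np.asarray(mosaic, dtype=float)
    h, w = mosaic.shape
    rows, cols = np.mgrid[0:h, 0:w]

    # Which photosite carries which filter color (RGGB layout assumed).
    r_mask = (rows % 2 == 0) & (cols % 2 == 0)
    b_mask = (rows % 2 == 1) & (cols % 2 == 1)
    g_mask = ~(r_mask | b_mask)

    # Kernels that average the known neighbors to fill in the missing photosites.
    k_rb = np.array([[0.25, 0.5, 0.25], [0.5, 1.0, 0.5], [0.25, 0.5, 0.25]])
    k_g  = np.array([[0.0, 0.25, 0.0], [0.25, 1.0, 0.25], [0.0, 0.25, 0.0]])

    rgb = np.zeros((h, w, 3))
    rgb[..., 0] = convolve(mosaic * r_mask, k_rb, mode="mirror")
    rgb[..., 1] = convolve(mosaic * g_mask, k_g,  mode="mirror")
    rgb[..., 2] = convolve(mosaic * b_mask, k_rb, mode="mirror")
    return rgb
```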
 
The image sensors in digital cameras cannot record color.

With the exception of the Foveon sensor, a color filter array (a Bayer array) has to be placed in front of the sensor so the colors in a scene can be interpolated by a Raw converter.

I understand that the sensors only "count photons", but surely the color filters in the Bayer array have a particular spectral transmission function just like any other filter. I'm assuming the green filter tries to approximate the spectral sensitivity function, for example.
 
Not quite sure what you're asking -- "standard observer functions"?

I meant the CIE standard observer functions. If the filters match the standard observer functions (as they do in a colorimeter, for example), then one can directly measure the chromaticity of any given light source/object in the scene, and reproduce that faithfully on a display which is well characterized.

A digital camera's raw file is not yet interpolated (demosaiced) into a three channel RGB photo. Digital color spaces like sRGB, Adobe RGB, Bruce RGB, etc. apply in practice to RGB images.

I'm less concerned about the demosaiced product (which probably involves proprietary interpolation and weighting). At this point, I'm trying to get a sense of how to interpret the Bayer-filtered information pre-demosaicing.
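
If I follow it correctly, the colorimeter-style calculation would just be a weighted sum of the spectrum against the three standard observer curves, something like this rough sketch (the sample values are only coarse, approximate CIE 1931 points for illustration):

```python
import numpy as np

# Very coarse samples of the CIE 1931 standard observer (x-bar, y-bar, z-bar) -- illustration
# only; real work uses the full tabulated data at 1-5 nm steps.
wavelengths = np.array([450.0, 500.0, 550.0, 600.0, 650.0])   # nm
xbar = np.array([0.336, 0.005, 0.433, 1.062, 0.284])
ybar = np.array([0.038, 0.323, 0.995, 0.631, 0.107])
zbar = np.array([1.772, 0.272, 0.009, 0.001, 0.000])
spd  = np.array([0.8,   1.0,   1.1,   0.9,   0.7])            # made-up spectral power of the light

# Tristimulus values: integrate (here, sum) the spectrum weighted by each observer curve.
dlam = 50.0                                                    # sample spacing in nm
X = np.sum(spd * xbar) * dlam
Y = np.sum(spd * ybar) * dlam
Z = np.sum(spd * zbar) * dlam

# Chromaticity coordinates locate the color on the CIE xy diagram.
x, y = X / (X + Y + Z), Y / (X + Y + Z)
print(f"x = {x:.3f}, y = {y:.3f}")
```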
 
I meant the CIE standard observer functions. If the filters match the standard observer functions (as they do in a colorimeter, for example), then one can directly measure the chromaticity of any given light source/object in the scene, and reproduce that faithfully on a display which is well characterized.



I'm less concerned about the demosaiced product (which probably involves proprietary interpolation and weighting). At this point, I'm trying to get a sense of how to interpret the Bayer-filtered information pre-demosaicing.

I got you. I don't believe so, and in many cases I believe that info is manufacturer proprietary.

Color filter array - Wikipedia, the free encyclopedia

Joe
 
