DBY
TPF Noob!
Sorry to bring this debate up again; I promise that I researched this extensively beforehand.
I am trying to understand the RAW/JPEG influence on sharpness and have questions about pixel density. I think I have figured this out, but I need someone to check my thinking. For a point of reference, I am looking at a Nikon D700, which has a nominally 12mp sensor. The D700 has three Image Size settings: Large, Medium and Small, and from the manual these settings result in pixel counts of roughly 4.3k x 2.8k = 12mp, 3.2k x 2.1k = 6.8mp and 2.1k x 1.4k = 3.0mp respectively.

My first question is: how does the camera accomplish this change in pixel count? It doesn’t appear to me that the camera is using fewer sensor pixels at each setting, and the result is still a full-frame image with the same framing. When I pixel peep, I can see the rendering degrade going from Large to Medium to Small (i.e. with each reduction in Image Size there are fewer pixels describing the same framed scene, so each pixel covers a larger patch of it). My theory, which I am seeking clarity on, is that the reduced pixel “count” of the Medium and Small settings is not based on a difference in the sensor pixels used, but is actually accomplished by a processing algorithm in the camera that resamples the original 12mp picture down to roughly half the number of pixels at each step. Is this how it works?
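To make my mental model concrete, here is a rough sketch of what I imagine the firmware is doing after demosaicing, written in Python with Pillow. The dimensions come from the manual; the file names and the choice of Lanczos resampling are just my assumptions, not anything Nikon documents.

```python
# Sketch only: simulate the Medium/Small Image Sizes by resampling the
# full-size capture, rather than by reading fewer photosites off the sensor.
from PIL import Image

# Hypothetical full-resolution (Large, ~12mp) frame from the D700.
large = Image.open("d700_large.jpg")                 # about 4256 x 2832 pixels

# Same framing, fewer pixels: each output pixel now describes a bigger
# patch of the scene, which is why pixel peeping looks coarser.
medium = large.resize((3184, 2120), Image.LANCZOS)   # about 6.8mp
small = large.resize((2128, 1416), Image.LANCZOS)    # about 3.0mp

medium.save("d700_medium_simulated.jpg")
small.save("d700_small_simulated.jpg")
```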
My second question is about the sharpness of RAW versus JPEG. I know the raw file doesn’t change in size regardless of the Image Size setting. The Large Image Size is 12mp, and I have pixel peeped the raw file to be certain: the effective pixel density appears to be the same for the Large JPEG and the RAW image. Is this generally true? If so, am I correct to conclude that the enhanced acuity attributed to RAW over JPEG doesn’t come from a different pixel count, but rather from the higher capture bit depth (12 or 14 bits in the case of the D700) versus the 8 bits of JPEG, the absence of lossy JPEG compression in the RAW file, and possibly other processing subtleties? Not trying to start a RAW versus JPEG battle – the engineer in me just needs to know how this works.
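To illustrate the bit-depth part of my question, here is a tiny NumPy sketch of just the quantization step. It ignores everything else a real raw-to-JPEG conversion does (demosaicing, tone curve, white balance, sharpening, the lossy DCT compression itself) and only shows how many tonal levels survive when a 14-bit capture is reduced to 8 bits.

```python
# Sketch only: tonal precision of a 14-bit raw value versus its 8-bit JPEG
# counterpart. A real conversion also applies a tone curve and lossy compression.
import numpy as np

raw_14bit = np.arange(0, 2**14, dtype=np.uint16)   # every possible 14-bit level
jpeg_8bit = (raw_14bit >> 6).astype(np.uint8)      # truncated to 8 bits

print(np.unique(raw_14bit).size)   # 16384 distinct levels in the raw data
print(np.unique(jpeg_8bit).size)   # 256 distinct levels after the 8-bit step
```

In other words, the pixel grid is the same; each raw sample just carries 64x finer tonal steps and hasn’t been through lossy compression.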
