Camera LCD vs Computer screen...

Hardrock

My question is: when viewing a picture on the camera's LCD screen and zooming all the way in, I check the focus on the eyes and it appears to be sharp. But when I upload the picture and bring it into CS3, the eyes are still sharp, just not as sharp as on the LCD. Why? The monitor is a 20-inch Dell UltraSharp (that's all I know about it; since I'm at work I can't check). The only thing I can come up with is that the LCD is only 3 inches and the picture in CS3 is more like 5x7 inches. What do you guys think?
 

What camera? It may sound silly, but it matters. If it's a Nikon, only a Nikon RAW editor (ViewNX or Capture NX 2) is going to be able to read ALL of the RAW data, which possibly includes the in-camera sharpening settings. If you are using ACR for your editing, it probably isn't reading any of that and is giving you a starting image equivalent to the sharpening setting being at 0, which will look soft without any sharpening applied. A slight amount of USM should get you back up where you need to be. This is the biggest reason I've never left Capture NX2.

If you don't shoot Nikon, ignore my post and pretend I wasn't here.
 
If I'm not mistaken, the image on your LCD goes through similar filters as though you were shooting in JPEG (including sharpening filters). If you shoot in RAW, the image on your computer won't have those sharpening filters applied.
 
It's a Canon 50D and I'm shooting JPEG.
 
You're looking at a 3-inch screen on the 50D, which means the image will look sharper just because the screen is smaller. Once you view the image at or near full size, you can see far more detail.

Resize the picture, apply sharpening, and save... then take another look. It will look sharper.
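If it helps to see what "apply sharpening" actually does, here is a toy 1-D unsharp mask in plain Python. This is only a sketch of the principle (real editors such as CS3's Unsharp Mask work in 2-D with a Gaussian blur): sharpened = original + amount × (original − blurred).

```python
# Toy 1-D unsharp mask: sharpened = original + amount * (original - blurred).
# Real editors do this in 2-D with a Gaussian blur; this sketch uses a
# simple 3-tap box blur to show the principle.

def box_blur(pixels):
    """Blur each pixel with its neighbours (edges clamped)."""
    n = len(pixels)
    return [
        (pixels[max(i - 1, 0)] + pixels[i] + pixels[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]

def unsharp_mask(pixels, amount=1.0):
    """Boost local contrast by adding back the detail the blur removed."""
    blurred = box_blur(pixels)
    return [
        min(255, max(0, round(p + amount * (p - b))))
        for p, b in zip(pixels, blurred)
    ]

# A soft edge (a gradual ramp from dark to light)...
soft_edge = [50, 50, 90, 130, 170, 210, 210]
# ...gets steeper (with slight overshoot, the familiar "halo") after
# sharpening, which is what we perceive as "sharper".
print(unsharp_mask(soft_edge, amount=1.5))  # [50, 30, 90, 130, 170, 230, 210]
```

The overshoot values (30 and 230) are the same halos you see when sharpening is pushed too far.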
 
You have a 15 MP camera and by electing to capture images as JPEGs, let it discard up to 80% of the image data.

That leaves you with the equivalent of a 3 MP image.

At least that's if you're shooting as JPEG Fine. If you're shooting JPEG Medium or JPEG Basic it's even worse.

You may already be familiar with how JPEG compression works.

What it does is divide those 15 MP into 8x8 pixel blocks, average out the color information (chroma subsampling), and discard much of the fine detail the image sensor actually captured within each of those 8x8 blocks.

After it does that, it applies crude, global contrast, saturation, and sharpening adjustments, based either on default parameters programmed into the CPU at the factory or on menu parameters you have set in the camera.

Then you get to look at it on the LCD, and the discarded image data is gone. Bye-bye, lost forever.

JPEG is a lossy compression method, which removes information from the image
JPEG - Wikipedia, the free encyclopedia

But, the difference you are seeing is just what you speculated it was. The difference between a small LCD and a desk top monitor.
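A minimal sketch of the "averages the color information" step described above, in plain Python. This is only the 2x2 chroma-subsampling part; real JPEG encoders also convert RGB to YCbCr and run a DCT and quantization on each 8x8 block, which this toy example skips.

```python
# Toy 2x2 chroma subsampling (the "averaging color" part of JPEG encoding).
# Real encoders keep full-resolution luma (Y) and store one averaged chroma
# value per 2x2 block -- then run a DCT and quantization on 8x8 blocks,
# which this sketch omits.

def subsample_2x2(chroma):
    """Replace each 2x2 block of chroma values with its average."""
    rows, cols = len(chroma), len(chroma[0])
    out = []
    for r in range(0, rows, 2):
        row = []
        for c in range(0, cols, 2):
            block = [chroma[r][c], chroma[r][c + 1],
                     chroma[r + 1][c], chroma[r + 1][c + 1]]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# Four distinct chroma values per 2x2 block collapse to one -- that
# information is gone and cannot be recovered from the JPEG:
chroma = [
    [100, 110, 200, 210],
    [120, 130, 220, 230],
]
print(subsample_2x2(chroma))  # [[115.0, 215.0]]
```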
 

Well, I wish I had known that sooner!!! I thought the only real difference was being able to adjust white balance. The only reason I was shooting JPEG was the ease of uploading, editing, and viewing on the monitor. Thanks! From now on it's RAW only.

InTempus, when you say resize, do you mean going from 72 ppi to 300 ppi?
 
At least that's if you're shooting as JPEG Fine. If you're shooting JPEG Medium or JPEG Basic it's even worse.

6 MP Economy, file size 1.1 MB, zoomed to 200%: distortion and halos around edges. Same scene, 6 MP Fine, file size 3.56 MB, zoomed to 200%: looked OK, less distortion.
In fact, if you have to zoom that much to see so little difference between Fine and Economy, it's not worth using Fine at all, as the file size is 3x. I use Normal, since I found burst mode with Fine useless; it took too long to save to the card.
I'm not certain what's happening with Economy. If heavier compression is used, I'd expect it to take longer to save pictures than Fine; in fact, the opposite is true.
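For what it's worth, the save-time observation is consistent with how compressors generally behave: a heavily compressed (smaller) file means fewer bytes to write to a slow memory card, and the write time usually dominates. A rough analogy using Python's standard zlib module (not JPEG, and camera firmware may differ, but the size trade-off has the same shape):

```python
import zlib

# Compressing the same data at different levels: a higher level spends more
# CPU for a smaller output. For a camera, a smaller file also means less
# data to write to a slow memory card, which can dominate total save time --
# a plausible reason "Economy" saves faster than "Fine".
data = b"a gradual gradient of tones " * 5000

heavy = zlib.compress(data, level=9)  # analogous to heavy JPEG compression
light = zlib.compress(data, level=1)  # analogous to light compression

print(len(data), len(light), len(heavy))
assert len(heavy) <= len(light) < len(data)
```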
 

Are you saying there is negligible difference between jpeg normal and raw? Or only negligible difference between jpeg normal and fine?
What camera are you using?
 
Unless you edit your pictures, there will be a negligible difference between JPEG and RAW. Heck, until you start tweaking your RAW converter a bit, the JPEG will likely come out better anyway. Back in my JPEG days I also shot Normal instead of Fine. There was no visible difference.

To get back to the question: the ppi of the image has nothing to do with it until you hit the print button. One likely cause is how the image is displayed. When you zoom in, do you eventually get a 1:1 mapping of image pixels to the pixels on your tiny LCD screen? In Photoshop at 100% (provided your computer monitor is running at its native resolution), the pixel mapping is 1:1.

If the mapping is smaller than 1:1, interpolation will mask sharpness problems. If you only have 2 pixels to display 3 pixels, how do you know whether the middle pixel was white, black, or grey because it wasn't sharp?
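The point above can be shown with a toy downscale in plain Python (real viewers use fancier interpolation than simple averaging, but the information loss works the same way):

```python
# Downscaling 3 pixels to 2 by weighted averaging: a sharp edge and a
# blurred edge end up much closer together on the small display, so a
# tiny preview (like a camera LCD) hides softness.

def downscale_3_to_2(p):
    """Map 3 source pixels onto 2 display pixels by simple averaging."""
    return [(2 * p[0] + p[1]) / 3, (p[1] + 2 * p[2]) / 3]

sharp = [0, 0, 255]    # a crisp black-to-white edge
soft = [0, 128, 255]   # the same edge, but blurred

full_res_diff = max(abs(a - b) for a, b in zip(sharp, soft))
preview_diff = max(abs(a - b) for a, b in
                   zip(downscale_3_to_2(sharp), downscale_3_to_2(soft)))

# At full resolution the blur is obvious (difference of 128 levels);
# after downscaling the difference shrinks to about a third of that.
print(full_res_diff, preview_diff)
```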
 
Unless you edit your pictures there will be negligible difference between JPEG and RAW. Heck until you start tweaking your RAW converter a bit the JPEG will likely come out better anyway.

OK, let's not get carried away here. Any recent version of ACR has tweaks built into the default settings that will apply a bit of sharpening.

Why in this day and age people on a photography forum (in the Beyond the Basics section, no less) would still be advocating JPEG over RAW is beyond me.
 
Tweaks indeed. It's no secret that Nikons and Canons render colours differently. It's also no secret that Photoshop will try to make both of these cameras render colours the same.

I say the JPEG will come out better because that's what people are used to. The number of times I have convinced people to use Lightroom and shoot RAW, only to have them come back and say "but the colours don't look anywhere near as good as my camera," makes me cry. I had the same theory. When I switched to RAW, the skies were rendered in such a light greenish blue that I thought, "What are all these self-proclaimed photographers ranting about? It's all wrong."

Really, all that was happening is that my D200 biases the blue tint of a lovely polarised sky slightly towards purple, while the ACR 3.3 profile biases it slightly towards green. The Adobe Standard profile gets it pretty much right, but hey, that's another change some people complained about. The defaults in ACR simply look like crap to some. A classic example is that it "will apply a bit of sharpening": a friend of mine had his in-camera sharpening set to maximum, and when he switched to RAW he complained about blurry-looking photos...

I fully advocate sticking to JPEG unless you're going to take the creative control RAW offers you and start tweaking a LOT. RAW is a big step backwards if you're not going to use it. It's like buying a Ford F250 custom for city driving. Most of your time is wasted in the petrol station keeping it running for no real advantage.
 
Never trust the camera's LCD... it lies and lies and lies. What it shows is at the mercy of ambient light, viewing angle, and the brightness setting, and it is based on a small JPEG preview regardless of whether you're shooting RAW or not.

The LCD image is only good for confirming composition and NEVER exposure... use the histogram for that.

Don't be fooled by the camera's LCD, regardless of make and model of camera.
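As a sketch of why the histogram beats the LCD for judging exposure, here is a pure-Python clipping check (assuming 8-bit pixel values; an in-camera histogram does essentially this over the preview JPEG):

```python
# Judge exposure from the histogram, not from how bright the LCD looks:
# pixels piled up at 0 (crushed shadows) or 255 (blown highlights) mean
# detail is lost no matter how the preview appears.

def histogram(pixels, bins=256):
    counts = [0] * bins
    for p in pixels:
        counts[p] += 1
    return counts

def clipping_report(pixels):
    h = histogram(pixels)
    n = len(pixels)
    return {
        "blown_highlights_pct": 100 * h[255] / n,
        "crushed_shadows_pct": 100 * h[0] / n,
    }

# An overexposed frame: a quarter of the pixels are pure white.
pixels = [255] * 25 + [180] * 25 + [120] * 25 + [60] * 25
print(clipping_report(pixels))
# {'blown_highlights_pct': 25.0, 'crushed_shadows_pct': 0.0}
```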
 
Unless you edit your pictures there will be negligible difference between JPEG and RAW.

This could not be further from the truth....plain wrong.

RAW means the camera does not post-process the image, as it does with JPEG.

RAW is a file with no post-processing by the camera, so a RAW and its corresponding JPEG can often look very different; I would say this is the case 99% of the time. Also, RAW carries 12, 14, or 16 bits per channel; JPEG only a mere 8... a huge difference in tonal range.
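The bits-per-channel difference above is easy to put numbers on (a quick sketch; "tonal levels" here just means distinct values per channel):

```python
# Distinct tonal levels per channel at each bit depth: 2 ** bits.
for bits in (8, 12, 14, 16):
    print(f"{bits}-bit: {2 ** bits:6d} levels per channel")

# 12-bit RAW has 16x the levels of 8-bit JPEG:
assert 2 ** 12 // 2 ** 8 == 16

# Converting 12-bit to 8-bit quantizes: 16 adjacent RAW values collapse
# into a single JPEG value, which is where subtle tonal gradations go.
def to_8bit(value_12bit):
    return value_12bit >> 4  # drop the 4 least-significant bits

assert to_8bit(4095) == 255 and to_8bit(0) == 0
assert to_8bit(2048) == to_8bit(2063) == 128  # 16 values -> one level
```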
 
