Resolution

I think all of you are probably overdoing the explanations. When one resizes an image and turns it into a JPEG, most software will specify a ppi or dpi figure for the data reduction. Whether it is a mislabeled figure or not isn't really the issue; that is how most software labels the process. The problem is that the OP is trying to judge image quality from low-resolution JPEGs on a computer screen, and that is fairly meaningless. It doesn't say much about the original image quality or, if you prefer, resolution.
 
If it has nothing to do with anything until you print, then how do you deal with placing low-res files on web pages?

When I decrease the resolution of an image, I use the dpi setting in the Image Size dialog box to reduce the size of the images I place on the web.

Should I be doing this in a different manner?

Cheers,

This may have been answered already but here's my take on it.

An 800x600 pixel file at 300ppi is exactly the same as an 800x600 pixel file at 72ppi, which is the same as an 800x600 pixel file at 3,000,000ppi. The resolution doesn't mean anything; the total number of pixels is all that matters, in this case 800x600px.

When you decrease resolution you are actually reducing the number of pixels by resampling (take the Resample button off and then change the resolution to see the difference). Keep your eye on the number of pixels at the top of the Image Size menu.

When you save for the web you save pixels not resolution. Hope that makes sense.
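A quick way to see this is with plain arithmetic - no imaging library needed. This is just a sketch of the point above: the same 800x600 pixels produce different print sizes at different ppi settings, but the pixel data itself never changes.

```python
# The same 800x600-pixel image "printed" at different ppi settings.
# Only the physical print size changes; the pixel data is identical.
pixels = (800, 600)

for ppi in (72, 300, 3_000_000):
    width_in = pixels[0] / ppi
    height_in = pixels[1] / ppi
    print(f'{ppi} ppi -> {width_in:.2f}" x {height_in:.2f}" print')
```

At 72 ppi that works out to roughly an 11.11" x 8.33" print; at 300 ppi, 2.67" x 2.00". Either way the file contains exactly 480,000 pixels.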
 
DPI is also used in scanning. Not just in printing.

If you read my posts above, you will see I touched on scanning:


"For a start, dpi only relates to the printer's output (or scanning resolution) - nothing else. Many people make the mistake of using dpi instead of ppi (see above)".

I didn't dwell on this as it's slightly different to a printing resolution.
 
I think all of you are probably overdoing the explanations. When one resizes an image and turns it into a JPEG, most software will specify a ppi or dpi figure for the data reduction. Whether it is a mislabeled figure or not isn't really the issue.

True, but it does add to the confusion. It's a bit like comparing mileage between diesel and gasoline cars. I think it's important to make the distinction, regardless of what the software writers have done.

But yeah, whatever it's called...
 
Yup, although that's another place where they misname it. It should be PPI, as you are getting pixels, not ink dots. For scanning, comparing one PPI to another for the same image gives you a good indication of comparative quality, as the image size stays fixed. For film, you want to get into several thousand PPI, but for prints, generally anything over 300 is a waste. Prints don't have high definition.

Mark - it is in fact, believe it or not, DPI when scanning!! :lol:

You scan an analogue image in dots per inch and turn it into a digital file. The digital file is then read in pixels once it's in your PC.

The higher the dpi you scan at, the better the quality and the larger you can print it (quality depends on scanner quality too). This is how high-quality scanners turn small negatives into large prints: they scan at very high resolutions to allow you to print your image at reasonable resolutions.

So dpi for scanning in an image and printing out an image.

ppi for setting up the image resolution for print.
 
I think all of you are probably overdoing the explanations. When one resizes an image and turns it into a JPEG, most software will specify a ppi or dpi figure for the data reduction. Whether it is a mislabeled figure or not isn't really the issue; that is how most software labels the process. The problem is that the OP is trying to judge image quality from low-resolution JPEGs on a computer screen, and that is fairly meaningless. It doesn't say much about the original image quality or, if you prefer, resolution.

Fred, I see what you are saying, but if you understand it, you will get more out of resizing and printing.

The OP needs to know why the output resolution of the camera is meaningless.
 
Mark - it is in fact, believe it or not, DPI when scanning!! :lol:

You scan an analogue image in dots per inch and turn it into a digital file. The digital file is then read in pixels once it's in your PC.

Ok, this is a total sidetrack, but that really shouldn't be DPI. That may be the standard way of saying it, but you aren't scanning individual analog dots. Whatever is under that single "dot" is merged into a single pixel. You could be scanning something that has a super fine definition, like 1000 DPI, but if you are scanning at 300 PPI, you are only going to get 300 "dots" per inch, the actual resulting pixels.

That's why I think this is such a mess. I'll just let it drop now, as I'm obviously not going to change anything in the industry. I think they messed things up though.
 
huh, the same old confusion :p

Well, people should just ignore all that DPI/PPI stuff unless they scan or print, and think in absolute pixels in their workflow for all other steps.
 
Ok, this is a total sidetrack, but that really shouldn't be DPI. That may be the standard way of saying it, but you aren't scanning individual analog dots. Whatever is under that single "dot" is merged into a single pixel. You could be scanning something that has a super fine definition, like 1000 DPI, but if you are scanning at 300 PPI, you are only going to get 300 "dots" per inch, the actual resulting pixels.

That's why I think this is such a mess. I'll just let it drop now, as I'm obviously not going to change anything in the industry. I think they messed things up though.

Say you start with a 6"x6" negative: if you scan at 300dpi and you want to print at 300ppi, then you'll get a 6"x6" print.

The analogue image is made up of dots, not pixels, so you need to scan at a high enough dpi to create the number of pixels in your digital image. You have the idea, but you are really scanning in dpi even though the output is in ppi...

All rather confusing, and in the scanning example it means more or less the same thing; in printing, however, it is very different. For example, it's quite easy to print a low-ppi image (72ppi) at a very high dpi resolution (5760dpi) on your printer - depending on the printer.
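The scan-then-print arithmetic above can be sketched in a few lines of Python. This is only an illustration of the calculation being described - the function name and the 2400 dpi figure are mine, not from any real scanner software:

```python
def print_size_inches(original_inches, scan_dpi, print_ppi):
    """Pixels captured = original size x scan dpi;
    print size = pixels captured / print ppi."""
    pixels = original_inches * scan_dpi
    return pixels / print_ppi

# 6" negative scanned at 300 dpi, printed at 300 ppi -> a 6" print
print(print_size_inches(6, 300, 300))   # 6.0
# Same 6" negative scanned at 2400 dpi -> a 48" print at 300 ppi
print(print_size_inches(6, 2400, 300))  # 48.0
```

This is exactly why high-end film scanners run at thousands of dpi: the extra scan resolution is what lets a small negative become a large print at a sensible print ppi.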

We're getting there. Here's a good link on scanning:

http://www.digitalmemoriesonline.net/scan/dpi.htm

Cheers
Jim
 
huh, the same old confusion :p

Well, people should just ignore all that DPI/PPI stuff unless they scan or print, and think in absolute pixels in their workflow for all other steps.

Exactly what I said above but it helps to understand why :thumbup:
 
Debating industry semantics is a waste of time, and only clouds the issue for the original poster. PPI and DPI, when used to determine print size, effectively mean the same thing. If you scan a negative at 300 Dots Per Inch, it becomes a digital file. That digital file, in order to print at 300 Dots Per Inch, must be set to 300 Pixels Per Inch in the software. The only confusion is in trying to differentiate between the two, when for all practical purposes they mean the same thing.

You are doing nothing but adding to the confusion for the original poster.

Pixel dimensions / PPI (or DPI) = print size. DPI and/or PPI are irrelevant until you decide to print, in which case you use that formula to determine the dimensions of your print.
 
While all the answers and the discussion were interesting and educational, can anyone answer the question? Why are the resolutions nearly the same for the 3.1 and 10.2 MP cameras but much lower for the 4.1 MP camera?

Is the 4.1 camera's 75 PPI producing a much larger overall image, while the 3.1 is a smaller image with more PPI because it's smaller in actual size?

What determines the resolution of an image? When checking the resolution of 3 different images taken by 3 cameras:
Image 1, taken with a 3.1 megapixel camera = 230dpi
Image 2, taken with a 4.1 megapixel camera = 75dpi
Image 3, taken with a 10.2 megapixel camera = 240dpi
 
DPI and PPI have nothing to do with the inherent resolution of the image. They only factor in when you are printing. Different cameras and different software will give different "default" PPI resolutions. They mean nothing. If you open a photo in Photoshop, go to Image Size, and uncheck the Resample button, you can change the DPI to anything you want, and no changes are made to the resolution of the image. The pixel dimensions remain constant. In digital imaging, the pixel dimensions dictate what output resolution can be achieved.
 
Assuming a 2:3 ratio...

3.1 MP = roughly 1450 x 2175 pixels. At 230 PPI, that's a 6.3" x 9.5" print.
4.1 MP = roughly 1650 x 2475 pixels. At 75 PPI, that's a 22" x 33" print.
10.2 MP = roughly 2600 x 3900 pixels. At 240 PPI, that's a 10.8" x 16.25" print.

To do a real comparison, we need all the same print size or all the same PPI number. Let's say an 8"x12" print size.

At an 8" x 12" print, a 3.1 MP image is about 181 PPI.
At an 8" x 12" print, a 4.1 MP image is about 206 PPI.
At an 8" x 12" print, a 10.2 MP image is about 325 PPI.

Now you can make an accurate comparison between the PPI (DPI) numbers, and higher means higher quality. But that's only because the print sizes (in inches) are the same. You could also use 4"x6" if you wanted to, but all three would need to be figured at 4"x6".

Note that at the original numbers, the 10.2 MP camera gives you a much bigger print at the same PPI than the 3.1 MP camera does. This shows that it's the number of pixels that drives the quality, not just the PPI number.
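The comparison above can be sketched in Python. The 2:3 aspect ratio is an assumption carried over from the post, the function name is illustrative, and the results can differ by a ppi or so from the hand-rounded figures above:

```python
# Estimate the ppi each camera yields on the same 8" x 12" print,
# assuming a 2:3 aspect ratio (an assumption, not a camera spec).
def ppi_at_print(megapixels, long_side_in=12.0, aspect=(2, 3)):
    total_px = megapixels * 1_000_000
    short_r, long_r = aspect
    long_px = (total_px * long_r / short_r) ** 0.5  # long-side pixel count
    return long_px / long_side_in

for mp in (3.1, 4.1, 10.2):
    print(f"{mp} MP -> {ppi_at_print(mp):.0f} ppi on an 8x12 print")
```

Run it and the 10.2 MP camera lands well above 300 ppi on an 8"x12" while the 3.1 MP camera lands around 180 ppi - the megapixel count, not the default ppi tag, is what sets the achievable print quality.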
 
