
pixel count

PPI and DPI are not the same thing, and the terms are not interchangeable.
PPI does not apply to images displayed electronically. PPI only applies when an image is printed.
So PPI is print resolution. Pixel size is electronic display resolution, if the application displaying the image does not resize it.
That's simply not at all accurate. PPI (or PPCM) is a common term for pixel density on displays (List of displays by pixel density - Wikipedia, the free encyclopedia). Though it likely shouldn't be ("pixel" = "picture element", and that's a rather ambiguous term for a screen).

Confusingly: a displayed image (on a given device at a given zoom) could also be said to have a PPI which is separate from (but no higher than) that of the display.

When it comes to print: DPI is the same as the first PPI above. It's literally the number of dots in an inch. PPI is the number (again: separate from, but no higher than, DPI) of picture elements in the same inch.

Say I have a picture that's 4 pixels high and 4 pixels wide (a 16-pixel sensor). Say I print that at 4" x 4". My PPI is 1. My DPI can still be anything (though if it's less than 1 I have a problem). So let's say my DPI was 100. That means each picture element (which is 1"x1") will contain 10,000 identical blobs of ink in a 100x100 array.
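A minimal sketch of that arithmetic in Python (the function name and inputs are purely illustrative):

    def print_resolution(pixels_wide, pixels_high, print_w_in, print_h_in, printer_dpi):
        ppi_x = pixels_wide / print_w_in        # picture elements per printed inch
        ppi_y = pixels_high / print_h_in
        # Ink dots laid down inside one picture element, per axis.
        dots_per_pixel = (printer_dpi / ppi_x) * (printer_dpi / ppi_y)
        return ppi_x, ppi_y, dots_per_pixel

    # The 4x4-pixel image printed at 4" x 4" on a 100 DPI printer:
    print(print_resolution(4, 4, 4.0, 4.0, 100))   # (1.0, 1.0, 10000.0)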

For an 8x10 print anything over about 360 ppi yields no improvement in image quality that the human eye can detect.
Assuming that there's no limitation from the medium that sets a max useful resolution (such as ink-bleed in paper), the detail an eye can resolve is measured in pixels per unit of angle of view (AoV). When translated to PPI, that's not related to the size of the print, but rather to the likely viewing distance.

A giant billboard seen from 1000' has a different max eye-resolvable resolution than the same billboard seen from 5'.
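A hedged sketch of the viewing-distance point, assuming roughly one arcminute of visual acuity (a common rule of thumb, not a figure anyone quoted above):

    import math

    def max_resolvable_ppi(viewing_distance_in, acuity_arcmin=1.0):
        # Smallest resolvable detail at this distance, in inches.
        detail_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
        return 1.0 / detail_in

    print(round(max_resolvable_ppi(10)))            # ~344 PPI at a 10" viewing distance
    print(round(max_resolvable_ppi(1000 * 12), 1))  # ~0.3 PPI for a billboard seen from 1000 ft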

"When it comes to print: DPI is the same as the first PPI above. It's literally the number of dots in an inch. PPI is the number (again: separate from but no higher than DPI) picture elements in the same inch."
I'm hoping this isn't what you really meant to say. DPI is not the same as any variety of PPI. Keith gave an appropriate answer.

PPI in the context of the OP's question is the resolution of a digital image when printed at a specified size: Pixels per inch. When displayed by Photoshop it's listed under the heading Document Size clearly referencing the hard copy. It is a value assigned to the digital photo.

The PPI of a display whether your phone, laptop or computer monitor is not what the OP is asking about. Granted it is confusing that the same acronym is used, but it's not hard to separate the two in context. For example when someone here asks about ACR it's a fair bet they're asking about Adobe Camera Raw and not the Association for Consumer Research or an Adaptive Combat Rifle. Likewise when someone here asks about PPI it's a safe bet they're asking about the resolution of their digital photo to a specific print size and not the pixel density of their iPhone screen. PPI as it refers to electronic displays doesn't need to be and shouldn't be part of this thread. It wasn't brought up in the first place.

DPI is dots per inch: the dots of ink per inch laid down on the paper by the printer. It is a value assigned to the physical printer. DPI is not the same nor is it in any way interchangeable with the PPI of a digital photo. DPI is also not the same as the PPI of a display. The way a printer applies an image to paper is sufficiently different from the way a digital photo is displayed on an electronic screen that it's wrong to claim they are the same.

Joe
 
Question,
What is the highest pixel count file required to print the highest quality 8x10?
The lower end is obvious, but at what point are you just wasting space or even degrading the photo?

The 300 PPI figure got mentioned here a lot. Keith gave you the 360 PPI figure as a physical limit for human eyesight when viewing an 8x10 print from a normal 8x10 print viewing distance and I'll 2nd that.

That 300 PPI figure is very commonly used and has an historical foundation. Prior to modern digital printers, photo-quality 4-color offset output (for example the front cover of Cosmo, a pricey coffee table book, or a museum monograph) was printed using a 2400 DPI imagesetter to create the press plates. That 2400 DPI equated to 300 PPI in the image. The actual math involved to explain that was above the level of the average photog, so they just learned to memorize that 300 number -- that was easy -- 300 = good, got it!
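For the curious, here is a sketch of the prepress math being alluded to, under the usual assumptions (a 150 LPI halftone screen and the conventional "image PPI = 2 x screen ruling" rule, neither of which is stated above):

    # Hedged sketch of the traditional prepress math behind "2400 DPI -> 300 PPI".
    imagesetter_dpi = 2400          # device dots per inch on the film/plate
    screen_lpi = 150                # assumed halftone screen ruling, lines per inch

    cell = imagesetter_dpi // screen_lpi    # device dots per halftone cell side
    gray_levels = cell * cell               # ~tones each halftone cell can render
    image_ppi = 2 * screen_lpi              # conventional recommended image resolution

    print(cell, gray_levels, image_ppi)     # 16 256 300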

Times and technology have changed a bit but the math isn't getting easier and so "300 = good, got it!" isn't going anywhere.

What got left out of this thread for the most part is specification of the actual printer. The PPI of your image should match the physical capability of the printer. Refer to the article Derrel linked. It's a little dated but it's a good article and you'll note that Mr. Hogan starts right off talking about the printer. Early on in the article for example he notes that the figures for PPI he's using are derived from the 1440 DPI capability of the printer.

So to be more precise with your answer you want to identify the physical capability of the printing technology you're using. Since you asked about wasting space, 300 PPI may be a little overkill depending on the printer, but "300 = good" is still a safe bet. As for degrading the photo if the PPI is too high: it's all done with software these days and the software is smart enough to avoid that outcome.
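If you do want to resample to a target PPI yourself, here is a minimal sketch using Pillow; the library choice and filenames are just placeholders, and it assumes the image is already cropped to the print's aspect ratio:

    from PIL import Image

    def resample_for_print(path, print_w_in, print_h_in, target_ppi, out_path):
        img = Image.open(path)
        target = (round(print_w_in * target_ppi), round(print_h_in * target_ppi))
        # Modern resampling filters handle both up- and down-sizing gracefully,
        # which is why "too many pixels" rarely degrades the print in practice.
        img.resize(target, Image.LANCZOS).save(out_path, dpi=(target_ppi, target_ppi))

    # e.g. resample_for_print("photo.jpg", 8, 10, 300, "photo_8x10_300ppi.jpg")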

Joe
 
What do you guys think of the digital info prints, where if you send your file to them they convert it over for you to an 8x10 without you cropping? Seems to work for me; I've never cropped anything that was close.
 
Rick.. Now I understand. The lab you are using does not have an advanced website. So basically what they are asking is for you to crop your file to a 4:5 ratio! If you were to send them your native file at a 2:3 ratio, they would not know which side to crop. Does that make sense? I personally would just send them the 4:5 ratio and leave it at whatever resolution you've got, without resizing.

For comparison, let's say you print it at Costco. They have a modern website where they will ask you how you want to crop your 2:3 to a 4:5 ratio. It will overlay a 4:5 red box on top of your image and ask you to tell them how you want it cropped. The website you are using is just old school.

All the different labs I use have modern websites. I always send them uncropped and not resized.
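For anyone who wants to do the crop themselves, here is a minimal sketch of the largest 4:5 box inside a frame, centered here for simplicity (the overlay tools described above let you slide it):

    def centered_crop_box(width_px, height_px, target_w=4, target_h=5):
        target = target_w / target_h
        if width_px / height_px > target:       # frame too wide: trim the sides
            new_w = round(height_px * target)
            left = (width_px - new_w) // 2
            return (left, 0, left + new_w, height_px)
        else:                                   # frame too tall: trim top and bottom
            new_h = round(width_px / target)
            top = (height_px - new_h) // 2
            return (0, top, width_px, top + new_h)

    # A 4000x6000 (2:3 portrait) frame cropped for an 8x10:
    print(centered_crop_box(4000, 6000))        # (0, 500, 4000, 5500) -> 4000x5000 px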
 
Thanks Robin. I think I got it now. Keep the 4:5 ratio and 300 PPI regardless of what size the software says the print would be.
 
lol.. stop using that PPI. Just crop it to the ratio and leave whatever pixels you have, with no resizing. 300 ppi 4x6 is the same thing as 150 ppi 8x12. Just forget about the PPI. There has to be a check mark somewhere telling the software to just crop it without resizing. This ppi thing confuses you, I think.
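The arithmetic behind that, as a quick sketch with an illustrative 1800x1200-pixel file:

    pixels = (1800, 1200)                        # an illustrative 1800x1200-pixel file

    for print_w, print_h in [(6, 4), (12, 8)]:   # print sizes in inches
        ppi = pixels[0] / print_w
        print(f"{print_w}x{print_h} in -> {ppi:.0f} PPI")
    # 6x4 in -> 300 PPI
    # 12x8 in -> 150 PPI   (same file, nothing resized)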
 
PPI, DPI FBI bah, I give up and just send stuff to the printers I trust to do a good job.
 
Another poster was using the term dpi inaccurately in reference to digital photograph size


I think that may have been me. I regret that you are offended by the use of the term dpi, but you need to get over it, and accept that the name has always been dpi. The real world uses both terms, interchangeably. Yes, ppi can work, but I say dpi, no apologies, since that has always been its name (since before the newbies that don't realize that).

Bottom line: It is said both ways. It is absolutely necessary that we understand it said either way. :)
Instead of telling newbies that everything they hear is wrong, we should explain the real world and teach them to expect it either way.
I am not arguing that you should say dpi, suit yourself, it is said both ways. I am arguing that I am going to say it as dpi. Get over it. :)


FYI:

In English, words are understood in context of use. If dpi is about ink drops, it is about ink drops. If dpi is about pixels, it is about pixels. Image files do not contain any ink drops, so any discussion about images is about pixels. Seems pretty simple.

Scanner ratings always call it dpi, also referring to pixels of course (scanners don't use ink dots). You have never heard of a 4800 ppi scanner rating (and there are no ink drops used in scanners - scanners create pixels of course).

Continuous tone printers (dye-subs, and Fuji Frontier types) don't print discrete ink drops of three colors like inkjet printers must - instead they mix the color of the pixel directly, and they print pixels (called continuous tone). There are no dithered ink dots then, just pixels. But these printer ratings also still refer to the spacing of those image pixels with the term dpi, simply because dpi has always been the name for "pixels per inch".

The formal technical specifications at the very heart of our digital imaging definitions use dpi:



You are saying all those good manufacturers who established everything were wrong. Which is of course nonsense, so I am saying you should get over it.

All of the above is good enough for me. It is called dpi. It always was called dpi. Those fundamental and elite specification documents do not use the term ppi one time - because dpi has simply always been the name of it. I always say dpi too, for same reason, simply because that has always been the name for pixel resolution. Instead of telling newbies everything they hear is wrong, it seems much better to help them understand what they do hear.
 
I think that may have been me. I regret that you are offended by the use of the term dpi, but you need to get over it, and accept that the name has always been dpi. The real world uses both terms, interchangeably. Yes, ppi can work, but I say dpi, no apologies, since that has always been its name (since before the newbies that don't realize that).

They really aren't the same thing at all. DPI is the number of little blobs of ink per inch. PPI is the number of potentially unique colors/shades per inch.

The real world also confuses "figuratively" and "literally", but that does not change the meaning of the words (though give it a century or two...)

In English, words are understood in context of use. If dpi is about ink drops, it is about ink drops. If dpi is about pixels, it is about pixels. Image files do not contain any ink drops, so any discussion about images is about pixels. Seems pretty simple.
I printed my picture at 600dpi.

If we use them interchangeably, you have no idea what I said. It's ambiguous. (though the reality is that's already a problem with screens).

Yes: words may be viewed differently in context "I'm reading a book" vs "I'm gonna book the subject"... but that doesn't mean you can use the wrong word and just appeal to context "I was so embarrassed that I literally died" , "My computer was stuck so I jiggled the carburetor"

But even ignoring the detail of whether some dictionary or another may support you: using them interchangeably is ambiguous.

I can use "dollar" and "buck" interchangeably, because in the context of money, there's only one thing either means (yes, I know more than one currency is called the "dollar"); but when discussing pets I should not use "cat" and "dog" interchangeably because then I become unintelligible.

You are right with your statement, but it's double edged. The importance of language is communication. Any word is fine for any purpose as long as it is understood. My concern is that it's not understandable. If you told me you printed something at 300dpi, I have no way of being sure what you meant.

Scanner ratings always call it dpi, also referring to pixels of course (scanners don't use ink dots). You have never heard of a 4800 ppi scanner rating (and there are no ink drops used in scanners - scanners create pixels of course).
That's a funny case. A scanner is functionally a reverse printer. But I'm not sure (pixels "created" by imperfections aside) that the logic really holds.

If I print a 20mp pic, then make a 40mp scan of it, have I really created a higher resolution picture?

Continuous tone printers (dye-subs, and Fuji Frontier types) don't print discrete ink drops of three colors like inkjet printers must - instead they mix the color of the pixel directly, and they print pixels (called continuous tone). There are no dithered ink dots then, just pixels. But these printer ratings also still refer to the spacing of those image pixels with the term dpi, simply because dpi has always been the name for "pixels per inch".

I suspect it's "dpi equivalent", which again translates to "maximum possible PPI of the device"

The formal technical specifications at the very heart of our digital imaging definitions use dpi:


Another possibility is that you are overgeneralizing.

The standard seems to be about aspect ratio control... information about (quoting Kodak here) "the digitizer used to scan"; but it's an interesting argument. It's interesting that JPEG and TIFF both use the exact same language. I think we are both assuming from ignorance in interpreting these specs.
 
I printed my picture at 600dpi.

If we use them interchangeably, you have no idea what I said. It's ambiguous. (though the reality is that's already a problem with screens).

If your use of "picture" means the image file, then of course it means pixels; there is no other choice applicable to images. There are no ink drops in image files.
If your "picture" means a print on paper, it could mean ink drops, referring to the paper.
If you say a few more words about what you are talking about, the context will make it clear to all.

Yes: words may be viewed differently in context "I'm reading a book" vs "I'm gonna book the subject"... but that doesn't mean you can use the wrong word and just appeal to context "I was so embarrassed that I literally died" , "My computer was stuck so I jiggled the carburetor"

But even ignoring the detail of whether some dictionary or another may support you: using them interchangeably is ambiguous.

Yes, the English language is known for multiple definitions of words. We do have to learn to interpret in the context. There are hundreds of definitions for many common words, one of my favorites for this is the word "set". It is not hard to deal with it.

I can use "dollar" and "buck" interchangeably, because in the context of money, there's only one thing either means (yes, I know more than one currency is called the "dollar"); but when discussing pets I should not use "cat" and "dog" interchangeably because then I become unintelligible.

You are right with your statement, but it's double edged. The importance of language is communication. Any word is fine for any purpose as long as it is understood. My concern is that it's not understandable. If you told me you printed something at 300dpi, I have no way of being sure what you meant.

That's a funny case. A scanner is functionally a reverse printer. But I'm not sure (pixels "created" by imperfections aside) that the logic really holds.

If I print a 20mp pic, then make a 40mp scan of it, have I really created a higher resolution picture?

In this context, images are like the money, since there is also only one thing dpi can possibly mean if applicable to an image. There are no ink drops in image files.

Sorry to have to say that your 40mp is reaching for an entirely different subject. Yes, it will be 40mp, but discussing perceptible detail then is a different subject.

Of course, scanners do create pixels, and only pixels, at a specified dpi setting, which does mean pixels per inch. And scanners are rated like maybe 4800 dpi. There are only pixels, and no ink drops involved.

If you scan an 8x10 inch paper at 300 dpi, you will create an image size of 2400x3000 pixels. That is the sole purpose of doing it.
dpi means pixels, of course scanners create no ink drops. There is really no way it can be confused.
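A quick sketch of that scanner arithmetic (the dpi setting times the inches of the original gives the pixels created):

    original = (8, 10)                     # original size in inches
    for dpi in (300, 600):
        w, h = original[0] * dpi, original[1] * dpi
        print(f"{dpi} dpi -> {w}x{h} px ({w * h / 1e6:.1f} MP)")
    # 300 dpi -> 2400x3000 px (7.2 MP)
    # 600 dpi -> 4800x6000 px (28.8 MP)  (more pixels, not necessarily more real detail)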

Cameras create pixels too, but not in any way relating to paper size or inches or dpi. However, for no good reason, the camera Exif does still commonly assign some arbitrary dpi value, which has nothing to do with anything.

Actually there is a reason. It simply prevents Adobe from treating a blank Exif dpi field as 72 dpi (or 72 ppi if you prefer), which is their way, and which would still have no meaning even if they did. It just looks so dumb when Adobe calls it 72 dpi and says our camera image will print 6x4 feet. :) There is no reason for that, and cameras have learned to try to avoid that dumb situation (because newbies may think the camera did it).

I believe you to be aware enough to know the term "72 dpi" is NOT about ink drops. We know it is about the old notion that older video screens are 72 dpi somehow (even though they only show pixels, and video systems have no definition of inches or dpi, and we know better now). The 72 dpi video nonsense started in 1984 on the first Macintosh nine inch 512x342 pixel screen. They were ahead of their time, we are just now coming back around to that size screen. :) Many of us have outgrown the notion now, but not all of us.

But if we search Google for "72 dpi" (with the quotes, to make it a phrase), we get 20.5 million hits.
If we search for "72 ppi" (which is dumb, but we can look), we only get 1/2 million hits. 40x more usage for 72 dpi than 72 ppi. Yet, you and I KNOW this is always about video and always about pixels, and the subject of 72 dpi is NEVER about ink drops. So anyone that imagines that only 2.5% of all users are correct really needs to reevaluate their own notions. Really. :)

This shows the magnitude of the popular usage of dpi, which all of the evidence shows was previously and erroneously called "wrong" here. LOL It is actually funny, and of course won't have much effect (except to confuse a few newbies).

Continuous tone printers (dye-subs, and Fuji Frontier types) don't print discrete ink drops of three colors like inkjet printers must - instead they mix the color of the pixel directly, and they print pixels (called continuous tone). There are no dithered ink dots then, just pixels. But these printer ratings also still refer to the spacing of those image pixels with the term dpi, simply because dpi has always been the name for "pixels per inch".

I suspect it's "dpi equivalent", which again translates to "maximum possible PPI of the device"

Yes, that's obviously correct, of course dpi means pixels per inch. Newbies seem not to know that is what it has always meant in the industry. There simply are no ink drops in scanners or dye subs or chemical paper printers, but they have always used dpi for pixels, and still do.

The formal technical specifications at the very heart of our digital imaging definitions use dpi:


Another possibility is that you are overgeneralizing.

It was not me who wrote those specification documents. The specs say dpi, and did NOT mention ppi. I find it real hard to fault dpi. :)

A few examples in the specifications we use:

JPEG page 2: "X and Y pixel density (dots per inch or dots per cm)" (this is stored in the Exif fields)

TIFF page 38: "Even if the decision is arbitrary, it might be better to use dots per inch or dots per centimeter, and to pick XResolution and YResolution"

EXIF page 90: "IDPI -Dots Per Inch. Stores the dots per inch (DPI) setting of the digitizer used to produce the file."

This last is the dpi value in the Exif data our camera writes, which Adobe does call ppi, but it has zero meaning so far at that camera stage.

NONE of this is about ink drops.

And actually, the files store pixels and could not care less what size we will print it; the point of storing dpi there is to preserve an original size in inches on paper (scanners, for example). But of course, cameras have no concept of any original size in inches... But it is not important yet; we will handle dpi after we come up with some desired printed size, if any.
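A hedged sketch of that point, assuming Pillow and a placeholder filename: rewriting the stored dpi changes the implied print size, not a single pixel:

    from PIL import Image

    img = Image.open("photo.jpg")                       # placeholder filename
    print(img.size, img.info.get("dpi"))                # pixel dimensions and stored density
    img.save("photo_300.jpg", dpi=(300, 300))           # same pixels, new implied print size

    w, h = img.size
    print(w / 300, "x", h / 300, "inches at 300 dpi")   # the only thing the number changes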

That dpi is simply the name of the term for image resolution. Always has been. Looks like it always will be. Yes, dpi does mean pixels per inch, pixels are a kind of a colored dot. That is not hard. And yes, we can say ppi if we choose (but I don't, I am used to saying dpi, and prefer to respect the name of it).

The standard seems to be about aspect ratio control... information about (quoting Kodak here) "the digitizer used to scan"; but it's an interesting argument. It's interesting that JPEG and TIFF both use the exact same language. I think we are both assuming from ignorance in interpreting these specs.

Aspect there just refers to the pixel aspect ratio, and relates to some video uses that can have non-square pixel definitions. For example, old SD DVD is 720x480 (pixels not square) but displays as 640x480 (square pixels on PCs).

The subject issue here is that some of us want to make up our own notions and assign their own made up rules for what the rest of us should always say to agree with them. They want to call us out on it if we diverge from their view. I instead choose to agree with those that define the terms, not the newbies that don't understand it yet. Tradition, and all that. :)

My only point is that newbies should be instructed that we will always see it both ways, and that it is necessary to learn to understand it both ways.
It is quite easy. If about images, dpi is about pixels. If about printer ink drops, it is about ink drops.

I have no problem if you want to call it ppi. I can understand either way. You should also learn to understand either way, when I and others prefer the standard term dpi. You will be missing out big time if you cannot understand both ways, and that would be a shame.

I do NOT stand up and shout that you can only say it one way. Your use is your choice, not mine, and frankly, I could not care less what you choose.
I do say that we ALL have to understand it either way, because of course we obviously always see it both ways. We do need to learn the real world. But yes, it has always been called dpi.

But when some of us take it on our own to start changing the rules on the standards writers, and on the scanner manufacturers, and on the dye sub printer manufacturers, and on the graphics industry for years before all the newbies got involved with digital, just because they fancy their own choice, then it does seem a little pushy. And ignorant. :)

My main issue is that it can only then cause unnecessary confusion for the newbies who of course obviously do see dpi everywhere. Telling them everything they see is wrong is a stupid and harmful teaching tool. They deserve better. We need to tell them what they see means instead. It is not that hard.
 