I printed my picture at 600dpi.
If we use them interchangeably, you have no idea what I said. It's ambiguous. (though the reality is that's already a problem with screens).
If your "picture" means an image file, then it obviously means pixels; there is no other choice applicable to images. There are no ink drops in image files.
If your "picture" means a print on paper, it could mean ink drops, since then you are actually referring to the paper.
If you say a few more words about what you are talking about, the context will make it clear to all.
Yes, words may be read differently in context: "I'm reading a book" vs. "I'm gonna book the subject"... but that doesn't mean you can use the wrong word and just appeal to context: "I was so embarrassed that I literally died", "My computer was stuck so I jiggled the carburetor".
But even ignoring the detail of whether some dictionary or another may support you: using them interchangeably is ambiguous.
Yes, the English language is known for multiple definitions of words. We do have to learn to interpret in the context. There are hundreds of definitions for many common words, one of my favorites for this is the word "set". It is not hard to deal with it.
I can use "dollar" and "buck" interchangeably, because in the context of money, there's only one thing either means (yes, I know more than one currency is called the "dollar"); but when discussing pets I should not use "cat" and "dog" interchangeably because then I become unintelligible.
You are right about that, but it's double-edged. The whole point of language is communication. Any word is fine for any purpose as long as it is understood. My concern is that it's not understandable. If you told me you printed something at 300 dpi, I would have no way of being sure what you meant.
That's a funny case. A scanner is functionally a reverse printer. But I'm not sure (pixels "created" by imperfections aside) that the logic really holds.
If I print a 20mp pic, then make a 40mp scan of it, have I really created a higher resolution picture?
In this context, images are like the money, since there is also only one thing dpi can possibly mean if applicable to an image. There are no ink drops in image files.
Sorry to have to say that your 40mp example is reaching for an entirely different subject. Yes, it will be 40mp, but perceptible detail is another matter entirely.
Of course, scanners do create pixels, and only pixels, at a specified dpi setting, which does mean pixels per inch. And scanners are rated like maybe 4800 dpi. There are only pixels, and no ink drops involved.
If you scan an 8x10 inch paper at 300 dpi, you will create an image of 2400x3000 pixels. That is the sole purpose of doing it.
dpi means pixels, of course scanners create no ink drops. There is really no way it can be confused.
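To make that arithmetic concrete, here is a tiny sketch in Python (the 8x10 inch and 300/600 dpi numbers are just the examples from above; the function name is mine):

    # Pixels created by a scan = paper size in inches x the scan dpi setting
    # (dpi here being the scanner's sampling rate, i.e. pixels per inch).
    def scan_pixels(width_inches, height_inches, dpi):
        return round(width_inches * dpi), round(height_inches * dpi)

    print(scan_pixels(8, 10, 300))   # -> (2400, 3000), the 8x10 inch example above
    print(scan_pixels(8, 10, 600))   # -> (4800, 6000), double the dpi, double the pixels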
Cameras create pixels too, but not in any way relating to paper size or inches or dpi. However, for no good reason, the camera Exif does still commonly assign some arbitrary dpi value, which has nothing to do with anything.
Actually there is a reason. It simply prevents Adobe from treating a blank Exif dpi field as 72 dpi (or 72 ppi if you prefer), which is their way, and which would still have no meaning even if they did. It just looks so dumb when Adobe calls it 72 dpi and says our camera image will print 6x4 feet.

There is no reason for that 72 dpi default, and cameras have learned to try to avoid that dumb situation (because newbies may think the camera did it).
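To see why that default produces such silly print sizes, run the same arithmetic the other way. A rough Python sketch, using an assumed 24-megapixel 6000x4000 image just as an example:

    # Print size in inches = pixels / dpi (pixels per inch).
    width_px, height_px = 6000, 4000    # assumed camera image, for illustration only
    dpi = 72                            # the old default for a blank field
    w_in, h_in = width_px / dpi, height_px / dpi
    print(f"{w_in:.1f} x {h_in:.1f} inches")       # -> 83.3 x 55.6 inches
    print(f"{w_in/12:.1f} x {h_in/12:.1f} feet")   # -> 6.9 x 4.6 feet, the silly situation

The pixels did not change at all; only the meaningless 72 number made the nominal print size huge.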
I believe you to be aware enough to know the term "72 dpi" is NOT about ink drops. We know it is about the old notion that older video screens are somehow 72 dpi (even though they only show pixels, video systems have no definition of inches or dpi, and we know better now). The 72 dpi video nonsense started in 1984 on the first Macintosh's nine inch 512x342 pixel screen. They were ahead of their time; we are just now coming back around to that size screen.

Many of us have outgrown the notion now, but not all of us.
But if we search Google for "72 dpi" (with the quotes, to make it a phrase), we get 20.5 million hits.
If we search for "72 ppi" (which is dumb, but we can look), we only get 1/2 million hits. That is 40x more usage for 72 dpi than 72 ppi. Yet you and I KNOW this is always about video and always about pixels, and the subject of 72 dpi is NEVER about ink drops. So anyone who imagines that only 2.5% of all users are correct really needs to reevaluate their own notions. Really.
This shows the magnitude of the popular usage of dpi, which all of the evidence supports, yet which was previously and erroneously called "wrong" here. LOL. It is actually funny, and of course won't have much effect (except to confuse a few newbies).
Continuous tone printers (dye-subs and Fuji Frontier types) don't print discrete ink drops of three colors like inkjet printers must; instead they mix the color of the pixel directly, and they print pixels (called continuous tone). There are no dithered ink dots then, just pixels. But these printer ratings also still refer to the spacing of those image pixels with the term dpi, simply because dpi has always been the name for "pixels per inch".
I suspect it's "dpi equivalent", which again translates to "maximum possible PPI of the device"
Yes, that's obviously correct, of course dpi means pixels per inch. Newbies seem not to know that is what it has always meant in the industry. There simply are no ink drops in scanners or dye subs or chemical paper printers, but they have always used dpi for pixels, and still do.
The formal technical specifications at the very heart of our digital imaging definitions use dpi:
- JPEG file specifications: see pages 2 and 5, dpi, "dots per inch", meaning pixels.
- TIFF file specifications: see page 38, "dots per inch", meaning pixels.
- http://www.kodak.com/global/plugins/...fStandard2.pdf: see pages 19, 90, 101, 104, 108, 112, "dpi", meaning pixels (the 72 dpi value there has no significance; it just means "blank, no numerical information").
Another possibility is that you are overgeneralizing.
It was not me who wrote those specification documents. The specs say dpi, and do NOT mention ppi. I find it really hard to fault dpi.
A few examples in the specifications we use:
JPEG page 2: "X and Y pixel density (dots per inch or dots per cm)" (this is stored in the Exif fields)
TIFF page 38: "Even if the decision is arbitrary, it might be better to use dots per inch or dots per centimeter, and to pick XResolution and YResolution"
EXIF page 90: "IDPI - Dots Per Inch. Stores the dots per inch (DPI) setting of the digitizer used to produce the file."
This last is the dpi value in the Exif data our camera writes, which Adobe does call ppi, but it has zero meaning so far at that camera stage.
NONE of this is about ink drops.
And actually, the files store pixels and could not care less what size we will print them; the point of storing dpi there is to preserve an original size in inches on paper (from scanners, for example). But of course, cameras have no concept of any original size in inches... It is not important yet anyway; we will deal with dpi after we come up with some desired printed size, if any.
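A small sketch of that point, assuming the Pillow library in Python (the file names and the 300 dpi value are placeholders): changing the stored dpi number changes only the metadata that suggests a print size; the pixels themselves are untouched.

    from PIL import Image   # assumes Pillow is installed

    img = Image.open("photo.jpg")        # placeholder file name
    print(img.size)                      # the pixel dimensions, the real image data
    print(img.info.get("dpi"))           # the stored dpi metadata, if any (may be None)

    # Re-saving with a different dpi value rewrites only that metadata;
    # the pixel dimensions (and the detail in the image) stay exactly the same.
    img.save("photo_300dpi.jpg", dpi=(300, 300))
    print(Image.open("photo_300dpi.jpg").size)   # same pixel count as before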
That dpi is simply the name of the term for image resolution. Always has been. Looks like it always will be. Yes, dpi does mean pixels per inch; pixels are a kind of colored dot. That is not hard. And yes, we can say ppi if we choose (but I don't; I am used to saying dpi, and prefer to respect the name of it).
The standard seems to be about aspect ratio control... information about (quoting Kodak here) "the digitizer used to scan"; but it's an interesting argument. It's interesting that JPEG and TIFF both use the exact same language. I think we are both assuming from ignorance in interpreting these specs.
Aspect there just refers to the pixel aspect ratio, and relates to some video uses that can have non-square pixel definitions. For example, old SD DVD is 720x480 (pixels not square) but displays as 640x480 (square pixels on PCs).
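A rough sketch of that DVD case (the exact pixel aspect ratio varies by standard, so treat the numbers as approximate):

    # Non-square storage pixels: a 720x480 SD DVD frame shown as 4:3 on a square-pixel display.
    stored_w, stored_h = 720, 480
    pixel_aspect = 640 / 720                    # roughly 8:9 for 4:3 NTSC material
    display_w = round(stored_w * pixel_aspect)
    print(display_w, stored_h)                  # -> 640 480, square pixels on a PC screen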
The issue here is that some of us want to make up our own notions and assign our own made-up rules for what the rest of us should always say, to agree with them. They want to call us out on it if we diverge from their view. I instead choose to agree with those who define the terms, not the newbies who don't understand it yet. Tradition, and all that.
My only point is that newbies should be instructed that we will always see it both ways, and that it is necessary to learn to understand it both ways.
It is quite easy. If the subject is images, dpi means pixels. If the subject is inkjet printer ratings, dpi means ink drops.
I have no problem if you want to call it ppi. I can understand it either way. You should also learn to understand it either way, since I and others prefer the standard term dpi. You will be missing out big time if you cannot understand both ways, and that would be a shame.
I do NOT stand up and shout that you can only say it one way. Your use is your choice, not mine, and frankly, I could not care less what you choose.
I do say that we ALL have to understand it either way, because we obviously always see it both ways. We do need to learn the real world. But yes, it has always been called dpi.
But when some of us take it on ourselves to start changing the rules on the standards writers, and on the scanner manufacturers, and on the dye sub printer manufacturers, and on the graphics industry that used dpi for years before all the newbies got involved with digital, just because we fancy our own choice, then it does seem a little pushy. And ignorant.
My main issue is that it can only cause unnecessary confusion for the newbies, who obviously do see dpi everywhere. Telling them that everything they see is wrong is a stupid and harmful teaching tool. They deserve better. We need to explain what they are seeing instead. It is not that hard.