This is quite an involved question, so I apologise in advance if it seems to ramble a bit.

Let's start with some history. Many (many, many) years ago, before anyone had even thought about digital imaging, I used to read photographic magazines. Lots of them. These magazines carried vast numbers of images at A4 or A5 size. Even at these relatively small sizes it was generally very easy to spot which images had been taken on MF, which on 35mm and (very occasionally) which on 5x4. The images from the larger formats were, again generally, technically superior to those from the smaller, and the effect was not at all subtle.

Fast forward to the present day. It seems to be pretty much accepted wisdom that there is no point in printing images at effective resolutions higher than around 300 RCDPI (Resolved Colour Dots Per Inch). Indeed, it's generally quite hard even to find out what RCDPI resolution a given printer or processing lab actually achieves.

Now, the puzzle is this. There are 35mm-format digital cameras available with horizontal resolutions of 6000 pixels or greater; indeed, most decent modern cameras approach this. At 300 DPI, that means you can produce a 20" print with no resampling 'enlargement' necessary. This seems to imply that, with modern digital technology, there would be no advantage to MF for any print whose longest dimension does not exceed 20". Obviously there are issues other than simple resolution to consider, but given that it seems unlikely that the vast majority of printed output uses a resolution greater than ~300, there is not much scope for anything more than very minor improvements from MF.

So, finally, to the question: what has happened to the quality difference that used to be clearly visible in small prints when it was all emulsion and 'analogue' processing? Have we lost something by going to digital processing that no one has really noticed (even though it stuck out like a sore thumb half a century and more ago)? Or am I missing something fundamental in my analysis of printing resolution?
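For what it's worth, the 20" figure in the question is just the pixel count divided by the print resolution. A minimal sketch of that back-of-envelope calculation (the 300 DPI threshold and 6000-pixel sensor width are the figures from the question; the function name is my own):

```python
def max_print_inches(pixels_long_edge: int, dpi: int = 300) -> float:
    """Longest print dimension (in inches) printable at `dpi`
    without upsampling, given the sensor's long-edge pixel count."""
    return pixels_long_edge / dpi

# A modern 35mm-format sensor ~6000 px on the long edge:
print(max_print_inches(6000))  # -> 20.0 inches at 300 DPI
```

By the same arithmetic, any extra pixels an MF back provides only matter once the print's longest dimension passes that 20" mark, which is the crux of the question.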