A couple of posters have said they find the idea that an application is responsible for interpolating the pixels in an image, rather than the printer, to be 'illogical' and counter to their 'intuition'. I thought it might be a good idea to add a little historical perspective, as that might make the way the system works a little easier to understand.

The GUI was originally developed by Xerox PARC in the late seventies and early eighties. It was copied by Apple for the original Mac released in 1984. So when the architecture of the GUI was developed (and the basic architecture has never really changed), a full-blown computer had a processor that would not power a modern mobile phone and had to get by with around 1/10,000th of the memory of a modern system. At this time, the control circuitry for printers was fairly rudimentary and they were certainly not fitted with any significant amount of memory. The idea of expecting the printer to handle interpolating a bitmap to enlarge it before printing would have seemed laughable to the teams that worked on the design of the architecture.

In those days the printer manufacturers did not supply drivers, because there were so many different microprocessor types and operating systems in existence, and it was up to each program to interface with the printer directly. So the architecture was designed so that the unit with by far the larger amount of processing power, the PC (as all personal computers were called before IBM 'acquired' the name for their particular style, after which the IBM architecture became known as the PC whereas other personal computers were known as Macs or whatever), handled everything up to and including producing the bitmap in the correct format for the printer. All the 'grunt' work is done by the main system, leaving the printer to concentrate on the actual work of printing. The drivers are merely responsible for converting the function calls used by the GUI into the command streams recognised by the printers, and for allowing the user to set and communicate non-standard options.

So that is how an architecture that, a quarter of a century later, may seem 'illogical' to some came to be developed. Even had the architects considered doing it any other way, the lack of processing power and memory in the printers of the day would have made it quite impossible.
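
To make the division of labour a bit more concrete, here is a purely illustrative sketch (in Python, with made-up names, nothing to do with any real driver or GUI API) of the simplest sort of interpolation the host system might perform when enlarging a bitmap before handing the finished result to the printer:

    # Illustrative only: nearest-neighbour enlargement of a bitmap,
    # the kind of 'grunt' work done on the host, not in the printer.
    def enlarge_nearest_neighbour(pixels, scale):
        """Enlarge a bitmap (a list of rows of pixel values) by an integer factor."""
        enlarged = []
        for row in pixels:
            # Stretch the row horizontally by repeating every pixel 'scale' times.
            stretched_row = [pixel for pixel in row for _ in range(scale)]
            # Stretch vertically by repeating the stretched row 'scale' times.
            enlarged.extend([list(stretched_row) for _ in range(scale)])
        return enlarged

    # Example: a 2x2 bitmap blown up to 4x4 before being sent to the printer.
    tiny = [[0, 255],
            [255, 0]]
    print(enlarge_nearest_neighbour(tiny, 2))

Even something that simple takes memory proportional to the enlarged image, which is exactly what the printers of the day did not have to spare.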