I see. I still don't get it though....it seems "right" to me that more pixels are better, it gives you more to work with....like, in macro, I can crop in on a bug's head and blow it up bigger. I dunno, gonna read up on this some more. I'm sure I'm just trying to comprehend it wrong. I guess I'm trying to compare a camera's pixels to that of a computer monitor's resolution and how it translates to clarity and picture quality.
Gotta be something to it though, otherwise camera makers wouldn't keep increasing them?
I suspect they increase them because consumers will fall for it. The sensor resolution only needs to at least match the resolution of the output media. Once it exceeds the output media, pixels have to be averaged down to reduce the resolution -- so the extra information is wasted. That's a slight oversimplification: if there is some "noise" in the image, averaging the pixels has a natural side-effect of reducing the strength of the "noise".
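Just to put rough numbers on "matching the output media" -- here's the back-of-the-envelope arithmetic (the 300 dpi figure and print sizes are my own assumptions, not anything official):

```python
# Rough sketch: how many megapixels a print actually needs at a given dpi.
# 300 dpi is a common assumption for "photo quality" prints.
def megapixels_needed(width_in, height_in, dpi=300):
    """Pixels required to print width_in x height_in inches at `dpi`, in MP."""
    return (width_in * dpi) * (height_in * dpi) / 1e6

for size in [(4, 6), (8, 10), (16, 20)]:
    mp = megapixels_needed(*size)
    print(f'{size[0]}x{size[1]}" print: {mp:.1f} MP')
# 4x6" print: 2.2 MP
# 8x10" print: 7.2 MP
# 16x20" print: 28.8 MP
```

So even a big 16x20" print is covered by today's sensors -- anything beyond that gets averaged away unless you crop or print billboard-sized.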
However, there are two more problems:
(1) the increased resolution only helps if you have glass with high enough acutance to take advantage of it. In other words, unless you're buying pretty high end glass, you won't be capable of noticing the difference.
(2) there's also the problem with becoming "diffraction limited". This applies only to higher focal ratios. If you're shooting at f/2.8 through f/5.6 then being diffraction limited is not an issue. But at higher f-stops it becomes a factor in the image quality. Note that the photos won't necessarily look "worse" -- what really happens is they just fail to be capable of taking advantage of the sensor resolution -- they hit a wall where more resolution will not allow them to look better. This is a law of physics. The only way around the problem is to use a physically larger sensor (at which point you'd no longer have an APS-C camera.) And then eventually you hit the same wall for full-frame sizes... and so on.
Becoming "diffraction limited" has to do with the wave nature of light. A ray of light isn't technically a straight beam. It's a wobbly wave. It doesn't actually focus to a single point. It dances around and creates the shape of something referred to as an "Airy disk" (named after the astronomer George Airy, who described it.) The math for calculating the "diffraction limit" actually assumes "perfect" optics -- so optical quality isn't the problem (low optical quality would only make the problem that much worse.) But as a result, there is a minimum diameter of the point where a single "ray" of light will focus. The 18.1 MP sensors on Canon's T2i, T3i, 60D, and 7D already exceed the diffraction limits when shooting at high f-stops. So going to an even higher resolution just means you'll get diffraction limited at the middle f-stops instead of at the high f-stops. At this point, you can only take advantage of the extreme high resolution when (a) using very high quality glass and (b) cropping in VERY tight (or printing at very large size) and (c) shooting only with the low f-stops. All three conditions must be true. If any one of them is no longer true then the extra-high resolution is wasted.
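You can actually work this out yourself. The standard formula for the Airy disk diameter (to the first dark ring) is 2.44 x wavelength x f-number; the sensor dimensions below are my assumed figures for an 18 MP APS-C body, so treat this as a sketch rather than exact specs:

```python
# Sketch of the diffraction math: Airy disk diameter vs. pixel pitch.
# The 2.44 * wavelength * N formula is the standard one for a perfect lens;
# the sensor numbers (22.3 mm wide, 5184 pixels across) are my assumptions
# for an 18 MP APS-C sensor.
WAVELENGTH_UM = 0.55  # green light, ~550 nm, middle of the visible spectrum

def airy_disk_diameter_um(f_number, wavelength_um=WAVELENGTH_UM):
    """Diameter (in microns) of the Airy disk for an ideal lens at f/N."""
    return 2.44 * wavelength_um * f_number

pixel_pitch_um = 22.3 * 1000 / 5184  # ~4.3 microns per photosite

for f in [2.8, 5.6, 8, 11, 16, 22]:
    d = airy_disk_diameter_um(f)
    print(f"f/{f}: Airy disk {d:.1f} um = {d / pixel_pitch_um:.1f} pixels wide")
```

Around f/2.8 the disk is smaller than one photosite, so the sensor is the limit; by f/11 and up the disk smears across several photosites, so adding more megapixels can't add any detail. That's the "wall" in action.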
Again... becoming "diffraction limited" doesn't mean the photos will be bad. They're still good. It's just that they're no better -- e.g. the 24 MP sensor can't create an image which is any better than the 18 MP sensor -- because a single ray of light cannot technically focus on just a single "pixel" (technically sensors don't have "pixels", they have a cluster of photo-sites arranged in a Bayer mask, but you get the idea.)