I recently started working in RAW using Adobe Camera Raw (which comes with CS3). A few days ago I tried, unsuccessfully, to shoot the moon surrounded by faintly illuminated clouds. The challenge of the shot was its high dynamic range (DR): faint clouds, bright moon. That got me paying more attention to the issue of DR with the RAW format.

My question is this: how can I verify I'm getting the DR I think I am? My Nikon D300 is set to 12-bit NEF, so if Adobe behaved as I expect, I'd see values between 0 and 4095 in my image. Yet as I mouse over regions of the image in the Camera Raw software, the pixel values displayed are clearly only 8-bit values (the maximum value in the highlights is 255). Why am I not seeing a range consistent with the RAW DR? Obviously the software has to scale the image to the DR of the monitor for display purposes, but I would expect it to report the correct underlying numeric values. Setting the color space preference (e.g., 16-bit Adobe RGB) has no effect on this, by the way.

Based on this behavior, I'm concerned that I'm doing something wrong and not actually working with raw data. Or am I just somehow misinterpreting the software's display? Thanks for your help.
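To make it concrete, here is my own little sketch of the kind of scaling I assume is happening when a 12-bit sample is shown as an 8-bit value (this is just an illustration of the arithmetic, not Adobe's actual pipeline, which presumably also involves demosaicing and a tone curve):

```python
def to_8bit(value_12bit: int) -> int:
    """Scale a 12-bit sample (0-4095) to 8 bits (0-255)
    by dropping the low 4 bits -- a simple linear mapping."""
    return value_12bit >> 4

# The brightest possible 12-bit value lands exactly on 255,
# which matches the maximum I see in the highlights.
print(to_8bit(4095))  # -> 255
print(to_8bit(0))     # -> 0
```

So a highlight reading of 255 would be consistent with either true 8-bit data or 12-bit data scaled for display, and I can't tell the two apart from the readout alone.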