This is kind of common sense: if you want the best quality images, you use the best quality setting the camera offers.
You said you were shooting in Raw. A full-sized Raw file is every bit of information that hits your sensor. If you don't post-process the Raw properly, it will give you poor quality; if you post-process it properly, it will give you the best possible quality.
If you are having a hard time with the Raw format and quality, then shoot in the highest-quality JPEG.
Want to see an approximation of what a Raw file looks like before it goes through a Raw converter like ACR?
And the same Raw image data file after the Raw conversion process: demosaicing to extrapolate color, the application of non-linear luminosity adjustments, and some other things done during Raw conversion to make the photo look approximately like what we can see:
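To make "demosaicing" and "non-linear luminosity adjustments" concrete, here is a toy sketch of what a Raw converter does. It is nothing like ACR's real algorithm, and every number in it is made up: each 2x2 RGGB block of single-luminosity sensor values is collapsed into one RGB pixel, where a real converter would extrapolate a full-resolution color value at every pixel.

```python
import numpy as np

# Fake 4x4 grid of 12-bit luminosity values behind an RGGB color filter.
# The sensor only recorded ONE number per pixel; color must be rebuilt.
raw = np.array([
    [3000,  900, 3100,  950],    # R G R G
    [ 800,  400,  850,  420],    # G B G B
    [3050,  920, 3150,  960],    # R G R G
    [ 810,  410,  860,  430],    # G B G B
], dtype=float)

r = raw[0::2, 0::2]                           # red filter sites
g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2   # average the two green sites
b = raw[1::2, 1::2]                           # blue filter sites
rgb = np.stack([r, g, b], axis=-1) / 4095.0   # normalize 12-bit data to 0-1

# Non-linear luminosity adjustment: a simple gamma curve (2.2 is an
# assumption; real converters apply more elaborate tone curves).
rgb = rgb ** (1 / 2.2)
print(rgb.shape)  # (2, 2, 3): every pixel now has all three color channels
```

The point of the sketch: before this step the data is a grid of single brightness numbers; after it, every pixel has three color channels and a viewing-friendly tone curve.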
Ok so each file is made up of a bunch of numbers that we have no idea how to read but the camera/computer can. So imagine (just making the numbers up):
RAW - basically the closest you can get to a regular negative in regards to data.
TIFF (which you can't shoot in, but just to give perspective) - removes any duplicates to shrink the file. It's smaller but still mostly there.
JPEG - removes duplicates AND merges close numbers - 172817251 - MUCH less data.
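The made-up numbers above can be sketched in a few lines of code. This is not how TIFF or JPEG actually work internally; it only illustrates the difference between lossless compression (remove exact duplicates, fully reversible) and lossy compression (also merge close numbers, which shrinks the file but throws the originals away).

```python
# Fake luminosity values; the specific numbers are invented for the demo.
data = [172, 172, 173, 185, 185, 186, 172, 173]

def rle(values):
    """'TIFF-style': run-length encode exact duplicates - reversible."""
    out = []
    for v in values:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

def quantize(values, step=8):
    """'JPEG-style': snap close numbers to the same value first - lossy."""
    return [(v // step) * step for v in values]

print(rle(data))            # lossless: 6 runs, original fully recoverable
print(rle(quantize(data)))  # lossy: only 3 runs, but 172 vs 173 is gone
```

Merging close numbers creates longer runs, so the lossy version compresses better, which is exactly why the JPEG ends up with MUCH less data.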
It's always best, IMO, when shooting with the possibility to print, to shoot in RAW. It just gives you more options: the day you realize you finally know what the heck you're doing, you can go back and edit much more effectively. You can also print larger without loss.
Most DSLRs use 12 or 14 bits to represent the voltage each pixel generated when it was exposed to light. By the way, no digital camera has a digital image sensor. They all have analog image sensors, and the voltages the pixels record have to be converted to digital numbers, which is done in an analog-to-digital converter on the image sensor chip after the analog voltages are amplified (also on the image sensor chip).
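The analog-to-digital step can be sketched in code. The full-scale voltage below is an invented round number, not a real sensor spec; the point is just how a continuous voltage becomes a 12-bit integer.

```python
FULL_SCALE_V = 1.0   # assumed full-scale voltage after amplification
BITS = 12

def adc(voltage, bits=BITS, full_scale=FULL_SCALE_V):
    """Quantize a voltage in [0, full_scale] to an integer code."""
    levels = 2 ** bits
    code = int(voltage / full_scale * (levels - 1) + 0.5)  # round to nearest
    return max(0, min(levels - 1, code))                   # clamp to range

print(adc(0.0))   # 0: no light hit the pixel
print(adc(0.5))   # 2048: mid-scale
print(adc(1.0))   # 4095: the largest 12-bit value, a fully saturated pixel
```

Everything in between the 4096 possible codes is rounded away, which is why more bits (14 instead of 12) means finer luminosity steps.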
Luminosity data is the only information an image sensor is capable of recording. Yep, the image sensor doesn't record color. All digital images start as a grayscale image.
111111111111 (4095) is the largest number 12 bits can represent, which allows for 4096 potential levels of luminosity at each pixel (0 through 4095).
11111111111111 (16,383) is the largest number 14 bits can represent, for 16,384 potential levels.
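The bit-depth arithmetic works out like this: n bits give 2**n levels, and the largest value (all ones in binary) is 2**n - 1.

```python
# Print the level count and largest value for the bit depths discussed above.
for bits in (8, 12, 14):
    levels = 2 ** bits
    largest = levels - 1
    print(f"{bits} bits: {levels} levels, max value {largest} ({largest:b})")
# 12 bits -> 4096 levels (0-4095); 14 bits -> 16,384 levels (0-16,383)
```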
A JPEG also starts as a 12 or 14 bit image data file. But JPEG is a lossy, compressed file type.
A JPEG retains most of the luminosity data an image sensor records. The lossy part of JPEG is the result of two changes: 1. The color bit depth is reduced from 12 or 14 bits to just 8 bits - 11111111 (255), i.e. 256 levels per channel. 2. The pixels in the image are grouped into 8x8, 8x16, or 16x16 pixel units (Minimum Coded Units - MCUs).
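Those two changes can be sketched as code. This is not a real JPEG encoder (a real one goes on to run a DCT and quantization inside each MCU); it only shows the 12-to-8-bit reduction and the cut into 8x8 units, using a randomly generated stand-in image.

```python
import numpy as np

# Fake 16x16 "12-bit" grayscale image (values 0-4095, randomly generated).
rng = np.random.default_rng(0)
img12 = rng.integers(0, 4096, size=(16, 16))

# Change 1: reduce 12 bits to 8 bits - 4096 levels squeezed into 256.
# Dropping the 4 low bits maps 0-4095 onto 0-255.
img8 = (img12 >> 4).astype(np.uint8)

# Change 2: split the image into 8x8 Minimum Coded Units.
h, w = img8.shape
mcus = [img8[y:y + 8, x:x + 8] for y in range(0, h, 8)
                               for x in range(0, w, 8)]
print(len(mcus), mcus[0].shape)  # 4 MCUs of shape (8, 8) from 16x16 pixels
```

Once the 4 low bits are shifted away they cannot be recovered, which is the same one-way loss as the merged numbers in the earlier example.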