This is misleading, IMO. The color information is not simply "thrown away" when a camera produces a JPEG. The sensor values first undergo a non-linear transformation (gamma correction), and only after that are they compressed, discarding mostly information that is not relevant to human vision. Gamma correction is the crucial step that preserves almost all of the perceptually relevant information. The main reason to shoot Raw is if you want to perform that gamma correction with your own software rather than with the camera's in-camera algorithm. If Raw really gives you more "editing headroom", then the in-camera algorithm is doing a poor job, and that is usually not the case.
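A minimal sketch of that point (NumPy, made-up numbers): quantizing linear 12-bit sensor data straight to 8 bits crushes the shadows, while gamma-encoding first spreads the 256 output codes in a more perceptually even way, which is why the 8-bit JPEG loses little that the eye can see.

```python
import numpy as np

gamma = 1 / 2.2                               # typical encoding gamma applied before JPEG
linear = np.linspace(0.0, 1.0, 4096)          # normalized 12-bit linear sensor values

shadows = linear < 0.25                       # darkest two stops of the linear range

# 8-bit codes available to those shadows, with and without gamma encoding
codes_linear = np.unique(np.round(linear[shadows] * 255)).size
codes_gamma = np.unique(np.round(linear[shadows] ** gamma * 255)).size

print(f"linear 8-bit codes in shadows:  {codes_linear}")   # ~64
print(f"gamma-encoded codes in shadows: {codes_gamma}")    # ~136
```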
Today's digital cameras capture 12-bit depth (4,096 discrete values per color channel, as defined by colorimetric interpretation of the Bayer array) or 14-bit depth (16,384 discrete values per color channel).
JPEG is limited to 8-bit depth, or 256 discrete values per color channel. What happens to the other 3,840 or 16,128 discrete values?
Banding and posterization can appear in image gradients because JPEG lacks enough values to render the gradient without visible steps.
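A rough sketch of where that banding comes from, assuming a subtle sky-like gradient covering only about 10% of the tonal range: at 8 bits there are only a couple dozen codes available across the whole gradient, while at 14 bits there are enough steps to look smooth.

```python
import numpy as np

width = 2000                                    # pixels across the gradient
gradient = np.linspace(0.70, 0.80, width)       # gentle gradient over 10% of the range

steps_8bit = np.unique(np.round(gradient * 255)).size      # ~26 codes -> visible bands
steps_14bit = np.unique(np.round(gradient * 16383)).size   # ~1,600 codes -> smooth

print(steps_8bit, steps_14bit)
```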
With Raw, none of the luminosity data the image sensor records is discarded.
The image sensor records with a linear gamma (1.0). Human vision cannot be modeled exactly by a simple gamma curve, but Raw converters usually apply a non-linear gamma encoding between 1.8 and 2.2, which is close enough to match the way human eyes see the world.
http://www.adobe.com/digitalimag/pdfs/linear_gamma.pdf
Also see: Real World Camera Raw by Bruce Fraser and Jeff Schewe.
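A small sketch of the encoding step a Raw converter performs, assuming a plain power-law gamma (real converters use more elaborate tone curves, but the idea is the same): 18% gray is physically dark in linear light, yet is perceived as middle gray, and the gamma encoding places it near the middle of the output scale.

```python
def encode(linear_value: float, gamma: float = 2.2) -> float:
    """Map a linear sensor value (0..1) to a gamma-encoded value (0..1)."""
    return linear_value ** (1.0 / gamma)

print(encode(0.18))        # ~0.46 with gamma 2.2
print(encode(0.18, 1.8))   # ~0.39 with gamma 1.8
```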
Here is an approximation of an image with a linear gamma and no colorimetric interpretation of the Bayer array, converted to JPEG for online display:
The same photo with a non-linear gamma applied, colorimetric interpretation, and conversion to JPEG for online display.