As much data as possible without blowing the highlights. I may not have stated it right in the OP: I know how to get a proper exposure, how to use the in-camera meter and my handheld one. I routinely use a combination of judgement, an average of spot readings, and watching the histogram to determine the "correct exposure". However, it's well known that different manufacturers apply certain "proprietary adjustments" to both the Raw file and the processed JPEG, so is it possible that they also calibrate the meter to give a proper exposure per "their standards" for a JPEG processed by their internal software, regardless of whether you're saving Raw or JPEG?
Here's an example of something I was playing with the other day. I had the camera set to Raw+ (save both Raw and JPEG to the card). I maximized the exposure per the histogram (which, by the meter, read not quite a full stop right of center). The first image is the histogram from the Raw file.
View attachment 139657
The second image below is the histogram from the JPEG created from the same exposure at the same time
View attachment 139658
The Raw file had maximum data and showed no sign of blown highlights when opened in post, but the JPEG looked overexposed. Had I centered the meter, the JPEG would have been properly exposed, and the Raw would have needed a bump in post.
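I think part of what I'm seeing can be shown with a toy calculation (this is just my own illustration, not any camera's actual pipeline): the JPEG gets a gamma-style tone curve applied, which pushes bright linear values closer to the right edge, so a highlight that still has headroom in the Raw data can sit near clipping on the JPEG histogram. Here's a sketch assuming a standard sRGB-like transfer curve:

```python
def srgb_curve(x):
    # Standard sRGB transfer function for linear x in [0, 1].
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

# A highlight at 85% of full well in the linear Raw data --
# comfortably below clipping (1.0).
linear_highlight = 0.85

# The same pixel after the JPEG tone curve lands much closer
# to the right edge of the histogram.
jpeg_value = srgb_curve(linear_highlight)

print(f"raw (linear): {linear_highlight:.2f} -> jpeg (after curve): {jpeg_value:.2f}")
```

So a pixel with a full stop of headroom in the Raw file can read as nearly blown on the camera's histogram, since that histogram is built from the JPEG rendering, not the raw data.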
Am I missing something?