Black frame noise reduction really only helps eliminate sources of noise that are determined by the duration of the exposure, i.e. hot pixels and sensor bleeding. That kind of noise is repeatable, so it can be subtracted out; the rest of the noise is random, and averaging a black frame with a normal frame just gives you a picture that's half as bright. Astrophotography stacking software uses this for the same reasons: dark frames are used to eliminate hot pixels and bleeding, bias frames (black frames shot at the fastest shutter speed) are used to eliminate errors from the read-out process, and the remaining random noise is averaged out by stacking the light frames.
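As a rough numpy sketch of that pipeline (the names are mine, and real stacking software also aligns the frames and applies flat fields, but this is the core idea):

```python
import numpy as np

def calibrate_and_stack(lights, darks, biases):
    """Average-stack light frames after dark and bias subtraction."""
    # Master bias: the read-out pattern, independent of exposure time.
    master_bias = np.mean(np.stack(biases), axis=0)
    # Master dark: hot pixels and bleeding that build up over the
    # exposure; the darks contain the bias too, so pull it back out
    # (keeping the two separate is what lets real software rescale
    # the dark to a different exposure time).
    master_dark = np.mean(np.stack(darks), axis=0) - master_bias
    # Subtract the repeatable patterns from each light frame, then
    # average: the remaining random noise shrinks roughly 1/sqrt(N).
    calibrated = [light - master_bias - master_dark for light in lights]
    return np.mean(np.stack(calibrated), axis=0)
```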
There's a real benefit to noise in astrophotography. Because the noise is random it effectively dithers the image: once the average level across the sensor is above the black point (average enough high ISO pictures together and you won't have a single black pixel in the final result), any photon that hits a pixel statistically raises its chance of recording a lighter value. This is why, when you look at the result of a night's shooting, you'll see 100 very dark frames with lots of noise, but when you stack them you get all sorts of wonderful colours of space that didn't exist in your individual shots, no matter how much you play with curves in Photoshop.
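A toy numpy simulation of that effect (all the numbers are made up, and I'm using Gaussian noise and simple rounding where a real sensor has Poisson photon noise and its own quantiser, but the principle is the same):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 0.3    # faint source, less than one ADU above the black level
pedestal = 5.0  # offset keeping the sensor average above the black point
n = 100         # one night's worth of frames

# Noise-free sensor: every frame quantises the faint signal to the same
# value, so stacking any number of them recovers nothing.
clean = np.round(np.full(n, pedestal + signal))
print(clean.mean() - pedestal)   # 0.0 -- the 0.3 ADU signal is gone

# Noisy sensor: the random noise pushes each pixel back and forth across
# the quantisation threshold, so every extra photon raises the odds of
# recording a lighter value, and the average of the stack recovers the
# sub-ADU signal that no single frame contains.
noisy = np.round(pedestal + signal + rng.normal(0.0, 1.0, n))
print(noisy.mean() - pedestal)   # roughly 0.3, within the noise of 100 frames
```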
Anyway, your process in action:
http://www2.garbz.com/stars.jpg This image is the result of a bunch of 25-second exposures shot at ISO 1600 with a camera that is widely regarded as one of the worst performers in the high-ISO noise department.
For once your coffee-fuelled over-thinking has led to an incredibly sound theory.