I'm well aware of your experience and have learned much from you; however, your methodological technique is sadly lacking, leaving out important steps like comparison, analogy, and deduction before assuming a conclusion. The fact remains that personal experience without validation is still an unfounded opinion. With the amount of information available on today's internet, if this were factual there would be corroborating content out there.
I'll repeat myself from above: "WB is not recorded as Kelvins in a Raw file - it's recorded as coordinates in a color profile which are then translated back to degrees by processing software. The actual measured temperature of light is what's reflected off the source. Once recorded on a medium, it's fixed, so the term WB is irrelevant despite its interchangeable use with temperature, as both are merely data points in a digital file." There is no "Kelvin" in a digital file.
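If you want to see this for yourself, here's a minimal Python sketch using the rawpy library (LibRaw bindings) to dump the white-balance metadata a raw file actually carries. The filename is a placeholder, and the exact multiplier values will vary by camera:

```python
# Inspect the white-balance data stored in a raw file with rawpy.
# "sample.cr2" is a hypothetical filename - use any raw file you have.
import rawpy

with rawpy.imread("sample.cr2") as raw:
    # "As shot" white balance: four per-channel gain multipliers
    # (R, G, B, G2) - coefficients, not a Kelvin temperature.
    print("Camera WB multipliers:  ", raw.camera_whitebalance)
    # The camera's stored daylight reference, also plain multipliers.
    print("Daylight WB multipliers:", raw.daylight_whitebalance)
```

Any Kelvin figure a converter shows you is computed afterward by mapping those multipliers back through a color model; it's a display convenience, not data stored in the file. Numerous sites confirm this, but here's an interesting read from FastRawViewer: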
https://www.fastrawviewer.com/white-balance-as-per-channel-exposure-correction

If you'll scroll back up to post 7, where I said "You corrected the Raw file to a known gray value, then tried to correct a JPEG which has data coordinates adjusted to some extent depending on the algorithms of the camera manufacturer", you'll see it's consistent with the referenced link, which notes that a JPEG image has "modified" data, and that this can produce differences when applying post-processing adjustments to the file (especially in the H, S, and B values of colors). Why do you think all the recommendations say you get more accurate WB correction in Raw?
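To make that concrete, here's a toy illustration (my own sketch, not code from the FastRawViewer article) of why the same WB gain behaves differently on raw data than on a JPEG: raw values are linear, while JPEG values have already been gamma-encoded (approximated as a simple 2.2 gamma here; the gain and pixel value are made-up numbers):

```python
linear_red = 0.25   # a linear raw red value, scaled to 0..1
gain = 1.8          # a hypothetical WB multiplier for the red channel

# Raw workflow: apply the gain in linear space, then gamma-encode.
wb_on_raw = (linear_red * gain) ** (1 / 2.2)      # ~0.696

# JPEG workflow: the value was gamma-encoded first, then gets the gain.
jpeg_red = linear_red ** (1 / 2.2)                # ~0.533
wb_on_jpeg = jpeg_red * gain                      # ~0.959

print(f"WB applied to raw (linear) data: {wb_on_raw:.3f}")
print(f"WB applied to JPEG (gamma) data: {wb_on_jpeg:.3f}")
```

The two results land in very different places, which is exactly why the same correction shifts hue, saturation, and brightness differently once the data has been through the camera's JPEG pipeline.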