Accuracy is 3C over the FULL temperature range; it degrades a good bit at lower temperatures, but in the 50-60C range I test in it is pretty close to 1C. It is also possible to calibrate the diode reader and setup to account for this, since the stated accuracy reflects the variability of testing setups (electrical noise, capacitance, and variability in the diodes used) as well as the intrinsic accuracy of the IC; I am still working on that, however. The problem is that you have to get a perfect joint at the CPU socket, so that a calibration done by putting the CPU, with the diode reader on its pins, into water remains relevant once the diode reader is soldered onto the motherboard. Maybe you can assume that both joints are equal? Comments are welcome on this.
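For what it's worth, here is the shape of that calibration as a quick Python sketch. A least-squares line is one reasonable way to do it; every number in here (the raw readings and the bath temperatures) is made up purely for illustration:

```python
# Minimal sketch of a linear calibration for a diode reader, assuming
# reference points taken with the CPU + reader dunked in a water bath
# against a trusted thermometer. All values below are hypothetical.
import numpy as np

readings  = np.array([25.8, 40.5, 50.9, 61.2])  # raw reader output, C
reference = np.array([25.0, 40.0, 50.0, 60.0])  # bath thermometer, C

# Fit reference = slope * reading + offset by least squares
slope, offset = np.polyfit(readings, reference, 1)

def corrected(t_read):
    """Apply the fitted calibration to a raw reader value."""
    return slope * t_read + offset

print(f"slope={slope:.4f}, offset={offset:+.2f} C")
print(f"raw 55.0 C -> corrected {corrected(55.0):.2f} C")
```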
I believe if you look around you'll find that I calculated some uncertainties based on +/-1C for a diode reader. That is reasonable at the temps I typically run (50-60C thus far with heatsinks).
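For anyone who wants to redo that math: the standard approach is to add independent errors in quadrature when you take a temperature difference, which is what actually matters when comparing blocks. A minimal sketch, with made-up example values (the +/-1C figures and the 10C delta are assumptions for illustration):

```python
# Sketch of propagating +/-1 C reader uncertainties through a
# temperature difference, e.g. die temp minus water temp.
# Quadrature assumes the two errors are independent; values are made up.
import math

u_die   = 1.0   # +/- C, diode reader
u_water = 1.0   # +/- C, water temp probe (hypothetical)

# Uncertainty of (T_die - T_water): combine in quadrature
u_delta = math.sqrt(u_die**2 + u_water**2)
print(f"uncertainty on the temperature delta: +/-{u_delta:.2f} C")

# On a 10 C delta that is ~14%, which is why +/-1 C readers
# limit how finely you can rank waterblocks.
delta = 10.0
print(f"relative uncertainty: {u_delta / delta:.1%}")
```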
The CPU numbers jump around because that is what CPU temps DO; the programs run in cycles, and if your equipment is sufficiently good you pick them up. The only way to deal with that is to take a LOT of measurements, average them, and post a std deviation along with the number.
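Something like this minimal Python sketch; read_diode_temp() is a hypothetical stand-in for however your reader actually reports, and the sample count and pacing are arbitrary:

```python
# Sketch: log many samples, then report mean and std deviation.
# read_diode_temp() is a hypothetical placeholder for a real reader's
# interface; sample count and interval are arbitrary choices.
import math
import random
import statistics
import time

t0 = time.monotonic()

def read_diode_temp():
    # Stand-in for a real reader: a ~55 C load temp with a slow
    # program cycle plus random noise, just so the sketch runs.
    cycle = 0.3 * math.sin(2 * math.pi * (time.monotonic() - t0) / 5.0)
    return 55.0 + cycle + random.uniform(-0.3, 0.3)

samples = []
for _ in range(200):          # take a LOT of measurements
    samples.append(read_diode_temp())
    time.sleep(0.01)          # pacing; use whatever your logger allows

mean = statistics.mean(samples)
sd   = statistics.stdev(samples)
print(f"{mean:.2f} C +/- {sd:.2f} C (n={len(samples)})")
```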
And no, those numbers aren't good enough to quantitatively rate waterblocks. My std deviations are in the 0.4C range. You really need a die simulator with exact power inputs and close to perfect insulation if you want better; a CPU won't do.
I have been critical of webmedic at times because I have done exactly what he is headed towards, and I already know he won't get the results he wants from the testing equipment he has; I attempted the same myself. I'm just trying to save him some frustration. I don't think I ever told him not to proceed.
In my opinion, the people telling him to go ahead and do all that work with shitty equipment are the ones doing him a disservice, not the ones offering advice on what is needed.