Fluke doesn't list it anymore, but it looks like several used dealers snagged the Fluke web page for it. Like
this.
Relevant specs are:
Range: 1000A, 200A
Resolution: 1A, 0.1A
Accuracy: 0-600A: ±(1.9% + 4); 600-1000A: ±(3% + 3); 0-200A: ±(1.9% + 7)
My interpretation is that on the 200A range the '+7' means 7 counts of the 0.1A digit, so the accuracy works out to ±(1.9% of reading + 0.7A).
Assuming 70 Watt dissipation on a CPU at 2V gives 35A. 1.9% of 35A is about 0.67A, plus the 0.7A from the counts, comes to roughly 1.4A, so likely less than 4% error with the CPU loaded. Probably substantially less error than that once the error data I can generate is applied.
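For what it's worth, here's that arithmetic as a quick Python sketch, assuming the '+7' really is 7 counts of the 0.1A digit (my interpretation above, not anything Fluke states):

# Rough worst-case error for the Fluke 36 on its 200A range, assuming the
# spec means +/-(1.9% of reading + 7 counts of the 0.1A digit).
def clamp_error(reading_a, pct=1.9, counts=7, resolution_a=0.1):
    """Return (absolute error in amps, error as a percent of the reading)."""
    abs_err = reading_a * pct / 100.0 + counts * resolution_a
    return abs_err, 100.0 * abs_err / reading_a

current = 70.0 / 2.0          # 70W at a 2V core -> 35A to measure
err_a, err_pct = clamp_error(current)
print(f"{current:.0f}A reading: about +/-{err_a:.2f}A ({err_pct:.1f}%)")
# works out to roughly +/-1.4A, just under 4% of the reading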
The effect of that '+7' could be reduced by looping the wire through the clamp several times and dividing the reading by the number of turns. I don't think I'd actually advise doing this, though.
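If someone did try it anyway, the fixed count term divides by the number of turns while the 1.9% term stays put. A hypothetical extension of the sketch above, under the same assumptions about what the spec means:

# Error in the recovered current when the wire passes through the clamp 'turns' times.
def clamp_error_looped(current_a, turns, pct=1.9, counts=7, resolution_a=0.1):
    reading = current_a * turns               # the clamp sees turns * current
    abs_err = reading * pct / 100.0 + counts * resolution_a
    # dividing the reading (and its absolute error) by 'turns' shrinks only
    # the fixed-count contribution, not the percentage term
    return abs_err / turns, 100.0 * abs_err / reading

for turns in (1, 2, 5):
    err_a, err_pct = clamp_error_looped(35.0, turns)
    print(f"{turns} turn(s): about +/-{err_a:.2f}A ({err_pct:.1f}%)")
# roughly 3.9% at 1 turn, 2.9% at 2, 2.3% at 5; note 5 turns of 35A reads 175A, still within the 200A range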
There is still the issue of how the reading is affected by high frequency components of the current going through the clamp (my main objection to the THG article). A set of readings of idle and load current at different Vcore settings ought to provide an indication of how much error this can cause. If it appears that error due to high frequency components of the current is substantial, additional steps can be taken to improve the test setup.
I think the Fluke 36 would definitely provide interesting data. It's certainly not going to get the results that can be achieved with a die simulator with 'inline' current measurements, but I don't think that's necessarily the goal here.