kronchev, Althornin has been delivering it in an unfriendly way, but (unless I've misread something he wrote) he's essentially correct. You may not be lying about your temps; it could just be a misunderstanding.
I'd be more than happy to go over the math with you if you would like. In short, idle delta-Ts must (by the laws of thermodynamics) be lower than load delta-Ts. Delta-T scales linearly with CPU wattage, and the slope is a constant determined by the particular waterblock and system setup: C/W. So if the gap between two waterblocks' delta-Ts is smaller at load than at idle, either the system changed between tests or the measurements are inaccurate.
dT(idle1) = C/W(1) * Watts(idle)
dT(idle2) = C/W(2) * Watts(idle)
dT(load1) = C/W(1) * Watts(load)
dT(load2) = C/W(2) * Watts(load)
dPerformance(idle) = dT(idle1) - dT(idle2)
dPerformance(idle) = Watts(idle) * ( C/W(1) - C/W(2) )
dPerformance(load) = dT(load1) - dT(load2)
dPerformance(load) = Watts(load) * ( C/W(1) - C/W(2) )
Notice that both dPerformance values (the gap between the two waterblocks' delta-Ts on the idle and load graphs) are proportional to the same constant:
constant = C/W(1) - C/W(2)
Call that constant K and rewrite the two equations:
dPerformance(idle) = Watts(idle) * K
dPerformance(load) = Watts(load) * K
So, since Watts(idle) is less than Watts(load), the gap between the two blocks must be greater on the load graph than on the idle graph.
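For anyone who wants to plug in numbers, here is a minimal Python sketch of the same algebra. The C/W values and wattages are made up purely for illustration; substitute your own measurements.

# Hypothetical numbers for illustration only; real C/W values come
# from testing a specific waterblock in a specific loop.
CW1 = 0.20          # C/W of waterblock 1 (deg C rise per watt)
CW2 = 0.15          # C/W of waterblock 2
watts_idle = 30.0   # assumed idle CPU heat output
watts_load = 90.0   # assumed load CPU heat output

def delta_t(cw, watts):
    # Temperature rise above coolant: dT = C/W * Watts
    return cw * watts

d_perf_idle = delta_t(CW1, watts_idle) - delta_t(CW2, watts_idle)
d_perf_load = delta_t(CW1, watts_load) - delta_t(CW2, watts_load)

K = CW1 - CW2
print(d_perf_idle, watts_idle * K)  # both ~1.5 (up to float rounding)
print(d_perf_load, watts_load * K)  # both ~4.5 (up to float rounding)

Since watts_load is three times watts_idle here, the load gap (~4.5 C) is three times the idle gap (~1.5 C). The gap can only shrink at load if the system changed between tests or the measurements are off.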
Althornin, please grow some tact. It doesn't help matters when you insult people.
Really guys, this board is for adults and I don't want to ban anyone from here. That's just stupid.