Quote:
Originally posted by MeltMan
Why?
Who "defines" what the minimum flow rate for a radiator to be efficient is? You arent going to "miss out" on any cooling. Your radiator will just cool the water closer to ambient longer. So what if the water stays in the radiator at close to ambient longer? That is a good thing in case you add more heat load. It's a buffer. You arent missing out on cooling. The water wont warm up in the rad. You are just gaining flow rate which in turn is cooling better on the chip.
Oh, so close...
Ok. Try to picture a graph of the performance of a rad, where X is the flow rate, and Y is the amount of heat dissipated.
Would you agree that, at no flow, the heat dissipated is also zero?
At a very, very low flow (I need JimS here to complete what I'm trying to say!), the coolant gets very hot, enters the rad, and gets cooled down with time to spare, BUT the waterblock has let the coolant get very hot in the first place, so your CPU temps suffer.
If you let the coolant go a little faster, the CPU wb still dumps the same amount of heat into the coolant, but it spreads that heat over a larger volume of coolant per second, so the coolant doesn't get as hot. In that scenario, the rad still has time to cool the coolant down.
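To put rough numbers on that, here's a quick back-of-the-envelope sketch of the per-pass temperature rise across the block, using dT = Q / (m_dot * c_p). The 100 W heat load, the flow rates, and the water properties are just assumed example values, not measurements from any real loop:

```python
# Rough sketch: per-pass coolant temperature rise across the block,
# dT = Q / (m_dot * c_p). All numbers here are assumed examples.
C_P = 4186.0     # specific heat of water, J/(kg*K)
RHO = 998.0      # density of water, kg/m^3
Q_LOAD = 100.0   # assumed CPU heat load, W

for gpm in (0.1, 0.5, 1.0, 2.0):        # flow rates, US gallons per minute
    m3_per_s = gpm * 3.785e-3 / 60.0    # GPM -> m^3/s
    m_dot = RHO * m3_per_s              # mass flow, kg/s
    dT = Q_LOAD / (m_dot * C_P)         # temperature rise per pass, K
    print(f"{gpm:4.1f} GPM -> coolant picks up {dT:5.2f} C per pass")
```

By that estimate, at 0.1 GPM the water picks up almost 4 C every pass through the block, while at 2 GPM it's under 0.2 C, so the block is sitting in much cooler water.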
Now... this is where we jump to a faster flow rate. The idea is that over a given time X, whether the coolant passes through once or twice, it spends the same fraction of that time in the rad and in the CPU block, so it shouldn't make a difference: each pass cools it half as much, but also heats it half as much. The only real difference is that the coolant is travelling faster, which creates more pressure throughout the rig.
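Same idea in numbers, if it helps: over a fixed window, doubling the flow halves the per-pass delta-T, and the total heat moved through the rad comes out identical. The mass flows and delta-Ts below are made-up values chosen to be consistent with a ~100 W load:

```python
# Fixed 60 s window: double the mass flow at half the per-pass delta-T
# moves the same total heat into the rad. Values assume a ~100 W load.
C_P = 4186.0     # J/(kg*K)
WINDOW = 60.0    # s

# (mass flow in kg/s, per-pass delta-T in K) -- dT halves as flow doubles
for m_dot, dT_pass in ((0.05, 0.48), (0.10, 0.24)):
    heat = m_dot * WINDOW * C_P * dT_pass   # J carried through the rad
    print(f"{m_dot:.2f} kg/s -> {heat:.0f} J through the rad in {WINDOW:.0f} s")
```

Both cases move the same ~6000 J per minute, which is the point: faster flow doesn't change how much heat gets from block to rad, only how warm the water is while carrying it.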
The higher pressure is good because it comes with higher velocity, and that pushes the water into turbulent flow, which lets it pick up more heat at the block and shed more of it in the rad.
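For anyone who wants to check when a loop actually goes turbulent, here's a rough Reynolds-number sketch. The 1/2" tubing ID and the water properties are assumptions, and Re above ~4000 is just the usual textbook rule of thumb for turbulent pipe flow:

```python
# Rough Reynolds-number check for 1/2" ID tubing (assumed). Re above
# ~4000 is the usual rule of thumb for fully turbulent pipe flow.
import math

RHO = 998.0    # kg/m^3, water
MU = 0.89e-3   # Pa*s, water viscosity near 25 C
D = 0.0127     # m, assumed 1/2" inner diameter
AREA = math.pi * D**2 / 4.0

for gpm in (0.25, 0.5, 1.0, 2.0):
    v = (gpm * 3.785e-3 / 60.0) / AREA   # mean velocity, m/s
    re = RHO * v * D / MU
    regime = "turbulent" if re > 4000 else "laminar/transitional"
    print(f"{gpm:4.2f} GPM: v = {v:.2f} m/s, Re = {re:6.0f} ({regime})")
```

By those assumed numbers, a loop pushing around 1 GPM or more through 1/2" tubing is already solidly turbulent, while a trickle at a quarter GPM is still down in laminar/transitional territory.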
This pressure thing is where we're at...