Quote:
Originally posted by schoolie
Thanks for considering my opinion, BigBen. OK, here goes:
Assume that the dissipation of heat from the water to the radiator is proportional to the difference in temp between the water and the radiator (and the air flowing over the fins). If one accepts this, then the heat dissipation would increase with higher water temps in the radiator, and with lower air temps flowing over the radiator fins.
Consider the steady state of a simplified system, where the coolant temp is only a function of position in the cooling loop, and not time-dependent. A lower coolant flow should produce a larger difference between inlet and outlet coolant temps in the radiator, and a lower total heat dissipated by the radiator.
I'll finish my thoughts after I get back, but I'd be curious what other people think.
Thanks
I agree with the first part, but I don't believe the relationship is linear.
As for the second part: if the outlet temp is much lower than the inlet temp, then where did that heat go? It is a function of time, any way you look at it.
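Here's the bookkeeping I have in mind, as a quick Python sketch. Assuming steady state and water coolant, the rad has to shed exactly the power the CPU puts in, so Q = mdot * cp * deltaT. The 100 W load and the flow rates below are made-up numbers, just to show the trend:
[code]
# Minimal sketch of the steady-state energy balance under discussion.
# Assumption: at steady state the rad sheds exactly the power Q that
# the CPU dumps into the loop, so Q = mdot * cp * (T_in - T_out).
# The 100 W load and the flow rates are illustrative, not measured.

CP_WATER = 4186.0   # specific heat of water, J/(kg*K)
Q_CPU = 100.0       # heat load to dissipate, W (assumed)

def delta_t(flow_lpm: float) -> float:
    """Inlet/outlet temp drop across the rad for a given coolant flow.

    flow_lpm: flow in litres per minute (1 L of water is about 1 kg).
    """
    mdot = flow_lpm / 60.0          # kg/s, assuming ~1 kg per litre
    return Q_CPU / (mdot * CP_WATER)

for flow in (4.0, 2.0, 1.0, 0.5):
    print(f"{flow:4.1f} L/min -> delta-T across rad = {delta_t(flow):.2f} C")
[/code]
Halving the flow doubles the inlet/outlet spread, but note that the same total heat still moves through the rad every second.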
Let me see if I can pump out an overview:
Facts:
1-Ambient air is at 20C.
2-Coolant temp is increased by the power emitted by the CPU.
3-Coolant is cooled by transmitting its heat to the metal that composes the radiator.
4-The rad cannot lower the temp of the coolant below the temp of the ambient air.
5-The heat from the coolant is transmitted to the metal of the rad at the same rate, regardless of the rate of movement of the coolant.
6-The rad lowers its metal temp by dumping heat into the ambient air.
7-The fan helps it do so.
It therefore follows that the longer the coolant sits in the rad, the more heat can be transmitted to the metal of the rad. The result is a lower coolant temp at the rad outlet.
It is, however, increasingly pointless to reduce the flow rate once the coolant already exits the rad at a temp close to ambient.
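To put numbers on that diminishing-returns point, here's a toy Python model. It treats the rad as cooling the coolant exponentially toward ambient while it sits inside (simple Newton-style cooling); the 35C inlet temp and the 10 s time constant are assumed values, not measurements:
[code]
import math

# Toy model of facts 4-7: the coolant's temp decays exponentially
# toward ambient while it sits in the rad. T_IN and TAU are assumed
# illustrative values, not measurements.

T_AIR = 20.0    # ambient air temp, C (fact 1)
T_IN = 35.0     # coolant temp entering the rad, C (assumed)
TAU = 10.0      # rad cooling time constant, s (assumed)

def t_out(residence_s: float) -> float:
    """Coolant exit temp after sitting in the rad for residence_s seconds."""
    return T_AIR + (T_IN - T_AIR) * math.exp(-residence_s / TAU)

# Longer residence (i.e. slower flow) -> exit temp closer to ambient,
# but with rapidly diminishing returns once it nears 20 C.
for res in (5, 10, 20, 40, 80):
    print(f"{res:3d} s in the rad -> exits at {t_out(res):.2f} C")
[/code]
Doubling the residence time from 5 s to 10 s buys about 3.5C, but doubling it again from 40 s to 80 s buys barely a quarter of a degree.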
Let's see if we all agree on that before I go on.