Maybe the best way to explain it is to take a heat source, like a soldering iron, and place it in a bucket of water. The bucket cools by evaporation, which plays the part of the radiator here. When you plug in the iron (raising the heat load, to simulate a more efficient block dumping more heat into the loop), the water heats up, and the warmer water evaporates faster (simulating the radiator's greater output at a larger delta-T). Even though the water is evaporating faster, it still ends up warmer than before: the temperature keeps rising until heat out matches heat in again. Now, the difference we're talking about isn't as large as switching on a soldering iron, but it is there. If a waterblock is more efficient, the coolant temperature will go up.
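If it helps, here's a quick back-of-the-envelope version of the same idea in Python. It's a minimal sketch of a lumped thermal model, where heat leaves at a rate proportional to the temperature difference with ambient (standing in for the radiator/evaporation); the dissipation coefficient and wattages are made-up numbers, just to show the balance.

# Sketch of the bucket analogy: at equilibrium, heat in = heat out,
# i.e. P = hA * (T_water - T_ambient), so T_water = T_ambient + P / hA.
def steady_state_temp(heat_load_w, hA_w_per_k, t_ambient_c):
    """Water temperature once the loop settles, for a given heat load."""
    return t_ambient_c + heat_load_w / hA_w_per_k

hA = 5.0      # assumed effective dissipation coefficient, W/K
t_amb = 25.0  # ambient temperature, deg C

# e.g. before/after a more efficient block moves more heat into the water
for load in (50.0, 80.0):
    print(f"{load:5.1f} W -> {steady_state_temp(load, hA, t_amb):.1f} C")

With those numbers, bumping the load from 50 W to 80 W moves the steady-state water temperature from 35 C to 41 C. The bucket is shedding more heat than before (bigger delta-T, faster "evaporation"), but the water itself is warmer, which is exactly the point about the coolant.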