Thanks for considering my opinion, BigBen.

OK here goes:
Assume that the dissipation of heat from the water to the radiator is proportional to the difference in temperature between the water and the radiator (and the air flowing over the fins). If one accepts this, then the heat dissipation would increase with higher water temps in the radiator and with lower air temps flowing over the radiator fins.
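Just to put rough numbers on that assumption, here's a quick Python sketch. The UA value and the temps are made up for illustration, not measurements from any real radiator:

UA = 800.0                            # assumed radiator conductance, W/K

def heat_rejected(T_water, T_air):
    # heat leaving the coolant is taken as proportional to the water-air temp difference
    return UA * (T_water - T_air)     # W

print(heat_rejected(90.0, 30.0))      # 48000.0 W
print(heat_rejected(100.0, 30.0))     # 56000.0 W  (hotter water -> more heat)
print(heat_rejected(90.0, 20.0))      # 56000.0 W  (cooler air -> more heat)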
Consider the steady state of a simplified system, where the coolant temp is only a function of position in the cooling loop and not time dependent. A lower coolant flow should produce a larger difference between inlet and outlet coolant temps in the radiator, and a lower total heat dissipated by the radiator: the coolant cools off more as it passes through the core, so the average water-air temperature difference driving the heat transfer is smaller.
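Here's a rough Python sketch of that steady-state argument. It uses a simple effectiveness-NTU model and treats the air over the fins as staying at a roughly constant temperature; the UA, flow rates, and temps are assumed values chosen just to show the trend, not data from a real system:

import math

UA    = 800.0        # assumed overall radiator conductance, W/K
cp    = 4186.0       # specific heat of water, J/(kg*K)
T_in  = 95.0         # assumed coolant temp entering the radiator, C
T_air = 30.0         # assumed air temp over the fins, C

for m_dot in (0.5, 1.0, 2.0):                  # coolant mass flow, kg/s (assumed)
    NTU = UA / (m_dot * cp)                    # number of transfer units
    eff = 1.0 - math.exp(-NTU)                 # effectiveness with constant air temp
    Q   = eff * m_dot * cp * (T_in - T_air)    # total heat rejected, W
    dT  = Q / (m_dot * cp)                     # inlet-outlet coolant temp drop, K
    print(f"flow {m_dot:.1f} kg/s: Q = {Q:.0f} W, coolant dT = {dT:.1f} K")

With these made-up numbers, halving the flow roughly doubles the coolant temp drop across the radiator but somewhat reduces the total heat rejected, which is the trend I'm describing above.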
I'll finish my thoughts after I get back, but I'd be curious what other people think.
Thanks