10-22-2005, 06:09 PM   #3
Cathar
Thermophile
 
Join Date: Sep 2002
Location: Melbourne, Australia
Posts: 2,538

Rather than suggest that the outlet temp is superior, I'll just explain the reasons why I prefer it.

It really comes down to the similarity between this application and the way PCs work, and thus the convenience for end-users in interpreting the data.

By measuring performance relative to the radiator outlet temperature, the heat load (a constant, measured in Watts), and the flow rate (air and water), we are directly simulating what occurs in PCs as the flow rate is varied. That gives a direct 1:1 relationship between the flow rate and the temperature that comes out of the radiator, and hence into the waterblock.

Waterblocks have their C/W measured relative to the heat load and the inlet water temperature. This is fine and the way it should be, because the waterblock is where the heat is being added. The problem is: how can the average user predict what that inlet water temperature will be, given some radiator, a fan, and a pump?
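
To make the arithmetic concrete, here's a minimal sketch of how I picture the end-user combining an outlet-referenced radiator figure with a waterblock's C/W. The C/W relation is the standard one described above; the function names and example numbers are just made up for illustration.

```python
# Minimal sketch: estimating CPU temperature from an outlet-referenced
# radiator figure plus a waterblock C/W rating. The names and example
# numbers are purely illustrative, not measured data.

def cpu_temp_estimate(rad_outlet_temp_c, waterblock_cw, heat_load_w):
    """CPU temp ~= waterblock inlet temp + C/W * heat load.

    Because the radiator figure is referenced to its *outlet*, that
    number already is the waterblock inlet temperature -- no extra
    conversion step is needed.
    """
    return rad_outlet_temp_c + waterblock_cw * heat_load_w

# Example: radiator outlet of 32.0C at the user's flow rate,
# waterblock rated at 0.15 C/W, CPU dissipating 100W.
print(cpu_temp_estimate(32.0, 0.15, 100.0))  # -> 47.0C
```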

It just seems fairly obvious to me. We know the heat load (well, we don't really, but it is a constant given a fixed CPU program load, and can be estimated fairly well), we know the flow rate, either through measurement or through prediction based upon the P-Q curves of the loop components, and we know the air-flow through the radiator by whatever means.
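
For what it's worth, predicting the flow rate from the P-Q curves comes down to finding where the pump's head curve crosses the loop's resistance curve. Here's a rough sketch of that, assuming a quadratic loop-resistance model; the pump-curve coefficients and resistance constant are invented for illustration.

```python
# Rough sketch of flow-rate prediction from P-Q curves: find the flow
# where the pump head equals the loop pressure drop. The pump-curve
# coefficients and loop resistance constant are invented for illustration.

def pump_head_kpa(q_lpm):
    # Hypothetical quadratic fit to a pump's published P-Q curve.
    return 30.0 - 0.5 * q_lpm - 0.8 * q_lpm ** 2

def loop_drop_kpa(q_lpm, k=1.5):
    # Loop resistance modelled as dP = k * Q^2 (a common approximation).
    return k * q_lpm ** 2

def operating_flow_lpm(lo=0.0, hi=10.0, iters=60):
    """Bisect for the flow rate where pump head equals loop pressure drop."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if pump_head_kpa(mid) > loop_drop_kpa(mid):
            lo = mid  # pump still has head in hand -> operating point is at higher flow
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(operating_flow_lpm(), 2))  # predicted loop flow rate in L/min
```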

Given all that, it seemed most useful to me to simply state what the resultant radiator outlet temperature would be, as this provides the value of most direct use to the end-user, without them having to work out the temperature drop of the water across the radiator, which is what they would have to do if we measured the water temperature at the inlet.

It is clear that as the flow rate varies, the water temperature entering the radiator is going to be warmer or cooler as the flow rate is dropped or increased, and this doesn't present a nice fixed relationship between incoming air and incoming water. However, if we measure at the inlet we have to raise or lower the heat load to maintain a fixed dT between the water-in and air-in, and in doing so we are changing the picture. The CPU does not increase or decrease its heat load as the flow rate or air-flow rate changes, so I don't understand why we should be quantifying radiator performance with respect to a variable heat load. It's not how CPUs work, and it presents a figure that is less useful to the end-user without them first munging the values to compensate for the thermal mass flow rate of the liquid.
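
That "munging" is just the water's temperature drop across the radiator from the heat load and the thermal mass flow rate. A sketch of what an inlet-referenced figure would force onto the user, assuming plain water (the specific heat and density are standard textbook values; the function names are mine):

```python
# Sketch of the compensation an inlet-referenced figure would require:
# the water temperature drop across the radiator from heat load and
# flow rate. Assumes plain water; c_p and density are textbook values.

WATER_CP_J_PER_KG_K = 4186.0    # specific heat of water
WATER_DENSITY_KG_PER_L = 0.998  # near room temperature

def radiator_water_drop_c(heat_load_w, flow_lpm):
    """dT = Q / (m_dot * c_p), with flow converted from L/min to kg/s."""
    m_dot_kg_s = flow_lpm * WATER_DENSITY_KG_PER_L / 60.0
    return heat_load_w / (m_dot_kg_s * WATER_CP_J_PER_KG_K)

def waterblock_inlet_from_rad_inlet(rad_inlet_temp_c, heat_load_w, flow_lpm):
    # With an inlet-referenced figure, the user must subtract the drop
    # themselves to get the temperature actually entering the waterblock.
    return rad_inlet_temp_c - radiator_water_drop_c(heat_load_w, flow_lpm)

# Example: 100W at 4 L/min -> roughly a 0.36C drop across the radiator.
print(round(radiator_water_drop_c(100.0, 4.0), 2))
```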

I do also understand the reasons for wanting to keep the air/water delta constant, purely because this removes an extra layer of variability with the test equipment. If the test equipment is accurate to +/-0.1C, say, and the outlet temperature across the full range of test conditions ranges from 1C to 10C, then we have a variable margin of error.
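
To put rough numbers on that, using the figures above:

```python
# The same +/-0.1C instrument error is a very different *relative* error
# depending on how large the measured temperature delta is.
instrument_error_c = 0.1
for delta_c in (1.0, 2.0, 5.0, 10.0):
    print(f"dT = {delta_c:4.1f} C -> relative error ~ {100 * instrument_error_c / delta_c:4.1f}%")
# ~10% of the reading at a 1C delta, but only ~1% at a 10C delta.
```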

At the end of the day, I do feel that the average user is prepared to accept values with a varying margin of error that approximate their waterblock inlet temperature, rather than be presented with C/W figures that have a fixed margin of error but then require them to do thermal mass flow rate compensation to determine the waterblock inlet temperature.

So that's why I prefer the radiator outlet temperature. Just my 2c.