The purpose of testing?


Ewan
11-11-2003, 07:56 AM
I've been browsing for a few minutes and I find that there appears to be a great deal of contention as to how, who, and what people should be testing when it comes to waterblocks. This sentiment has been forming in my mind for some time, as it's been obvious that most people haven't a clue as to how to get useful water-cooling results. Or is the problem rather that waterblocks aren't what is being measured in the first place?

I've read a few reviews where seemingly top blocks do OK and lousy blocks do well. A whitewater appears inferior to a maze, etc. People comment on these forums that reviewers are fools and that their results don't reflect reality. What reality is it that they don't reflect?
In some cases (I would say most cases) the performance of a water block will vary greatly from system to system, depending largely on who mounted the block and how much thought and energy went into the mounting process.
Is the WW a better block than a Maze? Most certainly. Will your average user find a noticeable benefit (i.e. a higher overclock) using a WW instead of a "worse" block? Of that I'm not so sure.

If there are too many variables for the average tester to be able to draw reasonable conclusions, then how is a poor end user to have a chance?

If a reviewer is coming up with dubious results, should he or she be pilloried when in actual fact their results are simply a reflection of the randomness that end users will experience?

If anything I think the problem with watercooling testing is not that testers are incompetent - on the whole I find that testers are going to much greater lengths than necessary to provide results an end user would appreciate - but rather that testers are taking themselves too seriously and putting too much weight behind their own findings. I've found that even reputable testers have posted questionable data. If you can't get to a scientific level with your testing, then why pass off your testing as science?

However, just because it isn't science doesn't make the data useless.

I think reviews should better recognise this and should state so up front in their conclusions.

pHaestus
11-11-2003, 10:36 AM
However, just because it isn't science doesn't make the data useless.

If the testing isn't done scientifically then interpretation based upon the data collected is worthless. CONTROL variables as much as possible, vary one at a time, and collect many replicates. This is the basis for testing. If several parameters vary then the results are of little use. A bit of careful thought and test design can save a LOT of work. There are people who have used relatively simple tools and gotten useful numbers because they designed their tests carefully. People without any formal technical training. I don't care WHO does testing; only what comes out of their efforts.

At this time, ranking waterblock performance with in-socket thermistors is akin to determining the outside weather by sticking your tongue on a window of your house. Sure, the tongue is sensitive to changes in temperature, and sure, the window temperature is pretty strongly related to the outside temperature. And this method is just fine for your three-year-old. But for your TV meteorologist?

Hardware reviewers are supposedly the experts, right? They are supposedly more knowledgeable and more experienced than any of their readers in the topic at hand. Otherwise, why are they the reviewer instead of the reader? So if the testing of cooling gear is a technical issue (a rather difficult one, actually), then why would you not expect said reviewer to show some degree of technical expertise?

No apologies from me; I am not the "average" reader. I read reviews (when I do) with an eye to (1) How they were done and (2) How generally applicable are the results. If 6 months worth of testing culminates in a review that's ONLY relevant for the wcing system (and motherboard/CPU) the test was done in then it isn't too much use for me or anyone else.

If the error in your temperature measurements is large, then the numbers themselves aren't of much use and instead impressions of things like mounting, ease of use, and value become what is really important in the review. No complaints from me here; blocks do perform closely and so value and quality of manufacturing are important in making buying decisions. But the fact is that people DON'T base buying decisions on that part of a review. They base them upon numbers that have no statistical significance instead. And that's unfortunate. Ask the reviewer and they'll happily tell you that the numbers aren't set in stone and the tests weren't as accurate as they'd like (if they are honest). But that doesn't matter because the results are in a graph and look completely objective. Why not propagate error? That would be the honest thing to do. Or at least test three times and plot average + std deviation from mean rather than a single number.
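The "test three times and plot average plus standard deviation" suggestion is easy to sketch in code. Here is a minimal Python example; the temperature readings are fabricated for illustration, not real measurements:

```python
import statistics

# Hypothetical replicate die-temperature readings (deg C) for two blocks,
# three runs each, as suggested above. These numbers are made up.
block_a = [41.2, 41.8, 41.5]
block_b = [41.6, 42.1, 41.3]

mean_a, sd_a = statistics.mean(block_a), statistics.stdev(block_a)
mean_b, sd_b = statistics.mean(block_b), statistics.stdev(block_b)

# If the error bars (mean +/- 1 sd) overlap, the ranking between the two
# blocks is not meaningful at this measurement precision.
overlap = (mean_a + sd_a >= mean_b - sd_b) and (mean_b + sd_b >= mean_a - sd_a)
print(f"A: {mean_a:.2f} +/- {sd_a:.2f}  B: {mean_b:.2f} +/- {sd_b:.2f}  overlap: {overlap}")
```

With these (invented) numbers the error bars overlap, which is exactly the situation pHaestus describes: a single-number bar graph would declare a winner that the data cannot actually support.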

jaydee
11-11-2003, 10:38 AM
When I first joined ProCooling a few years ago I was of the same opinion most are now: that quality testing was a waste of time. I argued with BillA and pH the same way people argue with me today.

After a couple of YEARS of designing my own blocks and studying BillA's and pH's work, it all started to come together: I began to see what the possibilities of quality testing are.

1)
With quality results the end user can choose specific parts for his system and KNOW how well they will work before even buying them. Something that is not possible with these reviews' data. If we get data on each block such as head loss, how it performs at various flow rates, C/W, etc., you can make a chart and plot how well it will perform at any flow rate. Once you have that info you can look up the PQ curve of the pump and the pressure losses of tested rads, do a basic calculation of the head loss of the hosing, and build a system that will work as close to optimal as it can before you even buy the parts. All these calculations can be put into a GUI interface and made easy for the end user to work with. That's the basic answer.
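The calculation jaydee describes can be sketched in a few lines of Python. Every coefficient below is an illustrative assumption (a fabricated pump curve, loop-loss curve, and C/W fit), not measured data; the point is only the method of intersecting the curves:

```python
# Intersect a pump's P-Q curve with the loop's total pressure-drop curve to
# find the operating flow rate, then look up the block's C/W at that flow to
# predict the die-temperature rise. All curves below are made up.

def pump_head(q):
    """Pump head (m) vs flow (L/min); fabricated linear curve."""
    return max(0.0, 3.0 - 0.5 * q)

def loop_loss(q):
    """Total loop pressure drop (m) vs flow; fabricated k*Q^2 curve."""
    return 0.12 * q * q

def operating_flow(lo=0.0, hi=6.0, steps=60):
    """Bisect for the flow where pump head equals loop loss."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        if pump_head(mid) > loop_loss(mid):
            lo = mid  # pump still delivers more head than the loop eats: go higher
        else:
            hi = mid
    return (lo + hi) / 2

def cw_at_flow(q):
    """Block C/W vs flow; fabricated diminishing-returns fit."""
    return 0.15 + 0.10 / q

q_op = operating_flow()
cpu_delta = cw_at_flow(q_op) * 100.0  # predicted die rise (C) at an assumed 100 W load
print(f"operating flow ~ {q_op:.2f} L/min, predicted die rise ~ {cpu_delta:.1f} C")
```

Swap in real measured curves for the pump, rads, hosing, and block, and this is exactly the "know before you buy" prediction jaydee is after; wrap it in a GUI and it becomes the tool he describes.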

2)
Water block design. I design and build blocks, and until recently my own testing of my blocks was worse than these reviews, and I was not able to make much progress in my designs. This is changing. My test bench is getting better (even though I gave up several times already), my knowledge of how to use it is getting better, and so my results are getting better. With better results, the smaller mods I make to each design show up more clearly, so I have a better idea of what I am doing right or wrong.
Now if better results could be taken on commercial blocks, it would help the designers make better blocks, as they would know what works and what doesn't. Once again, that is the basic answer.

The problem is not too many people are up to doing this. The results from review sites are useless at best and mainly a gimmick to get hits. No flow rates, no pressure readings, unknown wattage, guesstimated C/Ws, equipment with 10C error margins... It's all useless really. I used a dart board example before and I believe it is just as accurate as these reviews' results. Take the top blocks from each manufacturer, write the name on a piece of paper, attach them to a dart board and start throwing darts. Average the results and you have your block to buy! They all perform well so you are not going to lose too much, but wouldn't you rather know that a $40 block might perform nearly as well as a $90 block and save $50 with a 0.5C-1C loss in performance?

Without quality results, reviews are nothing more than hit getters and free advertising. Even if the reviewer doesn't have that intention, that is still what it comes down to.

BillA
11-11-2003, 02:33 PM
Ewan
this is doubtless the worst forum in the world to promote 'consumer testing' to,
but such is the crux of your question

as the principal proponent of 'highly accurate' testing of WCing components, I will repeat the conclusion of an old article:

"Why bother with all this ?

While the equipment and methods described in this article may seem hopelessly complex and even impractical, one needs to consider the intended purpose and use for the data so developed. The results are engineering data, of use to systems and product designers. The independent control of the heating and cooling functions - while being able to measure the effect - is in fact the only way that a design prototype may be validated. And also the only way that a wb's performance attributes can be quantified."
etc. etc.

most on this site are quite into product design, not at all true in most other places
but since an informed decision is, to me, better than a random one; I conclude that more information is better

and the question you asked of me about rad test results earlier is illustrative
- the sole piece of equipment that I still have from when that testing was done is a Little Giant pump; everything else has been replaced, several times over
-> this is the process of improvement; in terms of the equipment capability, the knowledge of how to use it, and increased knowledge also in understanding and interpreting the results

aka progress
and I suspect you understand this after comparing the 2 articles
no, the errors of an individual do not invalidate technically driven testing methods

Ewan
11-12-2003, 01:58 AM
Cheers for your replies. I should probably do the search myself, but while the thread is still open, do you (as a website or as a collection of individuals with a common interest) have an easily referenced page where you outline what YOU think are good testing standards?
I.E. what data do you think should be gathered.
In what manner it should be gathered.
Useful equipment.
Good data presentation.

I think one of the forum members has been pushing for some sort of standard to be erected without getting much support from the forum. This would not be so much a standard as a recommendation based on your experience. Preferably also explaining how you arrived at your recommended set-up.
I expect that until a certain methodology becomes accepted and standard practice (something that won't happen unless you push for it, I think) there will continually be reviews which will be regarded as substandard. Is there such a page anywhere?

pHaestus
11-12-2003, 08:50 AM
http://www.thermal-management-testing.com/bench_vs_system_testing.htm

http://thermal-management-testing.com/methodology.htm

http://thermal-management-testing.com/waterblock_test_results.htm

Same articles as on overclockers but I can actually find them on Bill's site.

WAJ_UK
11-13-2003, 04:54 AM
Accurate testing certainly is important for development, but I see Ewan's point in questioning whether it is relevant to the people who buy the products. A large part of how a water block performs is down to how the user mounts it, as a poor installation could make a good-performing water block perform badly. Surely the mounting system is an area that can be improved upon. People don't seem to put as much effort into designing how the wb will mount as they do into the design of the heatsink within the block. I think that a consistent, easy mounting method that anyone can use and get reasonable results from would be quite a significant step in the design of water blocks.

sorry for going a bit off topic, I s'pose it should really be in the wb design forum

jaydee
11-13-2003, 10:52 AM
Originally posted by WAJ_UK
Accurate testing certainly is important for development, but I see Ewan's point in questioning whether it is relevant to the people who buy the products. A large part of how a water block performs is down to how the user mounts it, as a poor installation could make a good-performing water block perform badly. Surely the mounting system is an area that can be improved upon. People don't seem to put as much effort into designing how the wb will mount as they do into the design of the heatsink within the block. I think that a consistent, easy mounting method that anyone can use and get reasonable results from would be quite a significant step in the design of water blocks.

sorry for going a bit off topic, I s'pose it should really be in the wb design forum
Unfortunately there are only so many ways to mount a water block: through the holes or with a clip. Mounting advancements will have to come from the mobo/CPU manufacturers' side, not block designers. We just have to work with what they give us.

Also, if you mount the block wrong, chances are the effect will be smaller if the block is a better performer. Without knowing what is good and what is bad, the consumer has no clue. With the recent reviews I have seen, the consumer is less informed AFTER reading the review than before. Such as the recent Liquid Ninja review.

If you're looking for convenience you can forget the dart board method I proposed and just drop the names in a hat and blindly pick one out. That will take less effort and be just as good as the reviews of late.

IMO of course.

Ewan
11-14-2003, 02:17 AM
Thanks pHaestus for the links.

A large part of how a water block performs is due to how the user mounts it as a poor installation could make a good performing water block perform badly. This has been one of my concerns as well. I have a supposedly good CPU cooler but get high temps for a computer, which I can only put down to my lousy installation. Though I've tried remounting it a couple of times without things improving, so maybe it's the cooler after all. However, in one of the links above BillA describes mounting methodology thoroughly. Following those guidelines should minimise the installation effect. It's a pain in the arse procedure though, so I doubt many follow it.

IMO of course. That's what I came here for ;)

WAJ_UK
11-14-2003, 07:04 AM
I agree jaydee, I wasn't trying to suggest that those kinds of reviews were of any use to anyone. More that anyone who is installing a waterblock could also get the same sort of range of errors that those reviewers do. They seem to be quite tricky to fit correctly (from what people say; I have no experience myself) and any air between the die and block would make a huge difference. I think 1 micron of air has about the same thermal resistance as 40mm of copper (just using numbers in my head).
It's a shame I can't think of a better way of using those 4 holes or the socket lugs, or a better way to position the cooler on the die :confused:
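For what it's worth, that back-of-the-envelope number can be checked. A slab's thermal resistance per unit area is thickness divided by conductivity; assuming handbook values of roughly 0.026 W/(m·K) for still air and 400 W/(m·K) for copper, a quick Python calculation gives:

```python
# Copper thickness with the same thermal resistance as a 1 micron air gap.
# Conductivities are approximate handbook values, assumed here.
k_air, k_cu = 0.026, 400.0   # W/(m*K)
t_air = 1e-6                 # 1 micron air gap, in metres

r_air = t_air / k_air        # resistance per unit area (m^2*K/W)
t_cu_equiv = r_air * k_cu    # copper thickness with equal resistance (m)
print(f"1 um of air ~ {t_cu_equiv * 1000:.1f} mm of copper")
```

That comes out to roughly 15 mm of copper per micron of air, a bit under the 40 mm guessed above but the same order of magnitude, and it makes the point either way: even a microscopic air gap dwarfs the resistance of the block's baseplate.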

Maybe something useful will come out of the big effort to make the mounting in bench testing more accurate. Something that could be transferred across to a computer system

Since87
11-14-2003, 08:23 AM
Originally posted by Ewan
I have a supposedly good CPU cooler but get high temps for a computer, which I can only put down to my lousy installation. Though I've tried remounting it a couple of times without things improving, so maybe it's the cooler after all.

How do you know what temperature your CPU is at?

The thermometers on motherboards (including those that use the on die sensor) have very questionable accuracy.

BillA
11-14-2003, 09:44 AM
very few users install a wb correctly, so it is to be expected that their 'reports' re wb performance are bunk
- and I suspect the 'reviewers' are little better

here is the very simple test to evaluate the 'quality' of the wb's mounting:

with the case in its normal position, unfasten the wb's retention system
- where is the wb ?
if it is still sitting correctly on top of the CPU, congrats - you have a good hose installation

I’ve said this many times: the coolant hoses must be 'externally supported' so that they do not affect the wb's mounting
- these bending moments are why users get such wild results,
they just don't know what they are doing, or don't care enough to do it correctly

it is NOT easy, I have a WW in a P60 case and I had to resort to copper tubing supported from the case frame
quite a chore