Quote:
You are going to infer the temp? At what resolution? As an example, will it be 70 °C or 70.1 °C or 70.01 °C? Even if you place sensors in the riser as you plan, there is still the calibrated accuracy of the probes to be figured in. For example, even with equipment capable of 0.01 °C resolution, there is a ±0.05 °C accuracy to be figured in, and doubling the probes doubles this problem.
It's not difficult to calculate. Let's start with round numbers:
- A heat die of 100 mm²
- 100 Watts load
That makes a heat density of 1 W/mm².
With copper's conductivity of 0.386 W/(mm·K) we have a gradient of 2.59 °C for each mm of length. With probes 10 mm apart there is a delta of 25.9 °C. If you use two probes with an accuracy of 0.1 °C each, for an accumulated error of 0.2 °C, you get a power measurement accuracy of 0.77%. That's not bad at all, considering that when monitoring input power only you have to guess how much your secondary heat loss is.
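The arithmetic above can be sketched in a few lines. This is just the worked example with the same assumed round numbers (100 W, 100 mm² die, 0.386 W/(mm·K) for copper), not measured data:

```python
K_COPPER = 0.386  # thermal conductivity, W/(mm*K)

def flux_error(power_w, area_mm2, k, probe_spacing_mm, probe_accuracy_c, n_probes=2):
    """Return (delta T between probes, worst-case relative error of the reading)."""
    q = power_w / area_mm2                 # heat flux through the die, W/mm^2
    gradient = q / k                       # temperature gradient, C/mm
    delta_t = gradient * probe_spacing_mm  # temperature difference across the probes
    worst_case_error = n_probes * probe_accuracy_c
    return delta_t, worst_case_error / delta_t

dt, err = flux_error(100, 100, K_COPPER, 10, 0.1)
print(f"delta T = {dt:.1f} C, error = {err:.2%}")  # ~25.9 C, ~0.77%
```

Swapping in the brass numbers from the next example (k ≈ 0.120 W/(mm·K), 0.05 °C probes) reproduces the 83.3 °C delta and 0.12% error as well.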
Calibration of the two probes is easy: in cold, steady state both should read the same temperature.
Applying these calculations to Robotech's die simulator, we can see a real-world case:
Assume again he's using a 1 W/mm² heat density. Given that he's using a brass die (I'll guess 0.120 W/(mm·K)?), the gradient would be 8.3 °C/mm. Over a distance of 10 mm the delta is 83.3 °C. He's using probes with 0.05 °C accuracy, so his error margin is 0.12%.
By choosing brass for the heat die the gradient increases a lot, which improves the accuracy of the reading, but that has a dangerous side effect: the base of the simulator is going to be at burning temperatures! Guessing again a 1 W/mm² heat load, and that his brass die has a height of about 20 mm: if he tests a cheap heatsink that's only capable of keeping the die surface at 60 °C, the base of the brass die is going to be at 227 °C! Things are going to be even worse than that at the cartridges in the aluminium base.
Robotech: if I were you, I'd reconsider the use of brass, or at least greatly reduce the height of the thin part.