Quote:
Originally Posted by freeloadingbum
Please clarify. Are you saying that you are not using 70 watts as a substitute for 100 watts radiated, and then creating the C/W measurement based on that simulated 100 watt load? I understand C/W fully. It's only your explanations that are confusing.
The temperatures and any differences are calculated using an actual 70W of heat.
That means you can't take a C/W, multiply it by 100W, and then add in a temperature delta of 0.5C (or whatever) that was derived at a 70W heat load. You would need to add in the delta relative to 100W, i.e. multiply that 0.5C derived at 70W by 100/70 ≈ 0.71C before folding it into your 100W prediction.
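To make the scaling concrete, here's a minimal sketch. The 70W, 100W and 0.5C figures are just the example numbers from above; everything else is illustrative:

```python
measured_watts = 70.0   # wattage the delta was actually measured at
target_watts = 100.0    # wattage we want to predict for
delta_at_70w = 0.5      # temperature difference (C) observed at 70 W

# Wrong: adding the 70 W delta straight into a 100 W prediction.
# Right: rescale the delta to the target wattage first.
delta_at_100w = delta_at_70w * (target_watts / measured_watts)
print(round(delta_at_100w, 2))  # ~0.71 C
```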
It's easier if you just work everything out at the wattage it was all measured at, combine the C/Ws there, and then multiply the final C/W by 100W to get your predicted 100W temperature rise.
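Or, doing it the simpler way described above. The 70W and 100W figures are from the post; the temperature numbers here are purely hypothetical:

```python
heat_load = 70.0          # actual measured heat load (W)
temp_rise_at_70w = 21.0   # hypothetical temperature rise above ambient at 70 W (C)
extra_delta_at_70w = 0.5  # hypothetical extra difference, also measured at 70 W (C)

# Keep everything at 70 W: combine the deltas, form a single C/W...
c_per_w = (temp_rise_at_70w + extra_delta_at_70w) / heat_load
# ...then scale that final C/W to 100 W for the predicted temperature rise.
predicted_rise_100w = c_per_w * 100.0
print(round(predicted_rise_100w, 2))  # ~30.71 C above ambient
```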
i.e. you're getting confused by mixing temperature differences derived at one wattage and applying them to temperatures calculated for a different wattage.