determining overall C/W with idle/load temps?
ok, so motherboard measurements are terrible for ABSOLUTE temps,
but they seem to be decent for relative temps... that is, if you raise the ambient by 3C, they increase about 3C. Would it be unreasonable to measure the difference between idle/load temps, and from that calculate the difference in watts of heat output of the CPU?
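The arithmetic behind the question can be sketched like this. Assuming the onboard sensor tracks relative changes accurately, and assuming you already know the overall thermal resistance of the loop (the 0.25 C/W here is a made-up example value), the extra heat at load is just the temp delta divided by C/W:

```python
def delta_power_watts(t_idle_c, t_load_c, c_per_w):
    """Estimate the change in CPU heat output between idle and load.

    Assumes the onboard sensor is accurate for *relative* temps, and
    that the overall C/W of the cooling setup is known (0.25 C/W is a
    hypothetical value, not from any real measurement).
    """
    delta_t = t_load_c - t_idle_c
    return delta_t / c_per_w

# Hypothetical readings: 32 C idle, 47 C load, assumed 0.25 C/W loop.
print(delta_power_watts(32.0, 47.0, 0.25))  # 60.0 W of extra heat at load
```

Going the other way (known idle/load power delta, solve for C/W) is the same equation rearranged.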
Quote:
If you are planning on doing this, then try to calibrate the on-board sensor with a TC (thermocouple) or something "good", or you may spend a lot of time collecting useless data. Note this is my own half-assed testing, so who knows. Better safe than sorry though.
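The calibration the quote suggests could be sketched as a curve fit of onboard readings against trusted thermocouple readings. All the data points below are made up for illustration; a quadratic fit is used because the thread's point is that the sensors are not linear:

```python
import numpy as np

# Hypothetical calibration data: onboard sensor reading vs. a trusted
# thermocouple reading at the same spot (values are invented).
onboard = np.array([25.0, 30.0, 35.0, 40.0, 45.0, 50.0])
thermocouple = np.array([26.5, 30.8, 35.4, 40.3, 45.6, 51.2])

# Fit a quadratic correction curve: TC temp as a function of onboard temp.
coeffs = np.polyfit(onboard, thermocouple, deg=2)
correct = np.poly1d(coeffs)

print(correct(37.0))  # corrected estimate for an onboard reading of 37 C
```

Once you have `correct()`, you run every logged onboard reading through it before doing any C/W math.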
There's plenty of evidence in addition to Jaydee's that onboard sensors are distinctly non-linear.
I sit corrected, then. Just thought I'd throw it out there.
Funny, ABIT R&D was just looking into electrical resistance vs. temperature this past week. I guess someone on their BIOS team finally realized that it is NOT linear but a curved relationship.
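That curved resistance-vs-temperature relationship is exactly why a linear BIOS lookup drifts. For an NTC thermistor it's commonly modeled with the Beta equation; the constants below (10k at 25C, B=3950) are typical example values, not from any specific board:

```python
import math

def thermistor_temp_c(r_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Convert an NTC thermistor resistance to temperature using the
    Beta equation: 1/T = 1/T0 + ln(R/R0)/B, with T in kelvin.

    r0, t0_c and beta are assumed example constants (10k @ 25 C,
    B = 3950); real values come from the sensor's datasheet.
    """
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohms / r0) / beta
    return 1.0 / inv_t - 273.15

print(thermistor_temp_c(10_000.0))  # 25.0 C at the reference resistance
```

Resistance falls as temperature rises, and not in a straight line, so a BIOS that interpolates linearly between two points will be off everywhere except near those points.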