BillA-unregistered...how accurate is the XP's thermal diode?
I was curious whether you have ever tested the readings of an XP's internal diode on a motherboard equipped to read it, against the temps from your test platform on the same chip?
If so, how accurate is the XP's diode, and on what motherboard did you test it? Also, I want to ask you: in your opinion, what is the cheapest, most accurate way to measure the temp of a given CPU? And how would YOU implement such a crude way of measuring CPU temp? Thanks in advance |
I can answer this for you, as Bill and I found quite a lot of this data in spec sheets from both AMD and Intel. Let's just say it's only good enough to shut your computer off before it fries.
In their own design and tech docs they go into great detail about the compromises they made to keep costs down, and the built-in diode was one of the cost cutters. It has a ±10 °C margin of error. If that's acceptable to you, then it completely invalidates Bill's testing. This tech doc was from Intel, but I found similar data from AMD. The problem here, too, is that every CPU made has a margin of error within that range, and it could be different from one CPU to the next. Also, you don't even want to go here. I started out asking about this 6 months ago. You won't like what you find out. |
Quote:
What would be the most accurate way (without $20,000+ equipment) to measure the temps of the core >> BILL <<? I would like to get as accurate a reading as I can. And I do want to "go there" as you put it, Webmedic. That was the WHOLE point of posting! |
Get a Digitec 5810 Thermistor Thermometer and a YSI 700 (or 400?) series thermistor probe. Drill out your wb base for probe insertion. Lube generously with AS. Insert probe.
0.01 °C resolution; accuracy varies with probe model. Total cost: <$100. |
How does drilling into your waterblock base and sticking a thermal probe in it give you your CPU temperature? Given that the thermal goop joint for some blocks already forms more than half of the total thermal resistance in the system, the temperature rise your probe sees will be only 30-60% of reality, with the actual difference depending on the block in use and the flow rate.
Also, what do you do about blocks whose bases are too thin to drill into for a probe? |
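The arithmetic behind that objection can be sketched as a simple resistance ladder: a probe buried in the block base sits below the TIM joint, so it misses the joint's share of the total temperature rise. All the resistance values below are invented for illustration, not measurements of any real block:

```python
# Sketch: why an in-block probe under-reads the die-to-water temperature rise.
# The resistance values are made-up examples, not data for any real block.

def probe_fraction(r_joint, r_block, r_convection):
    """Fraction of the die-to-water temperature rise that a probe
    buried in the block base actually sees.

    r_joint: thermal goop/TIM joint resistance (C/W)
    r_block: conduction resistance through the block base (C/W)
    r_convection: block-to-water convection resistance (C/W)
    """
    total = r_joint + r_block + r_convection
    # The probe sits on the water side of the TIM joint,
    # so the joint's temperature drop never reaches it.
    return (r_block + r_convection) / total

# If the goop joint alone is half the total thermal resistance:
f = probe_fraction(r_joint=0.15, r_block=0.05, r_convection=0.10)
print(f"probe sees ~{f:.0%} of the die-to-water rise")
```

With a worse joint or a better block, that fraction slides around, which is where the "30-60% of reality" spread in the post comes from.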
If you want to test temps, the only way to do it correctly is to set up a die simulator as Bill has. If you look on eBay and get things at reasonable prices, you can get away with maybe $1000-1500. That's maybe.
Now, if you want to test CPU temps on your running system, there really is no good way. No matter which way you go, the error margins will be too high to be usable, that is, if you need this for anything remotely serious. If that is the case, build yourself the die simulator. If you really don't care about any of the above and you just want a temp to look at, just use the one built into the CPU or mobo; it's as good as anything else for that purpose. Really, trust me on this: I was working on a waterblock shootout and I researched this very thoroughly. In the end, at least for me, the expense was not worth it, and the error margin on a normal system is too high to give results worth anything. |
We've talked about this before... As I remember, the diode itself has a +/- 1 deg C accuracy, but the circuit that reads it can be off by 10 deg C, sometimes more! (am I remembering this right?)
So pHaestus has this mod, but I seem to remember that the mobo circuit can interfere with it (if it's present at all!), making it all a complicated mod. |
well, we ALL need to gang up and ride pHaestus to complete his test setup
-> one of the principal goals of which was to develop some data for a correlation between CPU #s and the 'real world' (according to BillA, of course). We won't get quite there, but we can certainly do better than where we are now. pHaestus? oh pHaestus??

To answer the original question: as webmedic stated, all the mobo stuff is 'feel good', ahem, instrumentation. The AMD diode is rendered about worthless by the inability to calibrate it. The Intel temp sensing is apparently quite good, but totally inaccessible.

Suggestion: use CPUs for computing tasks |
How do you guys feel about a thermistor epoxied to the underside of a chip? Or what about a thermistor touching the outside edge of an XP's core? Would you trust those temps to any extent? Just curious.
|
depending on the motherboard, the margin of error with those types of temp sensors is just about as bad, or maybe even worse. Asus, for instance, has very bad temp sensors; they almost always read about 10 °C too high. I'd guess they most likely did this to save the CPU from overheating, but it still does nothing to tell you the true temps.
|
a temp is a temp
but if you cannot calibrate (offset and linearity), then it is only a number. Such 'numbers' can be useful for 'A' and 'B' comparisons in the very same setup, but are quite worthless for comparing one system's components to those of another. And of course such 'numbers' should not be confused with temperatures.

In testing, calibration is everything; one MUST have a fixed (and valid) reference point |
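The offset-and-slope calibration being described is, in its simplest form, a two-point linear fit against a trusted reference. A minimal sketch, with made-up bath temperatures and raw readings:

```python
# Two-point (offset + slope) calibration sketch.
# The reference temps and raw readings below are invented examples.

def two_point_cal(raw_lo, raw_hi, ref_lo, ref_hi):
    """Return (slope, offset) mapping raw sensor readings to reference
    temps, from readings taken at two known points (e.g. a stirred bath)."""
    slope = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    offset = ref_lo - slope * raw_lo
    return slope, offset

def correct(raw, slope, offset):
    """Apply the calibration to a raw reading."""
    return slope * raw + offset

# Example: sensor reads 21.0 in an 18.0 C bath and 47.5 in a 45.0 C bath.
slope, offset = two_point_cal(21.0, 47.5, 18.0, 45.0)
print(correct(30.0, slope, offset))
```

Two points only pin down offset and gain; checking linearity (the other half of the post's complaint) takes readings at more than two reference temperatures.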
So, if I have a system set up and note current temps, then remove and replace just the waterblock with another style of waterblock, leaving all other components intact... I can reasonably assume any temp changes can be attributed to the thermal interface or actual waterblock performance?
Assuming all previous conditions were the same, this would effectively tell ME how any given cooling-loop part compares to another in cooling effectiveness. |
sure
just don't obsess on the specific 'temp' quantity, as it may be more - or less than what you are seeing |
Quote:
To get actual CPU temps, I suppose you could drill out the block so the probe would be exposed to the core itself. This could also take care of the problem of the base being too thin, since you are using the space between the substrate and the wb, as well as the space drilled out. |
The reason I asked is that I wanted a pretty accurate and cheap way to test my blocks against some commercial ones, to make sure their performance merits the $90 that BillA charges for PROPER testing and analysis (and of course for the free publicity :)
I would hate to be thinking that my block performs so much better than block "X" and have Bill tell me it's a piece of $hit and thanks for the donation, pal! I guess I'm pretty much screwed then. I was going to get an older Epox board that supports reading the XP diode, but I don't see any point based on all of your advice. And my block is definitely NOT going to have a hole drilled into it, not by me anyway! Thanks all, for the advice and insight. |
Yes, the diode readings from the motherboards themselves are highly suspect, but the AMD diode itself isn't too bad.
Progress is being made, Bill; I put a res on the Little Giant today, and fittings on all the rest of the stuff. I have a second PC for temp monitoring now, and a Morgan Duron for collecting CPU backside temps at the same time that diode temps are collected. Should hopefully get it running in the next week if nothing unforeseen occurs. |
This is such a pressing issue. I can't help but wonder how many reviews out there would be invalidated if an effective way to measure CPU temps were to arise. Maybe that is why the best way (in my opinion) is to use a CPU simulator. Think about how stupid it is: we are here trying to measure performance without doing any actual measuring (again, my opinion). At the very least, with a simulator we know (or should know) how much heat is being applied to the block, system, etc. In case anyone cares (and I doubt anyone does), I use the hole-in-the-block technique to measure my temps.
|
Not everyone stresses their CPUs using the same program. I am sure that different instructions heat the chip more or less. Also, with the cache on board, redundant or repetitive instructions will not be recalculated over and over. So, some programs designed to keep the CPU hot may not work with newer chips.
A CPU simulator seems the only way to keep a stable, even temperature. Results with a real CPU would vary as much as the testers do, because the heat load will be completely different! |
There might be a solution...
First, if the mobo reads the CPU diode, then you have to bypass that reader. Since none of us wants to start cutting traces on our mobos, we're left with one option: cut off the CPU pins for the diode and attach wires to those two pins. THEN one could build pHaestus' reader.

As for calibration, the neat thing about this diode is that it works even when the CPU is not powered. So if you're able to stabilize its temp (bad example: dunk it in water) and take an accurate temp measurement, that's your calibration.

Some notes: a) you should take this calibration measurement at several different temperatures, because the diode's response curve may not be linear. b) the accuracy of your calibration is going to depend (a lot) on the accuracy of whatever you use as the temperature reference. It is *theoretically* possible to build a reader with +/- 0.1 °C accuracy (PM me for details), but the CPU diode itself may not have this accuracy. |
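Ben's note about taking readings at several temperatures and watching for non-linearity amounts to a small least-squares fit plus a residual check. A sketch with invented data points (the diode values and bath temps below are made up):

```python
# Multi-point calibration sketch: hold the unpowered CPU at several known
# temperatures, record the diode reading at each, fit reading -> temperature,
# and inspect the residuals for non-linearity. Data points are invented.

def fit_linear(xs, ys):
    """Ordinary least-squares fit ys ~ a*xs + b (no numpy needed)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

diode = [12.4, 22.1, 31.9, 41.6]   # raw diode-reader values (assumed)
bath  = [10.0, 20.0, 30.0, 40.0]   # reference bath temps in C (assumed)

a, b = fit_linear(diode, bath)
# Large residuals at the ends would flag a non-linear diode response.
residuals = [y - (a * x + b) for x, y in zip(diode, bath)]
print(a, b, max(abs(r) for r in residuals))
```

If the residuals stay small across the range, a slope-and-offset correction is enough; if they bow in the middle, the diode response really is non-linear and two calibration points won't cut it.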
Hm, I'm glad to see more has been done with this line of thinking since I was looking around before. I'm a little skeptical, but I would really like to know the outcome of this testing myself.
|
ditto
but every time Ben says "*theoretically*" be prepared for a shunt to the dark side of the moon |
My diode calibration since May 2002.
From http://forums.overclockers.com.au/sh...5&pagenumber=2 :- "Have fitted a diode reader made by Hoot to a volt-modded ECS K7S8A motherboard to take the temperature of a Morgan 1100.* The custom-made reader has a 15" long shielded cable to the SMBus header plug, which is plugged into the header on a Celeron PC with a P3V4X motherboard. Temperatures are read using MBM5 on the Celeron PC. I checked the reader's sensitivity/calibration by pumping water at temps of 18-45 °C through a waterblock on the unpowered Morgan 1100 and compared the diode reader's temp with that of the reservoir water measured with a greenhouse/household bulb thermometer. The diode reader gave temps consistently ~1.5 °C below those of the household/greenhouse thermometer throughout the temp range. This was the case even when the reservoir water temp (18 °C by thermometer) was below ambient (23 °C). I did not compensate for the difference in reading because the greenhouse/household thermometer is of unproven accuracy." |
Quote:
It may be possible to generate equal amounts of heat with two programs, yet in one case the heat is disproportionately in the floating-point circuitry compared to the other. Does anyone even know where the temp-sensor diode is located on the die? (Central? One edge? Near what subcomponent(s) of the CPU?) |
Quote:
I've got some doc from Maxim which details a calibration procedure to get to +/- 0.1 °C accuracy on their 1-Wire line of ICs. Whether it's valid or not is up to you to judge. |
although I don't do such, I am quite interested
1) does pHaestus have these docs?
2) if not, can you e-mail or fax them to him?
If so, I would be interested in a copy as well. Thanks |
Certainly! (tmo)
|
Dallas 1-Wire isn't the same as the CPU's internal diode, as far as I know. The 1-Wire temp sensors are basically independent little digital thermometers in a typical transistor package. They work well because the sensor just needs voltage from the reader and a wire to transmit the temperature back. This means a lot of the problems associated with diode readers (trace width/length and solder points) are no longer a concern. The 1-Wire sensors I have are 0.5 °C accuracy and 0.125 °C resolution, though; I haven't seen any info on doing better than the manufacturer's stated limits on them.

FYI, my approach to dealing with diode calibration is basically to epoxy one of those 1-Wire sensors (or now, thanks to Bill, a type T thermocouple) under the center of a ceramic AMD CPU. AMD supplies equations relating the CPU die temp to the CPU backside temp for the ceramic CPUs. That relationship can be used to cross-correlate the diode.

Not so sure about soldering onto CPU pins, Ben. I know PeterNorth killed an XP's diode by soldering onto the socket with the CPU in it. Not sure how you get a perfect connection of wire to CPU pin without soldering, either. If you really wanted to get a diode reader cobbled onto a mobo that had "COP" or such, then I would pull the metal contact out of the socket and just wire-wrap the CPU pin with a thin piece of heatshrink over it. Will the motherboard still boot without a diode temp reading, though? This actually isn't a bad approach if you can get a perfect coupling of wire to pin, because you could then use a water bath to calibrate the diode. |
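The cross-correlation step can be sketched in a few lines. AMD's doc (24228.pdf, linked below) supplies the actual backside-to-die relation; the simple linear form and the resistance coefficient here are placeholders of my own, not AMD's numbers:

```python
# Sketch of cross-correlating a backside thermocouple against the diode.
# The backside-to-die relation in AMD's thermal doc is NOT reproduced here;
# this linear form with an assumed effective resistance is a stand-in.

def die_from_backside(t_backside, power, r_backside_to_die=0.08):
    """Estimate die temp (C) from a backside reading (C) and CPU power (W).
    r_backside_to_die (C/W) is an assumed placeholder coefficient."""
    return t_backside + power * r_backside_to_die

def diode_offset(diode_reading, t_backside, power):
    """How far the diode reads from the estimated die temperature."""
    return die_from_backside(t_backside, power) - diode_reading

# Hypothetical example: ~60 W Duron, backside thermocouple at 44.0 C,
# diode reader showing 47.5 C.
print(diode_offset(47.5, 44.0, 60.0))
```

Repeating that at several loads and water temps gives the correction table for the diode, which is the whole point of the backside-thermocouple rig.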
Quote:
* http://www.amd.com/us-en/assets/cont...docs/24228.pdf |
Personally, I have built a few Maxim-based diode readers...
Suggested ICs are the Maxim MAX6657/6658/6659 series: http://www.maxim-ic.com/quick_view2.cfm/qv_pk/2578

If you want to use the internal diode on a board that already has a diode reader: use desoldering braid (correct term?) on the DXP/DXN pins of your motherboard's temp-management IC to remove as much solder as possible, then put a razor blade under each pin and bend it up just a very little, only so much that the connection is gone. Now solder your wires to the back of your mobo's CPU socket. Very easy to undo... |
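Once a MAX6657-style reader is wired up, the remote-diode result comes back as two registers. The decode below assumes the LM90/MAX6657-family layout (high byte = whole degrees C, extended byte = eighths of a degree in the top three bits); check the datasheet for your exact IC before trusting it:

```python
# Sketch: decoding a MAX6657-style remote diode reading.
# Register layout is an assumption based on the LM90/MAX6657 family of
# SMBus temp sensors; verify against your IC's datasheet.

def decode_remote_temp(high_byte, ext_byte=0):
    """Combine the remote-temp high byte and extended-resolution byte
    into degrees C."""
    # High byte is a signed two's-complement whole-degree value.
    whole = high_byte if high_byte < 128 else high_byte - 256
    # Top three bits of the extended byte are 0.125 C steps.
    frac = (ext_byte >> 5) * 0.125
    return whole + frac

print(decode_remote_temp(0x2A, 0b01100000))   # 42 + 3/8 -> 42.375
```

The 0.125 °C step matches the resolution numbers quoted earlier in the thread, though resolution is of course not the same thing as accuracy.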
(C) 2005 ProCooling.com