Computers and electricity bills/kWh
I have my Athlon CPU running at 218x11 at 2.03V. I'm guessing that it produces about 120 watts of heat. Does that mean I'm actually using 120 watts to power the CPU? Does adding 0.4 volts to your CPU make a difference in your power bill? I'm trying to figure out why our electricity bill has gone up a lot.
|
The amount of heat produced by your system is equal to (well, actually slightly less than) the power it draws. The easiest way to estimate how many kilowatt-hours your system will consume per month is to multiply your PSU wattage by 0.72:

24 (hours per day) x 30 (days per month) / 1000 (watts to kilowatts) x power supply wattage = kWh per month

Of course, if you're maxing out your power supply it won't last very long, but in my book it's always better to overestimate when it comes to cost. It will also be less if you don't run your system 24/7.
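To sanity-check that formula, here's a quick Python snippet (the 400 W supply and $0.10/kWh rate are just example figures, plug in your own):

Code:
def monthly_kwh(psu_watts, hours_per_day=24, days_per_month=30):
    # kWh per month: watts x hours / 1000 (watts -> kilowatts)
    return psu_watts * hours_per_day * days_per_month / 1000

psu_watts = 400                  # example 400 W power supply
kwh = monthly_kwh(psu_watts)     # 400 x 720 / 1000 = 288 kWh
cost = kwh * 0.10                # at an assumed $0.10 per kWh
print(f"{kwh:.0f} kWh/month, about ${cost:.2f}")
|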
Quote:
What I'm really trying to get at is that when we overclock, we raise voltages by no more than about 0.4 volts. Let's say I have 2 computers with the exact same components, one overclocked, the other one isn't. Will the overclocked computer with the CPU at 0.4 volts higher consume a lot more power than the stock computer, assuming both computers are used exactly the same way at the same time? 0.4 volts doesn't sound like much. Off topic: I just saw your sig and I also have the NF7-S Rev2, what's your FSB at? What's the stock speed and volts of your 9600 Pro? |
Total power consumption = total heat produced by the computer. Remember, energy is neither created nor destroyed.
As for your power consumption, I'd guess you're drawing about 200-250 W. That means your computer will burn through a kWh of electricity every 4-5 hours.
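That 4-5 hour figure falls straight out of the definition of a kilowatt-hour (1000 watts for one hour):

Code:
# How long until the system has drawn one kilowatt-hour?
for watts in (200, 250):
    hours = 1000 / watts  # 1 kWh = 1000 watt-hours
    print(f"{watts} W -> 1 kWh every {hours:.1f} hours")
# 200 W -> 1 kWh every 5.0 hours
# 250 W -> 1 kWh every 4.0 hours
|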
If your CPU is giving out 120 W of heat, chances are it's using at least that much electrical energy. As pdf27 stated, energy is neither created nor destroyed, and since the CPU produces essentially no sound or light, all the energy going in comes out as heat.
A .4v bump can increase heat output so much for two reasons. First, the CPU runs at 1.65v at spec, so a .4v increase might seem small in our heads, but to the CPU it's a 24% increase in voltage! Second, and this is the main reason the CPU gets so hot when you bump up the voltage, you're also trying to raise its frequency. You don't raise CPU voltage for no good reason; you're trying to make the CPU do more work. So of course it's going to get hotter and draw more current (think of current as the amount of 'juice', and voltage as the size of each glass). More current means more heat, since Power (watts) = Current x Voltage, and in fact for CMOS chips the current drawn rises roughly in proportion to the voltage, so power goes up roughly with the square of the voltage. Since you've already raised the voltage 24%, the power consumed (and given out as heat) rises considerably.

This link is useful for calculating the cost of running your PC: http://www.ukpower.co.uk/running-costs-elec.asp
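To put numbers on it, here's a minimal sketch using the standard first-order rule that CMOS dynamic power scales with voltage squared times frequency (P ~ C x V^2 x f). The 65 W / 2000 MHz stock baseline is an assumption for illustration, not a figure from this thread:

Code:
def scaled_power(p_stock, v_stock, v_oc, f_stock, f_oc):
    # Dynamic CMOS power: P ~ C * V^2 * f, so scale stock power
    # by (voltage ratio)^2 * (frequency ratio).
    return p_stock * (v_oc / v_stock) ** 2 * (f_oc / f_stock)

# Overclock from this thread: 218 x 11 = 2398 MHz at 2.03 V.
# Assumed stock baseline: 65 W at 2000 MHz / 1.65 V (illustrative only).
p = scaled_power(65, 1.65, 2.03, 2000, 2398)
print(f"Estimated overclocked CPU power: {p:.0f} W")  # ~118 W, near the 120 W guess
|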