
How can I determine the theoretical peak power cost per hour for my computer? What formula can I use to calculate this? Do I need to know the amperage?

My electricity rate is $0.11/kWh, and I have:

  • 1200W power supply in my computer
  • 2 - 110W 24" monitors
  • 1 - 300W speaker system
NDBoost
  • I'm going to guess with music playing, all monitors on, computer being actively used, you're drawing 350W. Please let me know after you measure it :) – Bryan Boettcher Jun 04 '12 at 17:07
  • I modified my question; I'm asking for the theoretical peak power consumption. What's the formula used to calculate this? – NDBoost Jun 04 '12 at 17:16
  • 3
    (1200 + (2 * 110) + 300) / 1000 * 0.11 * hours. Add all wattages, put wattages to kilowatts, multiply by cost per hour, multiply by hours. – Bryan Boettcher Jun 04 '12 at 17:17

2 Answers


Simplest solution: Kill-A-Watt power meter.

Plug your power strip into it and let it calculate the power consumption for you. The best part is that it carries over directly: it will let you monitor the power usage of other devices that you don't have nearly as much info on as you do your computer.

As for the theoretical peak consumption, and subsequently the hourly cost:

Peak power consumption: sum of the power draw of all devices

Hourly cost at peak usage: peak power consumption (in kW) * electricity rate ($/kWh)

So, for your case: the peak power consumption would be roughly 1200 W + 110 W + 110 W + 300 W. That comes out to 1,720 watts, or 1.72 kilowatts. Multiply that by your electricity rate, and you get a cost of ~$0.19/hour to run your system at full bore.
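The arithmetic above can be expressed as a tiny helper. This is a minimal sketch: the wattages and rate are the figures from the question, and the function name is just illustrative.

```python
def peak_hourly_cost(wattages, rate_per_kwh):
    """Sum the device wattages, convert to kilowatts, multiply by the rate."""
    peak_kw = sum(wattages) / 1000
    return peak_kw * rate_per_kwh

# 1200 W PSU + two 110 W monitors + 300 W speakers, at $0.11/kWh
print(f"${peak_hourly_cost([1200, 110, 110, 300], 0.11):.2f}")  # $0.19
```

Multiply the result by the number of hours the machine runs to get a total cost.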

Toby Lawrence
  • Although great, I am also concerned about the maths behind it. No sense in buying a calculator if you can't do simple addition without one in the first place, right? ;) – NDBoost Jun 04 '12 at 16:58
  • As Brian mentioned in his answer, your system is likely never running at full throttle all the time. That's the only time you'd be able to do the "simple addition" to calculate the hourly cost... and it'd only be the hourly cost for running at full power. Otherwise, your system is pulling a constantly-varying amount of power, which is where the Kill-A-Watt power meter becomes very useful, since it can measure and add up that power usage for you automatically. – Toby Lawrence Jun 04 '12 at 17:12
  • It also takes a LOT of hardware to draw 1200W on a computer. We're talking overclocked CPU, a dozen 15k drives, AND multiple dual-slot GPUs. Nearly all of those PSUs are snake oil. – Bryan Boettcher Jun 04 '12 at 17:24
  • Thanks, I know the consumption figure is most likely inaccurate, but it at least gives me background knowledge of how it is calculated. I have a lot of hardware: a 10-drive RAID 5 array, dual GTX 480s, an i7 920 OC'd to 4.5 GHz, plus liquid cooling (fans, pumps, etc.). The system is used for video editing, 3D modelling and gaming. – NDBoost Jun 04 '12 at 17:24
  • 1
    Yup. The theoretical peak power draw is super super simple but useless 99% of the time. :) – Toby Lawrence Jun 04 '12 at 17:27
  • Holy crap @Mike, you're probably up to 700W of draw then. Kudos, a very rare sight! – Bryan Boettcher Jun 04 '12 at 18:06

While I would do what @Toby Lawrence says, for the peak you would just add up the wattages in your question. However, the power supply won't draw the full 1200 W if the components in the computer don't need that much, so that is the conservative number.

In practice, it is likely that neither the power supply nor the speakers are drawing that much. To measure the actual maximum, you would need to make sure everything is going at full tilt; games are good for that.

Brian Carlton
  • What is the mathematics used to determine the cost, though? I understand @Toby's answer and that's the route I'll go, but I'd still like to know how one could theoretically calculate the cost/usage. – NDBoost Jun 04 '12 at 17:17