I was learning about PWM circuits and something confused me:
If you have 1 V going into a 1 ohm load for 1 hour, we get: 1 V * 1 A * 1 hour = 1 watt-hour.
If you have 1 V going into a 1 ohm load for 30 minutes, we get: 1 V * 1 A * 0.5 hours = 0.5 watt-hours.
What if we had 1 V at a 50% duty cycle going into a 1 ohm load for 1 hour?
My intuition tells me it should be the same as running the 1 V at 100% duty cycle for 0.5 hours and at 0% for 0.5 hours: 1 V * 1 A * 0.5 hours + 0 V * 0 A * 0.5 hours = 0.5 watt-hours.
However, another way to look at it is to average the waveform first: the average voltage is 0.5 V, which would drive 0.5 A through the 1 ohm load, giving 0.5 V * 0.5 A * 1 hour = 0.25 watt-hours.
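To make the two viewpoints concrete, here is a quick Python sketch I put together (the variable names are my own, and it assumes an ideal square wave with instant edges and a purely resistive 1 ohm load) that works each calculation out numerically:

```python
# Compare the two ways of computing energy for a 1 V, 50% duty cycle PWM
# signal driving a 1 ohm resistive load for 1 hour.
# Assumption: ideal square wave (instant edges), purely resistive load.

V_ON = 1.0    # volts while the PWM output is high
R = 1.0       # load resistance in ohms
DUTY = 0.5    # 50% duty cycle
HOURS = 1.0   # total run time in hours

# Viewpoint 1: treat it as full power for half the time.
# Power is V^2 / R while the output is high and 0 while it is low,
# so only the "on" fraction of the hour contributes energy.
on_time = DUTY * HOURS
energy_on_time = (V_ON ** 2 / R) * on_time        # watt-hours

# Viewpoint 2: average the voltage first, then apply Ohm's law.
v_avg = DUTY * V_ON                               # 0.5 V average
i_avg = v_avg / R                                 # 0.5 A average
energy_from_averages = v_avg * i_avg * HOURS      # watt-hours

print(f"Full power for the on-time:   {energy_on_time} Wh")        # 0.5
print(f"Averaged V times averaged I:  {energy_from_averages} Wh")  # 0.25
```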
Which is correct?