
Say we have an interrupt that is generated once each time 1024 bytes of network traffic arrive. Each interrupt takes 3.5 microseconds to process, and the network speed is 100 Mb/s. We want the amount of CPU time used per second.

Is it correct that:

1 interrupt     3.5e-6 seconds     1.25e7 bytes     3.4e-9 seconds     1.25e7 bytes
-----------  x  --------------  x  ------------  =  --------------  x  ------------  ≈  0.043
1024 bytes       1 interrupt        1 second            1 byte           1 second
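
As a quick sanity check, here is the same unit analysis as a minimal Python sketch (the variable names are my own, chosen for illustration):

    seconds_per_interrupt = 3.5e-6              # 3.5 us of CPU per interrupt
    bytes_per_interrupt = 1024                  # one interrupt per 1024 bytes
    bytes_per_second = 100e6 / 8                # 100 Mb/s link -> 1.25e7 bytes/s

    seconds_per_byte = seconds_per_interrupt / bytes_per_interrupt  # ~3.4e-9
    cpu_seconds_per_second = seconds_per_byte * bytes_per_second

    print(cpu_seconds_per_second)               # ~0.0427, about 4.3% of one CPU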

2 Answers


In order to calculate CPU usage per second, we need a clear definition of what that means. The only sensible way to define it is in terms of the number of instructions the CPU can execute in a second. You then need to know how many instructions your application is trying to execute per second and divide that by the CPU's capacity.

For example, if the CPU can execute 1 million instructions per second and your application executes 500K instructions in 2 seconds, your program used 25% of the CPU during those 2 seconds.
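
That arithmetic as a trivial Python check (values taken from the example above):

    cpu_capacity = 1_000_000     # instructions the CPU can execute per second
    app_instructions = 500_000   # instructions the application executed
    elapsed_seconds = 2

    usage = (app_instructions / elapsed_seconds) / cpu_capacity
    print(f"{usage:.0%}")        # 25%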

The details you have here about bandwidth and interrupts tell you nothing about how many instructions this application executes per second. Either you are trying to calculate something other than CPU usage, or you don't have enough information.


I come up with the same result:

(100E6 / 8 / 1024) * 3.5E-6 = 0.0427 CPU seconds per second.
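
The same figure, spelled out step by step in Python (variable names are mine):

    bytes_per_second = 100e6 / 8                     # 1.25e7 bytes/s
    interrupts_per_second = bytes_per_second / 1024  # ~12207 interrupts/s
    cpu_load = interrupts_per_second * 3.5e-6        # CPU seconds per second
    print(cpu_load)                                  # ~0.0427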
