
I understand that if supply and demand are not in equilibrium, synchronous generators make up the difference by increasing or decreasing their rotational energy - thus changing the electrical frequency in the grid. It is usually noted that a frequency shift $\geq 1\%$ is unacceptable and might damage the grid; however, I cannot find anywhere what in particular such a frequency shift can damage. Which components of the grid are the ones we are most afraid of getting damaged?

I am also curious what the usual methods of mitigating such situations are. If I understand correctly, when the generated power is much higher than the demand, the power stations disconnect and a blackout occurs (e.g. on a very windy night in a country powered largely by wind power stations). Similarly, when the demand is much higher than the supply, some area blackouts are purposely introduced to decrease the load.

Lastly, are there currently any methods - beyond classical generators - that are widely used to increase the grid inertia?

Akerai

1 Answer


A 1% shift in grid frequency will shift the running speed of every AC induction motor connected to it by roughly 1%. Factory machinery and AC-powered clocks would all slow down, with potentially disastrous effects in those factories, and serious upsets would occur in any setting where event timing relied on AC synchronous motor-driven clocks.
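The scale of that effect is easy to work out. As a minimal sketch (the 60 Hz grid and the 4-pole motor below are assumed example values, not from the answer), synchronous speed is $n = 120f/p$, and a synchronous-motor clock drifts in direct proportion to the frequency error:

```python
# Sketch (assumed example values): effect of a 1% under-frequency
# condition on AC motor speed and on a synchronous-motor clock.

F_NOMINAL = 60.0  # Hz, nominal grid frequency (assumed: a 60 Hz grid)
POLES = 4         # assumed pole count for an example induction motor


def synchronous_speed_rpm(freq_hz: float, poles: int) -> float:
    """Synchronous speed of an AC machine: n = 120 * f / p."""
    return 120.0 * freq_hz / poles


def clock_drift_seconds_per_day(freq_hz: float,
                                nominal_hz: float = F_NOMINAL) -> float:
    """A synchronous-motor clock runs fast or slow in proportion
    to the relative frequency error, accumulated over 86400 s."""
    return 86_400.0 * (freq_hz / nominal_hz - 1.0)


f_low = F_NOMINAL * 0.99  # a 1% under-frequency condition

print(synchronous_speed_rpm(F_NOMINAL, POLES))  # 1800.0 rpm at 60.0 Hz
print(synchronous_speed_rpm(f_low, POLES))      # 1782.0 rpm at 59.4 Hz
print(clock_drift_seconds_per_day(f_low))       # -864.0 s/day, ~14.4 min slow
```

So a sustained 1% under-frequency would make a wall clock lose about a quarter of an hour every day, which is why off-nominal frequency was historically such a visible problem.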

As to the condition of the grid, its components are all tuned and impedance-matched to manage the transmission of power at the line frequency. Running the grid off-frequency would knock all the tuning off, and potentially cause unmanageable current and voltage transients and surges in the transformers, lines, and switchgear.

Furthermore, if part of the grid goes off-frequency, another bad effect arises: all of the generators in the grid normally sync themselves together automatically so they are phase-locked. If one part of the grid falls out of phase lock, then power no longer flows simply from all the generators to all the loads; instead, power flows from the in-phase generators into the off-phase generators to "motor" them back into phase. Those currents could be large enough to overload the parts of the grid between the out-of-phase nodes and cause the voltages in the whole grid to fluctuate.
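The power exchanged between two out-of-phase nodes grows quickly with the angle between them. A minimal sketch, using the standard two-bus result $P = \frac{V_1 V_2}{X}\sin\delta$ for a mostly-inductive tie line (the per-unit voltages and reactance below are assumed, illustrative numbers):

```python
import math

# Sketch: real power flowing across an inductive tie line between two
# grid nodes whose voltage phasors differ by angle delta.
# P = (V1 * V2 / X) * sin(delta). All values are assumed per-unit
# examples, not data from any real grid.


def tie_line_power(v1: float, v2: float, x: float, delta_rad: float) -> float:
    """Real power (per unit) flowing from node 1 to node 2."""
    return v1 * v2 / x * math.sin(delta_rad)


X = 0.5  # assumed per-unit line reactance
for delta_deg in (0, 10, 30, 60, 90):
    p = tie_line_power(1.0, 1.0, X, math.radians(delta_deg))
    print(f"delta = {delta_deg:3d} deg -> P = {p:5.2f} pu")
```

Even a modest phase disagreement drives large synchronizing power flows, which is why the lines between out-of-phase regions are the ones at risk of overload.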

There are very complex automatic load-and-generator-capacity management systems in place at all generating nodes, which hold the system in balance and maintain phase lock at the correct line frequency over a broad range of loads and generating capacities - with human override available in case the automation fails.
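The first layer of that balancing act is governor "droop" control: each generator raises its output in proportion to any frequency dip, so the fleet shares the new load automatically. A minimal sketch, assuming a 50 Hz grid, a conventional 5% droop setting, and an example 500 MW unit (all assumed illustrative values):

```python
# Sketch of primary frequency control ("droop"). With a droop of R,
# a frequency deviation of R per unit calls for 100% of rated output,
# so extra output = rated * (delta_f_pu / R). Values are assumed examples.

F_NOMINAL = 50.0  # Hz (assumed: a 50 Hz grid)
DROOP = 0.05      # 5% droop, a conventional governor setting


def droop_response_mw(freq_hz: float, rated_mw: float,
                      nominal_hz: float = F_NOMINAL,
                      droop: float = DROOP) -> float:
    """Extra output a generator contributes for a frequency dip."""
    df_pu = (nominal_hz - freq_hz) / nominal_hz
    return rated_mw * df_pu / droop


# A 0.1 Hz dip (grid at 49.9 Hz) on an example 500 MW unit:
print(droop_response_mw(49.9, 500.0))  # 20.0 MW extra output
```

Because every unit with the same droop setting responds in proportion to its rating, the extra load is shared fairly across the fleet without any unit-to-unit communication; slower "secondary" control then nudges setpoints to bring the frequency back to nominal.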

Increasing the grid's "inertia" means adding inductance to it, so that it strives to maintain constant current at the design frequency during short-duration perturbations. Within the grid, this is done by using large coils or capacitors on the lines, usually near locations where large amounts of power are being consumed, to "trim" it and thereby keep the correct phase relationship between voltage and current. This is known as power-factor adjustment.
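Sizing such a trimming capacitor bank is a standard calculation. A minimal sketch, using the textbook relation $Q_c = P\,(\tan\cos^{-1}\mathrm{pf}_1 - \tan\cos^{-1}\mathrm{pf}_2)$ (the 1000 kW load and the 0.80 → 0.95 power-factor targets are assumed example figures):

```python
import math

# Sketch of power-factor correction sizing: the capacitive reactive
# power needed to raise a load's power factor from pf_old to pf_new.
# Qc = P * (tan(acos(pf_old)) - tan(acos(pf_new))).
# Load and power-factor values are assumed examples.


def correction_kvar(p_kw: float, pf_old: float, pf_new: float) -> float:
    """Capacitive kvar required to correct a p_kw load's power factor."""
    return p_kw * (math.tan(math.acos(pf_old)) - math.tan(math.acos(pf_new)))


# An example 1000 kW load at 0.80 lagging, corrected to 0.95:
print(round(correction_kvar(1000.0, 0.80, 0.95), 1))  # 421.3 kvar
```

Placing that reactive compensation near the large loads, as the answer describes, keeps the voltage and current locally in phase so the lines upstream carry less circulating reactive current.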

niels nielsen