In 2007, UK nuclear power stations produced 164 TWh/y of high-grade heat that was converted to 63 TWh/y of electricity. Of this, 57 TWh/y was delivered to the grid, and 6 TWh/y was used on-site to run the power stations.
There are two possible ways of measuring the energy produced by a nuclear power station, modelled in the chart below. Most of the figures in this document show the electrical energy delivered (the green bars), but sometimes it is conventional to display the primary energy (the blue bars), which is the heat generated by the nuclear processes. The electrical energy is smaller than the primary energy because of the inherent conversion losses and the energy requirement of the power station itself. If the nuclear power stations were located near buildings with heat demand, they could generate combined heat and power: in return for a modest loss in electrical output, much of the ‘waste’ heat could be delivered to the heat-users.
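The relationship between the primary heat, the gross electrical output, and the electricity delivered to the grid can be checked with a short calculation using the 2007 figures above (the efficiency percentages are derived here, not quoted in the text):

```python
# 2007 UK nuclear figures (TWh/y), taken from the text above
primary_heat = 164.0      # high-grade heat from nuclear processes
gross_electricity = 63.0  # electricity generated from that heat
delivered = 57.0          # electricity delivered to the grid
on_site_use = 6.0         # used on-site to run the power stations

# Gross output splits into grid deliveries plus on-site consumption
assert gross_electricity == delivered + on_site_use

thermal_efficiency = gross_electricity / primary_heat  # heat -> electricity
delivered_fraction = delivered / primary_heat          # heat -> grid

print(f"Thermal efficiency:  {thermal_efficiency:.0%}")   # about 38%
print(f"Delivered fraction:  {delivered_fraction:.0%}")   # about 35%
```

This is why the green bars are roughly a third of the height of the blue bars: only about 38% of the primary heat becomes electricity, and about 35% reaches the grid.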
Level 1 assumes that no new nuclear power stations are built. By 2050, the UK’s existing fleet has all been retired.
Level 2 assumes a 4-fold increase in capacity over 2010 levels, reaching 39 GW by 2050, equivalent to building thirteen 3-GW power stations.
Level 3 assumes a 9-fold increase in capacity over 2010 levels by 2050, building the equivalent of thirty 3-GW power stations. The build-rate would be similar to France’s in the 1980s, but would be maintained for longer (Figure 1).
Level 4 assumes a 13-fold increase in capacity over 2010 levels, to 146 GW by 2050, roughly equivalent to fifty 3-GW power stations. These stations would produce just over 1000 TWh/y of electrical output, which is 40% of the total output of all the nuclear power stations operating in the world in 2009.
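The Level 4 figures imply a load factor, which can be checked with a quick calculation (the load factor is our inference from the numbers in the text, not a stated assumption of the scenario):

```python
# Level 4 scenario: 146 GW of capacity producing just over 1000 TWh/y
capacity_gw = 146.0
hours_per_year = 8766      # average year, including leap years
output_twh = 1000.0        # "just over 1000 TWh/y" from the text

# Maximum possible output if every station ran flat out all year
max_output_twh = capacity_gw * hours_per_year / 1000  # GWh -> TWh

load_factor = output_twh / max_output_twh

print(f"Maximum possible output: {max_output_twh:.0f} TWh/y")  # ~1280
print(f"Implied load factor:     {load_factor:.0%}")           # ~78%
```

A load factor around 80% is broadly consistent with how nuclear stations are typically operated, since they run as baseload with outages mainly for refuelling and maintenance.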