Electricity in Great Britain


The National Grid covers most of mainland Great Britain and several of the surrounding islands, and interconnectors link it to Northern Ireland and to other European countries. Power is supplied to consumers at 230 volts AC with a frequency of 50 Hz. As of 2024, wind generated 30% of the grid's annual electrical energy, fossil gas just over 25%, and low-carbon sources more than two-thirds in total. Coal-fired generation ceased in 2024. Nuclear power is currently the second-largest low-carbon source, some of it imported from France. The government is aiming for greenhouse gas emissions from electricity in Britain to be net zero by 2035.
The use of electricity declined in the 2010s and early 2020s, attributed largely to a decline in industrial activity and a switch to more energy-efficient lighting and appliances. However, demand is projected to increase considerably owing to electrification, for example of heating (heat pumps) and transport (electric vehicles).
UK energy policy includes a cap on some residential energy price rates, and the government can stabilise wholesale prices for some new low-carbon generation.
Plans for a publicly owned energy company are underway following the proposed introduction of Great British Energy in the 2024 King's Speech, which also reaffirmed the commitment to net-zero targets by 2050. GB Energy is intended to support this through heavy investment in renewable energy sources, such as tidal power and offshore wind farms.

History

In 2008 nuclear electricity production was 53.2 TW·h, equivalent to 860 kWh per person. In 2014, 28.1 TW·h of energy was generated by wind power, which contributed 9.3% of the UK's electricity requirement. In 2015, 40.4 TW·h of energy was generated by wind power, and the quarterly generation record was set in the three-month period from October to December 2015, with 13% of the nation's electricity demand met by wind. Wind power contributed 15% of UK electricity generation in 2017 and 18.5% in the final quarter of 2017. In 2019, National Grid announced that low-carbon generation technologies had produced more electricity than fossil generators for the first time in Britain.
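The 2008 per-capita figure follows from simple division. As a quick cross-check (assuming a 2008 UK population of roughly 61.4 million, a figure not given in the text):

```python
# Cross-check of the 2008 per-capita figure: 53.2 TW·h of nuclear output
# divided by the population should give roughly 860 kWh per person.
nuclear_output_twh = 53.2       # from the text
population_millions = 61.4      # approximate 2008 UK population (assumption)

kwh_per_person = nuclear_output_twh * 1e9 / (population_millions * 1e6)
print(round(kwh_per_person))    # close to the ~860 kWh quoted above
```

The result lands within a percent of the quoted 860 kWh per person, consistent with the stated output and the population of the time.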

National grid

The first to use Nikola Tesla's three-phase high-voltage electric power distribution in the United Kingdom was Charles Merz, of the Merz & McLellan consulting partnership, at his Neptune Bank Power Station near Newcastle upon Tyne. This opened in 1901, and by 1912 had developed into the largest integrated power system in Europe. The rest of the country, however, continued to use a patchwork of small supply networks.
In 1925, the British government asked Lord Weir, a Glaswegian industrialist, to solve the problem of Britain's inefficient and fragmented electricity supply industry. Weir consulted Merz, and the result was the Electricity Act 1926, which recommended that a "national gridiron" supply system be created.
The 1926 Act created the Central Electricity Board, which set up the UK's first synchronised, nationwide AC grid, running at 132 kV, 50 Hz.
The grid was created with 4,000 miles of cables, mostly overhead lines, linking the 122 most efficient power stations. The first "grid tower" was erected near Edinburgh on 14 July 1928, and work was completed in September 1933, ahead of schedule and on budget. It began operating in 1933 as a series of regional grids with auxiliary interconnections for emergency use. Following the unauthorised but successful short-term parallelling of all the regional grids by night-time engineers on 29 October 1937, the grid was operating as a national system by 1938. By then, the growth in the number of electricity users was the fastest in the world, rising from three-quarters of a million in 1920 to nine million in 1938.
It proved its worth during the Blitz when South Wales provided power to replace lost output from Battersea and Fulham power stations.
The grid was nationalised by the Electricity Act 1947, which also created the British Electricity Authority. In 1949, the British Electricity Authority decided to upgrade the grid by adding 275 kV links.
At its inception in 1950, the 275 kV Transmission System was designed to form part of a national supply system, with an anticipated total demand of 30,000 MW by 1970. This predicted demand was already exceeded by 1960. The rapid load growth led the Central Electricity Generating Board to carry out a study of future transmission needs, completed in September 1960. The study is described in a paper presented to the Institution of Electrical Engineers by Booth, Clark, Egginton and Forrest in 1962.
The study also considered, alongside the increased demand, the effect on the transmission system of rapid advances in generator design, which pointed to projected power stations of 2,000–3,000 MW installed capacity. These new stations were mostly to be sited where advantage could be taken of a surplus of cheap low-grade fuel and adequate supplies of cooling water, but these locations did not coincide with the load centres. West Burton, with 4 × 500 MW machines, sited on the Nottinghamshire coalfield near the River Trent, is a typical example. These developments shifted the emphasis of the transmission system from interconnection to the primary function of bulk power transfer from the generating areas to the load centres, such as the anticipated transfer in 1970 of some 6,000 MW from the Midlands to the Home Counties.
Continued reinforcement and extension of the existing 275 kV systems were examined as possible solutions. However, in addition to the technical problem of very high fault levels, many more lines would have been required to achieve the estimated transfers at 275 kV. As this was not consistent with the CEGB's policy of preserving amenities, a further solution was sought. Both a 400 kV and a 500 kV scheme were considered as alternatives, either of which gave a sufficient margin for future expansion. A 400 kV system was chosen for two main reasons: first, the majority of the 275 kV lines could be uprated to 400 kV; and second, it was envisaged that operation at 400 kV could commence in 1965, compared with 1968 for a 500 kV scheme. Design work was started, and to meet the 1965 timescale, the contract engineering for the first projects had to run concurrently with the design. This included the West Burton 400 kV Indoor Substation, the first section of which was commissioned in June 1965. From 1965, the grid was partly upgraded to 400 kV, beginning with a 150-mile line from Sundon to West Burton, to become the Supergrid.
With the development of the national grid and the switch to using electricity, United Kingdom electricity consumption increased by around 150% between the post-war nationalisation of the industry in 1948 and the mid-1960s. During the 1960s, growth slowed as the market became saturated.
On the breakup of the CEGB in 1990, the ownership and operation of the National Grid in England and Wales passed to National Grid Company plc, later to become National Grid Transco, and now National Grid plc. In Scotland the grid was already split into two separate entities, one for southern and central Scotland and the other for northern Scotland, connected by interconnectors. The first is owned and maintained by SP Energy Networks, a subsidiary of Scottish Power, and the other by SSE. However, National Grid plc remained the System Operator for the whole British Grid until the creation of the National Energy System Operator on 1 October 2024.

Generation

The mode of generation has changed over the years.
During the 1940s some 90% of the generating capacity was fired by coal, with oil providing most of the remainder.
The United Kingdom started to develop a nuclear generating capacity in the 1950s, with Calder Hall being connected to the grid on 27 August 1956. Though the production of weapons-grade plutonium was the main reason behind this power station, other civil stations followed, and 26% of the nation's electricity was generated from nuclear power at its peak in 1997.
During the 1960s and 70s, coal-fired plants were built to meet growing demand, despite economic challenges. During the 1970s and 80s, further nuclear stations were built. From the 1990s, gas power plants benefited from the Dash for Gas, supplied by North Sea gas. After the 2000s, renewables such as solar and wind added significant capacity. In Q3 2016, nuclear and renewables each supplied a quarter of British electricity, with coal supplying 3.6%.
Despite the flow of North Sea oil from the mid-1970s, oil fuelled generation remained relatively small and continued to decline.
Starting in 1993, and continuing through the 1990s, a combination of factors led to the so-called Dash for Gas, during which the use of coal was scaled back in favour of gas-fuelled generation. This was sparked by political concerns; the privatisation of the National Coal Board, British Gas and the Central Electricity Generating Board; the introduction of laws facilitating competition within the energy markets; the availability of cheap gas from the North Sea and elsewhere; and the high efficiency and reduced pollution of combined-cycle gas turbine generation. In 1990, just 1.09% of all gas consumed in the country was used in electricity generation; by 2004 the figure was 30.25%.
By 2004, coal use in power stations had fallen to 50.5 million tonnes, representing 82.4% of all coal used that year, though up slightly from its 1999 low. On several occasions in May 2016, Britain burned no coal for electricity for the first time since 1882. On 21 April 2017, Britain went a full day without using coal power for the first time since the Industrial Revolution, according to National Grid. The last remaining coal power station in the United Kingdom ceased operating on 30 September 2024.
From the mid-1990s, new renewable energy sources began to contribute to the electricity generated, adding to a small existing hydroelectric generating capacity.