Quantum computers can accelerate the transition to net zero power grids
News Team
Power grid operators, such as National Grid in the UK, rely on high-performance computers to plan grid expansions and to schedule when energy should be produced from different sources. These problems are becoming larger and more complex due to the transition to net zero carbon emissions, and they are now reaching the limits of even the world’s largest supercomputers.
Quantum computing opens up a new avenue for progress. My research group at the University of Oxford investigates how quantum computing can offer value for the net zero transition. My colleague Xiangyue Wang and I recently published a paper in the journal Joule that identifies promising opportunities for quantum computing to help optimize the planning and operation of net zero power grids.
Over the next five years, National Grid plans to spend £30 billion on updating power grid infrastructure as part of the UK’s transition to a decarbonized grid. Large investments are also planned for low-carbon technologies, including wind, solar, nuclear and batteries. Additionally, millions of electric vehicles (EVs) and heat pumps will be added to local distribution networks to decarbonize transport and heating.
Planning decisions, including where to build renewables, when to upgrade electrical transmission lines, and how to roll out EV chargers, will directly affect how high our energy bills are, how often people experience power cuts and how quickly the UK can achieve its net zero targets. With billions being invested in the grid, it is crucial that grid planners understand how to spend this money wisely.
In addition to grid planning, operating a net zero grid is also a challenging optimization problem because grid power flows must match demand while remaining within safe limits at all times. Otherwise, the grid risks power outages. This is becoming more difficult because of the variability and uncertainty of wind and solar generation.
Another challenge is the electrification of transport and heating, which concentrates demand when people arrive home from work. One solution is to adjust when EVs are charged and when heat pumps are run. Small shifts in usage, added up across millions of homes, can be equivalent to the output of large power plants. However, this significantly increases the number of devices on the grid that need to be scheduled, making the optimization much harder.
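The arithmetic behind that claim is simple to sketch. The figures below are illustrative assumptions, not National Grid data: they just show how half a kilowatt of flexibility per home scales to power-plant levels.

```python
# Toy illustration (hypothetical numbers): small per-home demand shifts,
# aggregated across millions of homes, reach the scale of a large power plant.

homes = 5_000_000          # assumed number of participating homes
shift_per_home_kw = 0.5    # assumed average demand shifted per home (kW)

total_shift_gw = homes * shift_per_home_kw / 1_000_000  # kW -> GW
print(f"Aggregate flexibility: {total_shift_gw:.1f} GW")
# -> Aggregate flexibility: 2.5 GW (comparable to a large power station)
```

With these assumed numbers, the aggregate is larger than most individual power stations, which is why coordinated scheduling of EVs and heat pumps is so valuable, and so computationally demanding.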
The race to innovate
In 2019, Google demonstrated quantum supremacy—solving a problem that no classical computer could solve in any feasible amount of time—by completing a physics simulation problem in 200 seconds. That same problem would have taken an equivalent classical supercomputer 10,000 years to solve using the best algorithm known at the time. This kicked off an ongoing race between researchers working on expanding the limits of both classical and quantum computing. Quantum computers are now reaching the scale and maturity where they can offer tangible value for industries including pharmaceuticals and finance.
Classical computers store information in strings of bits, where each bit has a value of 0 or 1. Logical operations on bits are used for computation. Within a quantum computer, the basic unit of information is instead the quantum bit or “qubit.” Qubits can be built in a variety of ways, for example using superconducting circuits or atoms trapped by lasers.
When measured, a qubit reads as a 0 or 1, just like a classical bit. Within a quantum computer, however, qubits can be controlled using the principles of quantum physics, the laws governing the behavior of subatomic particles. This lets quantum computers represent large amounts of classical information with only a few qubits and perform specific types of calculations that are practically impossible for classical computers.
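The two points above can be sketched in a few lines of plain Python. This is a conceptual toy, not a quantum simulator: it shows that describing n qubits takes 2**n amplitudes, and that measuring a qubit in an equal superposition yields 0 or 1 with equal probability.

```python
import random

# A quantum state over n qubits is described by 2**n complex amplitudes,
# so even a modest number of qubits spans a huge state space.
n_qubits = 20
n_amplitudes = 2 ** n_qubits
print(n_amplitudes)  # 1048576 amplitudes for just 20 qubits

# Measuring one qubit in the superposition (|0> + |1>) / sqrt(2):
# each outcome occurs with probability |amplitude|**2 = 0.5.
amp0 = amp1 = 1 / 2 ** 0.5
p0 = abs(amp0) ** 2
outcome = 0 if random.random() < p0 else 1
print(outcome)  # reads out as 0 or 1, like a classical bit
```

Doubling the qubit count squares the number of amplitudes, which is the exponential scaling behind the claim that a few qubits can encode a large amount of classical information.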
Researchers describe quantum computing as being in the noisy intermediate-scale quantum (NISQ) era. Large, general-purpose quantum computers are expected to remain out of reach for at least a decade. However, NISQ devices already show promise for combinatorial grid optimization problems. These are problems with interlinked yes-or-no decisions that create an exponentially large set of possibilities, such as deciding where to build new generators, which transmission lines to upgrade and which specific power plants to start up or shut down.
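A miniature version of one such yes-or-no problem makes the exponential blow-up concrete. The plants, capacities and costs below are invented for illustration; the brute-force search works here because 4 plants give only 2**4 = 16 combinations, but every extra plant doubles that count, which is what makes real unit-commitment problems so hard.

```python
from itertools import product

# Toy "which plants to switch on" problem (hypothetical data): cover demand
# at the lowest cost. Each plant is a yes/no decision, so n plants mean
# 2**n combinations to check.
plants = {            # name: (capacity in MW, running cost)
    "coal":    (600, 900),
    "gas":     (400, 500),
    "wind":    (300, 100),
    "battery": (200, 150),
}
demand_mw = 700

best_cost, best_choice = None, None
for switched_on in product([0, 1], repeat=len(plants)):
    names = [name for name, on in zip(plants, switched_on) if on]
    capacity = sum(plants[name][0] for name in names)
    cost = sum(plants[name][1] for name in names)
    if capacity >= demand_mw and (best_cost is None or cost < best_cost):
        best_cost, best_choice = cost, names

print(best_choice, best_cost)  # -> ['gas', 'wind'] 600
```

Real grid problems involve hundreds of plants plus network constraints, so exhaustive search is hopeless; this exponential structure is exactly what makes them candidates for quantum optimization.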
There is also a wider set of opportunities where quantum computing is underexplored. Quantum computing could speed up the simulation and optimization of grid power flows. It could also accelerate machine learning—the use of algorithms that improve their performance when exposed to data—which could help grid operators make use of high-volume smart meter data to improve forecasting, scheduling and planning. A promising approach with small NISQ devices is to pair them with large classical computers, using the quantum hardware to accelerate the specific parts of complex algorithms that are best suited to quantum computation.
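The shape of such a hybrid algorithm can be sketched as a loop: a classical optimizer repeatedly adjusts parameters, while an inner subroutine evaluates a cost that would, on real hardware, be estimated by a quantum device. Everything below is simulated classically with a made-up cost function; it only illustrates the division of labor.

```python
import math

def quantum_subroutine(theta):
    # Stand-in for a parameterized quantum circuit's measured cost;
    # on a NISQ device this evaluation is the "quantum" step.
    return 1 - math.cos(theta)   # minimum at theta = 0

# Classical outer loop: gradient descent on the parameter theta,
# with the gradient estimated from two "quantum" evaluations.
theta, step, eps = 2.0, 0.1, 1e-4
for _ in range(200):
    grad = (quantum_subroutine(theta + eps) - quantum_subroutine(theta - eps)) / (2 * eps)
    theta -= step * grad         # classical parameter update

final_cost = quantum_subroutine(theta)
print(round(final_cost, 4))  # close to 0: the hybrid loop found the minimum
```

The expensive inner evaluation is the only part handed to the quantum device; the bookkeeping and parameter updates stay on the classical machine, which is how small NISQ devices can contribute to much larger computations.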
Despite the early stage of power grid quantum computing research, there are already industry initiatives underway to develop quantum algorithms that could enable grid expansion and intelligent scheduling of EV charging.
Given the goal of decarbonization, the energy needed for quantum computers is a potential concern, particularly the energy for cooling, since quantum computers often require temperatures close to absolute zero (-273.15°C) for reliable operation. However, research indicates that when a quantum computer can solve a problem using many fewer operations than a classical computer, this can also save energy. For example, Google’s quantum supremacy demonstration not only massively increased the speed of computation, but also reduced energy use by a factor of 557,000.