Adding Intelligence to the Grid
Power grids around the world are facing similar challenges. One of the biggest is the rise in renewable energy generation of all kinds; solar and wind energy are great for the planet, but they are as unpredictable as the weather. Schemes designed to encourage consumers to put solar panels on roofs and use electric vehicles to store energy mean the grid is morphing from one-directional to bi-directional. And instead of demand prediction, utilities now need to predict both supply and demand in real time, at very fine levels of granularity.
“The ability to add AI into the mix and do real time analytics at the edge is going to be critical for increasing the amount of distributed energy resources that can come online,” Marc Spieler, global business development and technology executive for the energy sector at Nvidia, told EE Times.
Spieler pointed out that great work is being done in wind, solar and electric vehicles, but if the grid doesn’t have the ability to support these applications, the effort is wasted.
Demand prediction draws on many complicated factors. Aside from the weather, real-time prediction might involve complex tasks such as predicting how many electric trucks will arrive at which filling station, and at what exact moment they will need battery charging.
“It’s going to come down to hour by hour, minute by minute type decisions,” he said. “And AI is the only thing that’s going to allow that to become efficient.”
Utilities typically subscribe to detailed weather prediction services today, feeding this data into complex models to try to predict energy demand.
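The core of such a weather-driven demand model can be sketched in a few lines. The figures and the quadratic form below are purely illustrative; production models ingest many more features (humidity, day of week, holidays) and use far more sophisticated techniques.

```python
import numpy as np

# Hypothetical hourly observations: temperature (°F) and metered load (MW).
temps = np.array([95, 88, 72, 60, 45, 30, 20], dtype=float)
load_mw = np.array([980, 870, 640, 600, 690, 820, 930], dtype=float)

# Load is roughly U-shaped in temperature (air conditioning at high temps,
# heating at low), so fit a quadratic: load ≈ a*T² + b*T + c.
X = np.column_stack([temps**2, temps, np.ones_like(temps)])
coeffs, *_ = np.linalg.lstsq(X, load_mw, rcond=None)

def predict_load(temp_f: float) -> float:
    """Predict system load (MW) from a temperature forecast."""
    a, b, c = coeffs
    return a * temp_f**2 + b * temp_f + c

forecast = predict_load(75.0)
```

The U-shape is why raw temperature alone is a poor linear predictor: demand rises at both extremes, which a straight line cannot capture.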
“The people doing this best are probably the financial services companies, the hedge funds that are buying and selling power,” Spieler said. “Those guys are making huge investments in AI and they’re capitalizing on the profit.”
However, Spieler said, utilities are upping their game.
“We are seeing a ramp up of data science in the utility,” he said. “Some of the [utilities] we’re working with are ramping up their data science communities. We’re starting to sell hardware DGX systems [Nvidia data center-class AI accelerators] into utilities for the first time.”
Among the techniques available to utilities for AI at scale is federated learning — a technique in which a central model may be trained using data from multiple sources, without the data having to be centralized or shared. This is often used in the healthcare sector for medical AI models, since having access to more training data can make the models more accurate, but the data cannot leave hospital premises. In essence, local versions of the model are re-trained at the edge, then updates to the model parameters are centralized to make the overall model better. Nvidia has a platform for federated learning called Clara.
Spieler pointed out that large-scale demand and supply prediction models for the electricity grid would be an interesting use case for Clara.
“[Utilities] can’t share their data, but they aren’t exactly competitors either, since there’s only one set of power lines going to your house,” he said. “We believe we can use federated learning to get the entire industry working together by training their models and sharing the model weights with larger organizations that can consolidate those.”
This could enable more accurate models of the grid’s response to unusual weather conditions – for example, a model deployed in a desert state could be trained partly with data from states further north, where events such as extreme cold are far more common.
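The federated averaging pattern described above can be illustrated with a minimal sketch. This is not Nvidia Clara's API; it is a toy example in which each "utility" fits a local model on its own private data and only the learned weights ever leave the premises, to be averaged by a coordinator.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # underlying relationship all utilities share

def local_weights(n_samples: int) -> np.ndarray:
    """Train locally: least-squares fit on this utility's private data."""
    X = rng.normal(size=(n_samples, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n_samples)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w  # only these parameters are shared, never the raw data

# Each participant sends its weights to the coordinator...
updates = [local_weights(n) for n in (50, 80, 120)]
# ...which averages them (real FedAvg weights by sample count).
global_w = np.mean(updates, axis=0)
```

The consolidated model benefits from all three datasets even though no raw measurements were ever pooled, which is exactly the property that matters when utilities cannot share customer data.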
The grid of the future will also make use of AI at the edge.
The “smart meter” of ten years ago will get smarter. Its use case has shifted from replacing human meter readers to using AI to predict consumer demand, as well as supply from solar panels and electric vehicles.
According to Spieler, today’s smart meters use very little of the data they have access to. A typical meter might have eight channels of data available, while downstream devices such as smart thermostats might be collecting as many as 20 or 30 channels of data.
“Every smart meter today has a chip in it – the question is will it be powerful enough to process the amount of data?” he said. “We envision the smart meter could become like an iPhone – it captures a ton of data, then utilities, consumers and others can apply applications on top of that, to optimize energy efficiency.”
If a substation goes down, smart meters could provide the necessary data to create a microgrid in a particular neighborhood, sharing energy from solar or electric vehicle batteries between neighbors. In the event of extreme weather, AI-enabled smart meters could also potentially be used to switch off power to non-essential loads as a kind of smart load-shedding scheme. (Spieler’s example here is that during the recent Texas power grid crisis, power to Houston’s pool pumps could have been shut off to preserve supply to homes that were running life-preserving medical equipment.)
“[We could] take energy consumption off the grid in a surgical way,” he said. “Today it’s done purely by turning [whole cities] on and off, but in the future you could make a decision based on, say, the temperature of somebody’s home.”
During a cold snap, smart meters could reveal which homes are at or below 40 degrees Fahrenheit, and the grid could then prioritize their power supply so their pipes don’t freeze.
“AI is going to provide this level of visibility,” he said. “The data exists, we’re seeing that with Nest thermostats and other things behind the meter. But that data doesn’t get back to the utilities to make decisions in a better way.”
Grid AI algorithms
Veritone is one of several companies developing AI solutions for grid management. The company’s CDI (Cooperative Distributed Inferencing) technology is designed to ensure predictable energy distribution and resilience across the grid. The system uses forecast data and rules to build and continually update device state models, which are then used to intelligently control edge devices.
“No human could properly control the grid,” Sean McEvoy, senior vice president for energy at Veritone, told EE Times. Analysis of massive volumes of data is required to continuously monitor the state of every energy-generating and -storing device on the grid, as well as energy demand, weather patterns, transmission flow and energy prices, he said.
Only AI is capable of doing all this.
“Continuous real-time modelling of this tsunami of data delivers the intelligence to constantly know how much energy every grid participant needs, and how much energy can be delivered, at any given moment or in the near future,” McEvoy said. “Not only can no human do this – even massive computing power alone is not enough. It demands edge compute power combined with intelligent reinforcement and adaptation learning.”
Reinforcement learning is a technique in which an AI agent (an AI algorithm that takes some kind of action) is trained to maximize some notion of reward. This enables the agent to learn from the consequences of its actions rather than being explicitly taught what to do. And the adaptive nature of Veritone’s algorithm means it constantly updates its model of the system as conditions change in real time.
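The learn-from-reward idea can be made concrete with a toy tabular Q-learning sketch. This is not Veritone's algorithm: it is a deliberately tiny battery-dispatch example in which the agent is never told "charge when cheap, discharge at peak" but discovers that policy from the reward signal alone. All prices and rewards are made up for illustration.

```python
import random

random.seed(0)
ACTIONS = ["charge", "discharge"]
q = {}  # Q-table: ((battery_level, price_period), action) -> learned value

def step(level, price, action):
    """Toy environment: pay to charge, earn by discharging, prices vary."""
    if action == "charge" and level == 0:
        reward = -1 if price == "cheap" else -3   # pay the going rate
        level = 1
    elif action == "discharge" and level == 1:
        reward = 3 if price == "peak" else 1      # earn the going rate
        level = 0
    else:
        reward = 0  # can't charge a full / discharge an empty battery
    next_price = random.choice(["cheap", "peak"])
    return (level, next_price), reward

alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration
state = (0, "cheap")
for _ in range(5000):
    # Epsilon-greedy: mostly exploit the best known action, sometimes explore.
    if random.random() < eps:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: q.get((state, a), 0.0))
    next_state, reward = step(state[0], state[1], action)
    best_next = max(q.get((next_state, a), 0.0) for a in ACTIONS)
    old = q.get((state, action), 0.0)
    # Standard Q-learning update toward reward plus discounted future value.
    q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
    state = next_state

# Extract the learned policy for each state.
policy = {s: max(ACTIONS, key=lambda a: q.get((s, a), 0.0))
          for s in [(0, "cheap"), (0, "peak"), (1, "cheap"), (1, "peak")]}
```

After training, the agent charges when prices are cheap and discharges at peak, having inferred the arbitrage strategy purely from accumulated reward.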
McEvoy explained further that Veritone’s AI model creation for the grid uses a “distributed, constrained Hamiltonian” approach. Distributed in this context means that inference is done at the edge to reduce latency. The models can be constrained by rules, such as device warranties, or rules handed down by the North American Electric Reliability Corporation (NERC) or the Federal Energy Regulatory Commission (FERC). Mean field algorithms are used for model synchronization and linear algebraic models are used for forecasting. Veritone’s simulator uses Monte Carlo techniques to model the probability of different outcomes.
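Monte Carlo outcome modelling of the kind mentioned above can be sketched very simply: draw many random scenarios from the uncertain inputs and count how often an outcome of interest occurs. The scenario below (a site asking whether solar plus battery will cover demand) and all its numbers are invented for illustration, not taken from Veritone's simulator.

```python
import random

random.seed(42)

def one_day() -> bool:
    """Simulate one day; return True if the site avoids importing power."""
    cloud = random.random()                 # 0 = clear skies, 1 = overcast
    solar_mwh = 40.0 * (1.0 - 0.7 * cloud)  # generation falls with cloud cover
    demand_mwh = random.gauss(30.0, 5.0)    # uncertain daily demand
    battery_mwh = 8.0                       # fixed storage buffer
    return solar_mwh + battery_mwh >= demand_mwh

# Probability estimate = fraction of simulated days with a favorable outcome.
trials = 100_000
p_self_sufficient = sum(one_day() for _ in range(trials)) / trials
```

The appeal of the technique is that the simulated system can be arbitrarily complicated; as long as each scenario can be evaluated, the probability of any outcome falls out of simple counting.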
An overview of Veritone’s AI solution for the electricity grid is shown below. Information about domain rules is converted into parameters by the rule translator. The tomograph learns and updates a model of the device under control, the optimizer continuously creates a policy that satisfies operational and behavioral goals, and the mean field synchronizes CDI agents by projecting states of the entire network onto local agents. The edge controller controls the edge device, the blackboard provides a flow of information, and the forecaster uses past and current sensor data to forecast future energy factors, including demand.
Veritone targets utilities and independent power producers (IPPs) such as microgrid developers and operators, as well as working with equipment providers to develop and deploy AI models and predictive controllers for resources such as solar inverters and battery systems.
“The software can be deployed centrally, at substations, data centers, or at the edge to control edge devices and provide edge inferencing,” McEvoy said. “Real-time synchronization of multiple grid edge devices provides a holistic model view of the plant state and capacity.”
On what scale will these AI technologies be rolled out, and does it make sense to start relatively small, perhaps with microgrids? Or are there downsides to starting out piecemeal?
“Generally, Veritone recommends starting piecemeal by controlling a single site’s energy resources,” McEvoy said.
Rolling out AI technology to smaller sites first – perhaps 25-50 MW of renewable energy generators and storage – can provide confidence to grid operators.
“Once that site is optimized with AI, then it can expand out to multiple sites and the AI can synchronize at the site level,” he said. “The technology can manage and control nanogrids, microgrids, and the broader utility grid, but complexity grows exponentially as you scale up.”