When the power goes out, electricity providers are often left in the dark along with their customers. That status quo is what’s keeping Naresh Acharya up at night. He is now planning to use some of the world’s most powerful supercomputers to help keep the lights on, while also allowing wind farms to produce more electricity and making the electrical grid more efficient.
“Right now, the power grid isn’t transparent,” Acharya says. “Grid operators don’t always see when something happens. We want to help them maximize the use of their assets in real time.”
Acharya works as a senior engineer at GE’s global research labs in upstate New York. He says that in order to keep their systems safe, grid operators sit down every few months to figure out the maximum amount of power that can run safely through their systems in the worst conditions.
“These analytical tools have been in place for decades and they are very rigid,” Acharya says. “The worst-case scenario may apply to just a few days during a heat wave or a winter storm. This type of thinking is leading us to overdesign and overbuild the grid. With real-time knowledge, we could be getting much more out of our assets without building out a new grid.”
Acharya’s team at GE Global Research is now working with GE Energy Consulting, GE Digital Energy, the Pacific Northwest National Laboratory and Southern California Edison on a software system that could simulate and control the grid in real time.
Many of the tools utilities currently use to manage the grid were designed for computers with a single processing core, like traditional PCs. As a result, they cannot take advantage of today's high-performance computers with multiple cores. “Utilities can monitor the health of the power grid, but the problem is that anything can go wrong at any time,” Acharya says. “Today, we can’t quickly figure out the best actions to take.”
The team is building grid analytics tools for powerful multi-core computers, like the machines at the national laboratory, that can carry out many tasks at once. This method, called parallel processing, allows the team to screen data coming over the Industrial Internet from sensors, generators and other equipment distributed along the hundreds of miles of high-voltage wires that make up the grid. The software is able to extract from the data deluge a few dozen key signals that have the biggest impact on the stability of the grid. “It tells us where we might have a weak spot,” Acharya says.
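The screening idea described above can be sketched roughly in code. This is a toy illustration, not GE's actual software: the sensor streams are simulated, and scoring a signal by the variance of its readings is a stand-in for the far more sophisticated stability analysis a real grid tool would perform. What it does show is the parallel-processing pattern, with each processor core scoring a different stream at the same time.

```python
from multiprocessing import Pool
import random

def stability_impact(sensor):
    # Hypothetical scoring: rank a sensor stream by how much its
    # readings swing, as a crude proxy for grid-stability impact.
    sid, readings = sensor
    mean = sum(readings) / len(readings)
    variance = sum((r - mean) ** 2 for r in readings) / len(readings)
    return sid, variance

def screen_sensors(streams, top_n=5):
    # Parallel screening: worker processes score the streams
    # concurrently, then we keep the few signals with the
    # biggest impact.
    with Pool() as pool:
        scored = pool.map(stability_impact, streams)
    scored.sort(key=lambda s: s[1], reverse=True)
    return [sid for sid, _ in scored[:top_n]]

if __name__ == "__main__":
    random.seed(42)
    # Simulated frequency readings from hundreds of grid sensors.
    streams = [(f"sensor-{i}",
                [random.gauss(60.0, 0.01 * (i % 7)) for _ in range(100)])
               for i in range(200)]
    print(screen_sensors(streams, top_n=5))
```

Because each stream is scored independently, the work splits cleanly across cores, which is exactly why this kind of analysis benefits from the multi-core machines at the national laboratory.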
Grid operators will be able to use the system to quickly answer questions such as which generators should increase or decrease output, and how much electricity should optimally be flowing through the grid at any given moment.
The team has already applied parallel processing to existing GE power management software developed by GE Energy Consulting, speeding it up. The scientists will now use those findings to develop new software designed specifically for parallel processing.
The long-term goal is to help utilities maximize the use of their systems. That, in turn, could increase the amount of renewable power flowing through the grid.
Some wind farms, for example, turn the blades of their turbines out of the wind when the grid cannot take any more electricity. “The new system will help utilities to predict outages and fix equipment before it breaks down,” Acharya says. “But it will also help them bundle in more renewable power from wind and solar farms without building new grids, which is becoming harder to do.”