This post originally appeared on LinkedIn.
The power industry is sitting on a goldmine of data.
Data can unlock valuable insights into how to further improve grid operations, whether through more efficient delivery of energy from generation to the end customer, better planning of capital investment, or even near real-time guidance on how to triage and navigate restoration activities during severe weather events. Right now, however, we as an industry are leveraging only a small fraction of our operational data: it is generated for a specific use, then relegated to a box in a remote office, rarely to be seen again. It’s not that the data is unusable, but unlocking its potential and driving actionable insights requires very specific expertise spanning both domain knowledge and software.
Enter GE’s grid analytics portfolio – released in general availability in June at our Americas User Conference and engineered to make powerful analytics accessible to all. Layering our analytics on top of a utility’s current operational software helps move the needle on hyper-critical goals for electric utilities, shifting them from a reactive state to a proactive one. Outcomes generally include less customer downtime through outage prevention and reduced time to restore when outages do occur. This improves customer satisfaction and gives our utility customers greater control over operating costs.
What’s especially exciting to me is the combination of the agility of the analytics and the additive nature of each incremental analytic. Rather than asking our customers to make a huge up-front investment in a data platform, we designed the analytics with speed to value in mind. At the same time, we built every analytic on the same production pipelines and infrastructure strategy, so a customer can build out their data platform incrementally as they add new analytics.