Data Analytics: Perspectives from the Utility Industry

Energy industry leaders on the role of data in developing the potential of smart grid capabilities - by Bart King, Cleantech Communications

What puts the “smart” in smart grid? It isn’t the digitization of hardware that routes electricity from the generating source to the porch light bulb, and it isn’t the machine-to-machine communications that allow these components to send vast amounts of information to data warehouses. Rather, it is the analysis of that information resulting in better-informed operational and business decisions.

The Electric Power Research Institute (EPRI) predicts smart grid investments of $338 billion to $476 billion could yield $2 trillion in benefits by 2030. Deployment of advanced metering infrastructure (AMI) and meter data management systems (MDMS) is underway across the U.S., but the industry has only scratched the surface of capabilities for load adjustment, demand response (DR) and price signaling. 

Data analytics is a critical piece for completing the promise of the smart grid and delivering greater energy efficiency, grid resilience and customer satisfaction in the years ahead. It’s a process that requires cooperation among experts in operations, business and IT in order to ask the right questions, calculate the answers, and implement optimized systems and procedures.
I recently caught up with four industry leaders to gather their thoughts on these and other issues.
 
What have been the biggest challenges and/or benefits of employing data analytics in the utility industry thus far?
 
Glenn Steiger, General Manager, Glendale Water & Power: The challenges for GW&P have been deploying a fairly large system in two years, transitioning to full operation, and integrating our electric and water systems for the first time. This integration has also been the biggest benefit. Roughly 30 percent of our operating cost on the water side is associated with electricity costs for pumping. So we’re taking real-time data from the electric side to determine the least expensive time to pump the water, which directly benefits our bottom line. Data analytics have also paid big dividends in leak detection, allowing us to find and repair water leaks far sooner than we previously could.
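The scheduling idea Steiger describes can be sketched in a few lines: given hourly electricity prices, pump during the cheapest hours that still meet the day's water target. This is a hypothetical illustration, not GWP's actual system; the function name, pump rate, and price figures are all invented for the example.

```python
# Illustrative sketch (not GWP's system): pick the cheapest hours to run a
# pump so that a daily water-volume target is met. Numbers are hypothetical.

def cheapest_pump_schedule(hourly_prices, gallons_per_hour, daily_target_gallons):
    """Return the hours (0-23) in which to pump, greedily choosing
    the lowest-price hours until the target volume is covered."""
    hours_needed = -(-daily_target_gallons // gallons_per_hour)  # ceiling division
    by_price = sorted(range(len(hourly_prices)), key=lambda h: hourly_prices[h])
    return sorted(by_price[:hours_needed])

# Example: pump 240,000 gallons at 40,000 gal/hour -> needs the 6 cheapest hours
prices = [30, 28, 25, 24, 24, 26, 35, 50, 60, 65, 70, 72,
          75, 74, 70, 68, 66, 80, 85, 78, 60, 45, 38, 32]  # $/MWh by hour
print(cheapest_pump_schedule(prices, 40_000, 240_000))  # -> [0, 1, 2, 3, 4, 5]
```

A real scheduler would also respect reservoir levels, pump-cycling limits, and demand forecasts, but the core trade-off is exactly this price-sorted selection.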
 
Tor Garman, Project Manager, Demand Response and Distributed Energy Resources, NV Energy: Data analytics are critical for the operational challenges of making DR reliable and economical. At NV Energy, analyzing historical data allows us to accurately predict how much DR we can produce at a given location and temperature. During a DR curtailment, we have near real-time analytics that enable us to monitor how much DR we’re actually getting. In the near future, everyone’s going to have an interval meter, and we will use data analytics to pay customer incentives based on their individual household data.
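Garman's prediction step can be illustrated with the simplest possible model: a least-squares line fit relating outdoor temperature to DR delivered in past curtailment events. The data, function name, and figures below are invented for illustration; NV Energy's actual models are not described in the article.

```python
# Hypothetical sketch: predict demand-response (DR) yield from temperature
# by fitting an ordinary-least-squares line to historical curtailment events.

def fit_line(xs, ys):
    """Ordinary least squares: return (slope, intercept) for y = m*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    m = cov / var
    return m, mean_y - m * mean_x

# Invented history: outdoor temperature (deg F) vs. DR delivered (MW)
temps = [95, 98, 100, 103, 105, 108]
dr_mw = [40, 46, 50, 56, 60, 66]
m, b = fit_line(temps, dr_mw)
print(f"Predicted DR at 102 F: {m * 102 + b:.1f} MW")  # -> 54.0 MW
```

In practice a utility would fit per-location models with many more covariates (day of week, humidity, event duration), but the principle, historical events in, expected megawatts out, is the same.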
 
Brian Rich, Senior Director of Customer Care and Demand Side Technologies, PG&E: For the first time ever, the utility industry can give customers feedback about their energy usage—not at the end of 30 days—but in near real-time. We have an opportunity to take complex data and present it to customers in a clear, coherent fashion to help them make better decisions at home or at work. For PG&E residential customers, we’re finding the more we put energy information in terms of dollars and cents, as opposed to kilowatt hours (kWh), the more engaged customers become in taking control over their energy usage. With our large commercial customers, data analytics allow us to have very sophisticated conversations around how they operate building management systems, what types of rate plans they qualify for, and what types of programs they can enroll in to save energy and money. The response has been phenomenal.
 
Mitch Wondolowski, CEO, Grid Solutions: From our point of view at Grid Solutions, proprietary platforms for business intelligence are the challenge. The telecom industry made this mistake early on—boxing themselves in with the offerings of a particular vendor. After about a decade, the industry switched to primarily open source systems. Open source would reduce the entry cost for utilities to get into data analytics, and that’s especially important for the budgets of co-ops and munis. They would only have to purchase the applications they need right now and could add pieces later without buying a whole package.
 
To find out more about data analytics in the utility industry, download the full report here.
 
Glenn, Tor, Brian and Mitch will also be speaking at the Data Management and Analytics for Utilities Conference, taking place on June 27-28 in San Francisco.
For more information go to: