Collaboration between utilities and third-party software developers will be crucial to the implementation of the smart grid, bringing innovation to large energy suppliers and a reliable customer base to data firms.
Large utility companies need to embrace “big data analytics” to better understand the “health of our assets,” said Karen Austin, senior vice president and chief information officer at Pacific Gas and Electric.
“We need thoughtful data architecture that minimises the time and processing power that it takes to get through all of that data. We need better hardware planning and innovative data models.
“We need database analytics to provide new tools for our associates. Spreadsheets are now like tea cups in the big data world; if I’m going out for a swim in the big ocean I don’t want the tea cup anymore – I need scuba diving equipment. These are new types of tools that need to be developed so we can easily mine that data… to keep the lights on and the gas flowing for our customers.”
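As a purely illustrative sketch of the kind of “scuba diving equipment” Austin has in mind, the snippet below makes a single streaming pass over a large file of hypothetical smart-meter interval readings rather than opening it whole in a spreadsheet; the file name and column layout are assumptions, not PG&E’s.

```python
# Illustrative sketch only: one streaming pass over a large CSV of hypothetical
# smart-meter interval readings ("meter_id,timestamp,kwh"), the kind of file a
# spreadsheet cannot open in one piece. File and column names are assumptions.
import csv
from collections import defaultdict

def daily_totals(path):
    """Stream the file row by row and accumulate kWh per meter per day."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            day = row["timestamp"][:10]          # assumes ISO-style timestamps
            totals[(row["meter_id"], day)] += float(row["kwh"])
    return totals

if __name__ == "__main__":
    for (meter, day), kwh in sorted(daily_totals("interval_reads.csv").items())[:5]:
        print(meter, day, round(kwh, 2))
```

The point is the pattern – one pass that never holds the raw readings in memory – rather than any particular tool.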
Austin told Greentech Media’s Soft Grid conference in San Francisco this week that “big data” technologies could improve reliability and prioritize power restoration after natural disasters such as the Loma Prieta earthquake in 1989.
“There were slower methods [then] and it took a lot longer to get things back to normal. You can imagine the damage that it does to lives, homes and businesses.
“Hundreds of thousands of people without power, including government, hospitals, schools. There were millions of customers looking to PG&E to restore order amid that chaos,” Austin said.
“Big data and the tools and analytics [are] key to improving the reliability of the grid.”
For much more detail on everything from “the new plastic” to data architecture, Hadoop and recruiting the brightest technologists, read on.
PG&E yesterday urged customers to conserve electricity because of a moderate inland heat wave in the southern part of the state.
The California ISO (CAISO) issued the conservation alert in response to forecasts of a 47,500 MW peak load, driven by air-conditioning demand during the hot weather.
PG&E also activated two demand response programs for more than 2,000 business locations and one program for residential customers, reducing demand by approximately 60 MW, according to a company statement.
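To put those reported figures in rough context, the back-of-the-envelope arithmetic below uses only the numbers quoted above; because the 60 MW covers both the business and residential programs, the per-site figure is only an upper bound.

```python
# Back-of-the-envelope arithmetic using only the figures reported above.
peak_forecast_mw = 47_500      # CAISO peak-load forecast
dr_reduction_mw = 60           # reported demand-response reduction
business_sites = 2_000         # "more than 2,000 business locations"

# The 60 MW also includes the residential program, so per-site is an upper bound.
print(f"Share of forecast peak: {dr_reduction_mw / peak_forecast_mw:.2%}")                    # ~0.13%
print(f"Per business site (upper bound): {dr_reduction_mw / business_sites * 1000:.0f} kW")  # ~30 kW
```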
David Leeds, chief analyst at Greentech Media, said that, now that utility companies had made smart grid infrastructure investments at the generation and customer levels, open-source platforms such as Hadoop would be vitally important in developing applications.
“Smart grid, much like an iPhone, is a platform creating new possibilities that weren’t available to us 6-12 months ago. We’re really at this new threshold of what’s possible.
“We are moving up the stack from the foundational infrastructure investments into the applications layer and we’re on the cusp of a tremendous wave of innovation.
“Global data is reported to be doubling every two years – moving from gigabytes to exabytes. This exponential growth is putting a strain on systems [such as] the utilities industry …”
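Hadoop is the platform Leeds names, and the map/reduce pattern behind it is simple to sketch. The snippet below is a minimal, hypothetical Hadoop Streaming-style job in Python that sums interval reads per feeder; the input format and field names are assumptions for illustration, not anything a utility actually runs.

```python
# Minimal Hadoop Streaming-style job (illustrative only). Input lines are assumed
# to look like "feeder_id,timestamp,kwh"; the layout is hypothetical.
import sys

def mapper():
    # Emit "feeder_id<TAB>kwh" for every interval read arriving on stdin.
    for line in sys.stdin:
        feeder, _ts, kwh = line.rstrip("\n").split(",")
        print(f"{feeder}\t{kwh}")

def reducer():
    # Streaming sorts mapper output by key, so reads for one feeder arrive together.
    current, total = None, 0.0
    for line in sys.stdin:
        feeder, kwh = line.rstrip("\n").split("\t")
        if feeder != current and current is not None:
            print(f"{current}\t{total:.3f}")
            total = 0.0
        current = feeder
        total += float(kwh)
    if current is not None:
        print(f"{current}\t{total:.3f}")

if __name__ == "__main__":
    # Run as "python job.py map" or "python job.py reduce".
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

Hadoop Streaming runs a pair of executables like these as a job’s -mapper and -reducer, sorting the mapper’s output by key in between.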
What’s the New Plastic?
Leeds said that “big data” was being regarded by some sectors, such as online retail, and even by some venture capital investors, as a new economic asset class, or even the “new plastic”, thanks to its versatility across applications.
“The game has changed and the utilities are slow to recognise this. If you talk to any financial institution or online retailer, the feeling is let’s get our hands on as much of this data as we possibly can. Give us all the data and we’ll figure it out and get some fresh observations.
“Utilities are a little more skittish – just give us what we can manage. They haven’t really accepted new architectures which allow them to do everything they want.
“The application space for big data is still light. You’re not going to find an application-specific company that does wind turbine tracking. [But] if you’re spinning turbines and not able to sell power into the grid, it’s a big grind on those assets. You will start to see those specific app companies. That will be really exciting because we need them.”
Josh Gerber, lead architect for smart grid at San Diego Gas & Electric, said that the utility already had 2.2 million communication nodes on its network. SDG&E uses large amounts of data to identify weaknesses, fire hazards or transformers that are strained by unexpected patterns in solar PV generation or demand from electric vehicles, he said.
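As a toy illustration of the kind of screen Gerber describes – not SDG&E’s actual method – the sketch below flags hypothetical transformers whose hourly loading runs above nameplate rating (heavy EV charging, say) or shows heavy reverse flow from midday solar export; the thresholds and field names are invented.

```python
# Illustrative only: a toy screen over hypothetical hourly transformer readings,
# flagging units above nameplate rating or with sustained reverse (export) flow.
# Thresholds and field names are assumptions, not SDG&E's criteria.
from dataclasses import dataclass

@dataclass
class HourlyRead:
    transformer_id: str
    kw: float            # positive = load, negative = reverse flow (net export)

def flag_strained(reads, rating_kw, overload=1.2, export=-0.5):
    """Return (id, reason) pairs exceeding 120% of rating or exporting >50% of it."""
    flagged = set()
    for r in reads:
        rating = rating_kw[r.transformer_id]
        if r.kw > overload * rating:
            flagged.add((r.transformer_id, "overload"))
        elif r.kw < export * rating:
            flagged.add((r.transformer_id, "reverse flow"))
    return flagged

# Tiny made-up example: T1 is overloaded, T2 is exporting heavily, T3 is fine.
reads = [HourlyRead("T1", 62.0), HourlyRead("T2", -30.0), HourlyRead("T3", 40.0)]
print(flag_strained(reads, {"T1": 50.0, "T2": 50.0, "T3": 50.0}))
```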
Surfing the Data Tsunami
But he warned that the “data tsunami” would need to be managed with distributed analytics to avoid conflicting responses that undermine reliability.
“There’s risk of … asynchronous responses to events on the system that create oscillations that lead to less reliability. The way I think you have to deal with the [data] tsunami is to start building some seawalls. You have to keep some of that wave closer to the edge of the network – meaning that you don’t need to backhaul every bit of data from every … sensor from across the system.
“You need to process those data closer to the edge of the network and add a layer of control so you’re distributing the processing in distribution management.”
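One way to read Gerber’s “seawalls” is report-by-exception at the edge: devices summarize raw samples locally and forward only a periodic digest or a threshold breach. The sketch below is one hypothetical way to do that in Python; the voltage band, window size and message fields are assumptions, not SDG&E’s design.

```python
# Sketch of edge-side aggregation: summarize raw samples locally and only send
# a compact record upstream, either as a periodic digest or when a reading
# leaves the allowed band. All parameters here are assumed for illustration.
import statistics
import time

class EdgeAggregator:
    def __init__(self, sensor_id, voltage_band=(114.0, 126.0), window=60):
        self.sensor_id = sensor_id
        self.low, self.high = voltage_band   # roughly +/-5% around 120 V
        self.window = window                 # samples per summary
        self.buffer = []

    def ingest(self, volts):
        """Return an upstream message for exceptions or a full window, else None."""
        self.buffer.append(volts)
        if not (self.low <= volts <= self.high):
            return {"sensor": self.sensor_id, "type": "exception",
                    "value": volts, "ts": time.time()}
        if len(self.buffer) >= self.window:
            summary = {"sensor": self.sensor_id, "type": "summary",
                       "mean": statistics.mean(self.buffer),
                       "max": max(self.buffer), "min": min(self.buffer)}
            self.buffer.clear()
            return summary
        return None   # nothing sent upstream; the raw sample stays at the edge
```

With a 60-sample window, roughly sixty raw readings collapse into a single upstream message unless something goes out of band.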
Gerber said that SDG&E preferred to cultivate its own in-house competence in smart grid tools, but faced an “arms race” with Google and Facebook for computer scientists and software developers.
“We’re looking to hire a new breed of technologist: ones who understand power systems but also have a background in computer science, who are able to become that blended smart grid technologist that we’re going to need in the future.
“We are reluctant to outsource the intelligence that we need to run our utility effectively. This arms race for data scientists – we can’t compete against the likes of Google and Facebook to get the best and brightest around data.
“I’m reluctant to outsource intelligence. It’s very inefficient to churn people through consultants and contractors to do the most challenging tech development and implementation and have them go on and supply value to the next customer.”