Big Data from the Smart Grid tells utilities more than they want to know
A two-part article:
As the electric grid gets smarter, vast quantities of data are arriving at utility companies that have no idea what to do with them, according to electric industry experts who gathered in Chicago this morning.
Electric utilities already possessed 194 petabytes of data by 2009, according to one estimate (the entire collection of the Library of Congress is believed to amount to about 3 petabytes), and every day more terabytes are showing up at utility company data centers nationwide.
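The article's own figures make the scale comparison easy to check (both numbers are estimates cited above, not measurements):

```python
# Ratio of the 2009 utility-data estimate to the Library of Congress
# estimate, both taken from the figures cited in the article.
UTILITY_DATA_PB = 194   # petabytes held by electric utilities (2009 estimate)
LOC_PB = 3              # estimated size of the Library of Congress collection

ratio = UTILITY_DATA_PB / LOC_PB
print(f"Roughly {ratio:.0f} Libraries of Congress")  # → Roughly 65 Libraries of Congress
```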
For example, smart grid investment grants have driven the installation of phasor measurement units (PMUs) that collect voltage, current and digital status measurements as often as 30 times per second at many points along the grid. The data is supposed to help utilities manage the grid better, especially as renewable sources contribute more power.
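A back-of-the-envelope sketch shows why that sampling rate adds up to terabytes. The 30-samples-per-second figure is from the article; the bytes-per-sample and fleet-size numbers below are illustrative assumptions, not reported values:

```python
# Rough PMU data-volume estimate. Only SAMPLES_PER_SECOND comes from the
# article; the other parameters are assumed for illustration.
SAMPLES_PER_SECOND = 30       # PMU reporting rate cited in the article
BYTES_PER_SAMPLE = 100        # assumed: phasor values plus status flags
NUM_PMUS = 1000               # assumed fleet size for a large utility
SECONDS_PER_DAY = 86_400

bytes_per_day = SAMPLES_PER_SECOND * BYTES_PER_SAMPLE * NUM_PMUS * SECONDS_PER_DAY
print(f"{bytes_per_day / 1e12:.2f} TB/day")  # → 0.26 TB/day, i.e. ~95 TB/year
```

Even with these modest assumptions, a single utility's PMU fleet approaches a hundred terabytes a year, which is consistent with the "terabytes of data" described below.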
“The amount of data this is generating is phenomenal. It’s generating terabytes of data,” said Paul Myrda of the industry-sponsored Electric Power Research Institute, during a panel at the Great Lakes Symposium on Smart Grid and the New Energy Economy, which concluded today at the Illinois Institute of Technology.
“There’s a lot of data there; what do you do with it? How do you take advantage of it? How do you manage it? At what point does it become not useful?”
EPRI recently surveyed utilities to find out what they’re doing with the data.
“By and large right now, most of the utilities that have access to this information are acquiring it, they’re keeping it forever, few of them have any idea of how they’re going to archive it or move it offline. Any decimation of the data is unclear at this point,” Myrda said.
“The other problem is that they do not have an easy way to link the data to what the system state was at any point in time, so you have voltage and current measurements without the topology associated with them. What good is it after a day, two days, a week, a year?”
Myrda has been an advocate for the installation of PMUs, he said, and he has followed the development of software that utilizes the data. While some applications have been developed, he said, “there’s no killer app.”
“These are real problems. There’s real issues, and what I’ve just articulated is one dimension of the problem around big data and utilities.”
Another dimension involves information technology. Most utilities have equipped themselves with state-of-the-art computers, databases and networks—today’s best-practices in information-technology architecture, said Dan Rosanova, a senior technology architect with the consulting firm West Monroe Partners.
Today’s IT architecture works well for today’s common uses, which often involve accessing processors and data over networks to solve particular problems.
But the kind of big data utilities are collecting now would overwhelm today’s networks, Rosanova said. For example, a single calculation might require pulling 4 terabytes of data across a network.
“This is where the centralized model, which serves us well for most of what we’re doing now, falls apart when you get into big data. And that’s with 4 terabytes. Wait until you get to these petabyte scales that we were talking about earlier.”
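To see why 4 terabytes strains a centralized model, it helps to estimate the raw transfer time. The 4 TB figure is Rosanova's; the link speeds below are assumed for illustration:

```python
# Time to move 4 TB over a network link at several assumed speeds.
# Ignores protocol overhead, congestion, and disk bottlenecks, so real
# transfers would be slower.
DATA_BYTES = 4e12  # the 4 TB calculation Rosanova describes

for gbps in (1, 10, 100):
    seconds = DATA_BYTES * 8 / (gbps * 1e9)   # bytes -> bits, then divide by bit rate
    print(f"{gbps:>3} Gb/s: {seconds / 3600:.1f} hours")
# →   1 Gb/s: 8.9 hours
# →  10 Gb/s: 0.9 hours
# → 100 Gb/s: 0.1 hours
```

Hours per calculation on a gigabit link is why the usual approach for data at this scale is to move the computation to the data rather than the data to the computation.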
So architects like Rosanova have to plan a new kind of IT infrastructure, but they still don’t know exactly what demands it will have to meet:
“We don’t know how we’re going to use all of this data yet,” he said. “And a lot of the tools we’re going to use probably haven’t been invented yet, to be quite frank. The reason big data is such a big challenge is not only do we not know what tools we’re going to use, but the future demand on storage capabilities is pretty unclear at this point. We just know that they’re big.”
“This is a bigger challenge than most people are prepared for at this time.”
Myrda believes the challenge represents opportunity for engineers to find applications for the data and for large utilities to collaborate.
“Maybe some engineers could make a buck or two by selling their app,” he said. “This is a place where we can leverage the industry at large and some of the most innovative talent that’s out there.”
If you’re an electricity thief, watch out for the Smart Grid.
“Utilities have been using the Smart Grid data to find all kinds of creative ways that people have been stealing electricity,” said Mel Gehrs, a Smart Grid expert with Silver Spring Networks, yesterday at The Great Lakes Symposium on Smart Grid and the New Energy Economy hosted by the Illinois Institute of Technology.
“That’s by far the easiest (use of the Smart Grid) to monetize. I think ComEd said in 2011 their theft alone was over $60 million.”
But for the most part, utilities have yet to realize the potential of the flood of new data that has begun flowing to them from the power grid, panelists agreed. And in some cases, they may not welcome it.
“Smart Grid data is beginning to give us a view of what the customer is actually experiencing, something that we’ve never ever seen before,” Gehrs said. “Now we’ll tell you that some utilities are not thrilled to know this, because it’s kind of a two-edged sword, because then you have to go out and fix it if it’s bad.
“So there’s an important paradigm in big data. Be careful what you come up with. It may not be as well received as you might think.”
For example, utilities are now able to document the power loss on lines that leave the substations and head to homes and businesses. Sometimes, that data could prompt expensive upgrades or redesigns.
When utilities don’t undertake those efforts voluntarily, they may hear from the North American Electric Reliability Corporation (NERC).
“There are tons and tons of problems out there, so what are the big fish, what are the things that we need to work on first?” asked Jessica Bian, the director of performance analysis for NERC. “We use the data to identify the big fish and then we work on a solution to fix the big fish. To me that’s where big data helps us.”
Gehrs described other uses of big data that utilities have already embraced:
Outage Response: Instead of waiting for customers to report outages, utilities now receive reports from the grid itself. That can be especially useful in the middle of the night, Gehrs said, when customers may sleep through an outage. The Smart Grid can report the outage, document its recovery in real-time and isolate locations of physical damage.
Renewables Reliability: The Smart Grid has helped utilities better understand when they can depend on power from solar facilities distributed throughout the grid. Many homes have solar panels in Hawaii, for example, where Smart Grid data helped narrow expectations for solar performance. “Even though the sunlight is from 7:00 to 7:00, the generation is much narrower than that,” Gehrs said. “It’s around noon that they get a lot of generation. By about 4:00 p.m., the sun angle is so low in the sky relative to the roof angle that you don’t get nearly as much generation as you might think.”
Forecasting: When Oak Park, IL was considering installing LED street lights, Gehrs was able to use Smart Grid data to document where those new installations could plug into and communicate with the existing Smart Grid.
“I feel every day like I’m wrestling this big data to the ground, and I’ve got to wrestle it in a way that people understand it,” Gehrs said.
Sounds like they do not know what they do not know.
Holy Shiat... I think this is the first time I've seen a LoC defined in bytes....
3 petabytes seems small enough that we'll be able to carry the entire library of congress in our pockets within 20 years.