Grid modernization: The next generation
We are moving very quickly, for an industry often compared to dinosaurs, from business intelligence to analytics. What I mean by this is that business intelligence has historically involved taking a backward, or forensic, look, using mainly reactive information. (That is: what happened, and what can we learn from it?)
Now, we are moving, out of necessity, to a more proactive look, aided by the influx of what has been labelled Big Data.
We have the data. Now, how can we use it both for business intelligence and, in near-real-time, to respond more quickly to outage and reliability situations, as an example -- and even to prevent the problems from occurring in the first place?
In data mining, the key term is "discovery" -- or detecting something new. To take it one step further, it's the ability to extract previously unknown, interesting patterns. One example would be anomaly detection. For an electric utility, this might mean the ability to detect theft by automatically flagging anomalies in usage, or the ability to flag an overstressed transformer and be able to proactively replace it if necessary.
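As a minimal sketch of the anomaly detection idea, consider flagging any meter reading that falls several standard deviations from a customer's own typical usage. The function name, the threshold, and the sample readings below are illustrative assumptions, not any vendor's actual implementation:

```python
from statistics import mean, stdev

def flag_anomalies(readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations
    from this customer's own mean usage (illustrative z-score test)."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [i for i, r in enumerate(readings)
            if abs(r - mu) / sigma > threshold]

# Hypothetical daily kWh with one sudden drop (a possible meter bypass)
daily_kwh = [30, 31, 29, 32, 30, 31, 2, 30, 29, 31, 30, 32]
print(flag_anomalies(daily_kwh))  # → [6], the index of the suspicious day
```

A production system would of course use seasonality-aware baselines rather than a single mean, but the principle is the same: let the customer's own history define "normal."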
In the grocery business, data mining has allowed for "market basket analysis." That is, using association rule learning, the grocery store can determine which products are frequently purchased together, and then use this information for marketing campaigns. In the utility business, we don't have this kind of marketing leverage, unless you happen to be a distributor in a deregulated market, but this kind of market basket analysis can be used in other ways, as well.
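To make "association rule learning" concrete, here is a toy sketch that mines A-implies-B rules from transaction baskets using the two standard measures, support (how often the pair appears) and confidence (how often B appears given A). The basket contents and thresholds are invented for illustration:

```python
from itertools import combinations
from collections import Counter

def pair_rules(baskets, min_support=0.3, min_confidence=0.6):
    """Mine simple A -> B association rules from transaction baskets."""
    n = len(baskets)
    item_counts = Counter(i for b in baskets for i in set(b))
    pair_counts = Counter(p for b in baskets
                          for p in combinations(sorted(set(b)), 2))
    rules = []
    for (a, b), cnt in pair_counts.items():
        support = cnt / n                      # fraction of baskets with both
        if support < min_support:
            continue
        for lhs, rhs in ((a, b), (b, a)):
            confidence = cnt / item_counts[lhs]  # P(rhs | lhs)
            if confidence >= min_confidence:
                rules.append((lhs, rhs, round(support, 2), round(confidence, 2)))
    return rules

baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"bread", "milk"},
    {"butter", "milk"},
    {"bread", "butter", "jam"},
]
for lhs, rhs, sup, conf in pair_rules(baskets):
    print(f"{lhs} -> {rhs}  support={sup} confidence={conf}")
```

For a utility, the "items" might instead be appliance-level usage signatures or program enrollments co-occurring across households.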
Consider, for example, the opportunities afforded by knowing the hours in which your customers are most likely to be using a whack of electricity, and targeting the heaviest residential users with a program designed to shift their load. TXU Energy recently launched a program it's calling TXU Energy Free Nights 24℠, a new electricity plan that waives customers' nighttime energy charges between the hours of 10 p.m. and 6 a.m. every night for two years.
In announcing the program, Michael Grasso, TXU Energy's chief marketing officer, said, "We learned a lot from last summer's record heat, when daytime temperatures were scorching. Our customers who shift and save can generate cost savings for themselves and support the greater good by removing demand from the Texas power grid during peak consumption hours." Given this summer is projected to be even hotter than usual in Texas, it's a proactive step, and it'll be interesting to watch the customer response.
But imagine being able to offer niche programs tailored to specific groups of customers, programs that take into account their particular energy usage patterns and help both them and the utility save money in the process. It's possible, with the right analytics (and the appropriate regulatory approvals).
That's where data clustering comes into play, automatically discovering the particular segments or groups within your greater customer data set.
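A bare-bones sketch of that clustering step, assuming each customer is represented by an invented two-number load profile (daytime kWh, nighttime kWh), might use plain k-means; the profile values and the choice of k=2 are purely illustrative:

```python
import random

def kmeans(points, k, iters=50, seed=42):
    """Plain k-means: group customers by their load profiles."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centroid (squared distance)
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster
        centroids = [
            [sum(dim) / len(cl) for dim in zip(*cl)] if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return clusters

# Hypothetical (daytime kWh, nighttime kWh) profiles per customer
profiles = [(8, 2), (9, 1), (7, 3),     # daytime-heavy users
            (2, 9), (1, 8), (3, 10)]    # nighttime-heavy users
group_a, group_b = kmeans(profiles, k=2)
```

Real segmentation would run on full 24-hour (or interval-level) profiles and choose k empirically, but the discovered groups are exactly the candidates for a targeted program like the one above.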
As a utility, you have useful, largely unexamined data. In saying this, I'm not telling you anything new, just reinforcing it. Given the time, resources and capital to examine that data fully, and to set automated processes to work on it, the data itself suggests what to do next. This is what I mean by Grid Modernization: The Next Generation.
I read something recently that noted: "Data analysis is not about numbers -- it uses them. Data analysis is about the world, asking, always asking, 'How does it work?' And that's where data analysis gets tricky."
That was written in 1996. It's still tricky.
Many utilities are now leading data analysis projects, or enterprise information management projects, or both. And each utility is choosing the areas it wants to analyze first -- those short-term wins that help the long-term business case. Business value and needs drive the types of applications you, as utilities, will choose to employ, and in what order.
We are all, ultimately, storing, managing and mining data for business value, safety, power quality, asset management and customer behavior. Behind it all? Business, asset and capital optimization.
My colleague Phil Carson wrote, at the beginning of this year, that utilities were finding that Big Data could require Big Aspirin and, possibly, counselling. He posed the following question: "If myriad devices and systems on your grid produce Big Data, which is the most crucial data stream to your business to analyze? Which data challenges are cost-effective to tackle, and in what order?"
In her "Five Analytical Forecasts for 2012," Utility Analytics Institute senior analyst Christine Richards noted that utilities will spend a great deal of money looking for critical insights in reams of data.
For Intelligent Utility magazine, it's my job to look for the utilities on the front lines with data analytics and enterprise information management projects, and to report on them, along with early lessons learned and the grand "ah ha!" moments, so that other companies may share in the learning.
Customer value is a key business value segment, and one that can give electric utilities with AMI data some early wins. It makes sense, therefore, that many utilities look to both customer value and operational value when seeking those early analytics wins.
On the customer value side, a utility can analyze customers' energy use data, or pinpoint potential causes of high energy bills, building its standing as "trusted energy advisor." On its own side of the fence, it can also analyze customer payment patterns to pinpoint late-paying customers who may need additional help, or better energy conservation information.
Finally, the data can be used specifically for much more granular customer segmentation.
On the operations side, outage management systems are now pulling in SCADA data, GIS data, and even individual meter and transformer data in order to quickly pinpoint and analyze outages and dispatch appropriate crews and equipment. Utilities are also using meter and transformer load data, overlaid with weather data, to pinpoint assets in need of closer monitoring for failure, or early replacement. Not only is this a good use of capital, but it maintains reliability in a much more proactive way.
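That load-plus-weather overlay can be sketched very simply: flag transformers that repeatedly run near nameplate rating on hot days, when overloading does the most insulation damage. Every name, threshold, and data value below is a made-up assumption for illustration:

```python
def flag_stressed_transformers(loads, temps, rating_kva,
                               load_factor=0.9, hot_temp=35.0):
    """Flag transformers that repeatedly run at >= load_factor of
    nameplate rating on days at or above hot_temp (illustrative rule)."""
    flagged = []
    for xfmr_id, daily_load in loads.items():
        hot_overloads = sum(
            1 for load, temp in zip(daily_load, temps)
            if temp >= hot_temp and load >= load_factor * rating_kva[xfmr_id]
        )
        if hot_overloads >= 2:   # repeated hot-day stress -> watch list
            flagged.append(xfmr_id)
    return flagged

temps = [36.0, 38.0, 33.0, 37.0]       # hypothetical daily peak temps, deg C
loads = {"T1": [45, 48, 30, 47],       # hypothetical daily peak kVA
         "T2": [20, 22, 21, 23]}
rating = {"T1": 50, "T2": 50}
print(flag_stressed_transformers(loads, temps, rating))  # → ['T1']
```

A real asset-health model would weigh loading duration and cumulative thermal aging, but even this crude rule shows how two discrete data streams combine into a proactive replacement signal.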
And, of course, there's the option to use data analysis in the forecasting of energy usage, in order to optimize power purchases.
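At its simplest, such a forecast can be a moving average over recent load history; the function name and the sample MW figures here are assumptions, and real load forecasting layers in weather, calendar, and trend terms on top of a baseline like this:

```python
def moving_average_forecast(hourly_load, window=3):
    """Naive next-hour load forecast: mean of the last `window` hours."""
    if len(hourly_load) < window:
        raise ValueError("not enough history for the chosen window")
    return sum(hourly_load[-window:]) / window

history = [510, 545, 560, 580, 575, 590]   # hypothetical hourly MW
print(moving_average_forecast(history))    # mean of the last three hours
```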
In the past, we've worked by taking the data in each area (MDM, SCADA, OMS, etc.) and putting it in its own star schema, or summary. These were all discrete data environments.
For the next step, unstructured analytics, that's not enough. All the nitty-gritty, important "ah ha" details come from mining all of it together, and finding the patterns within it. This, indeed, is the future of analytics.
Energy Central's Utility Analytics Institute is hosting Utility Analytics Week, Sept. 18-20, in Arlington, TX.