
The Heisenberg Uncertainty Principle of Energy Efficiency, and Aggregated NMEC in CA

Matt Golden

Last week, the California Public Utilities Commission (CPUC) released new guidance that effectively greenlights the application of Normalized Metered Energy Consumption (NMEC) to aggregate (population-level) energy efficiency programs.

Learn more about this ruling in this OpenEE blog post.

Why is this super complicated weedy regulatory stuff so important?

Think about it this way: Aggregated Normalized Metered Energy Consumption (NMEC) is the Heisenberg's uncertainty principle applied to energy efficiency.

The uncertainty principle states that you cannot know, with absolute certainty, both the position and momentum of a particle such as an electron: the more accurately you measure one of these properties, the less accurately you can know the other.

The traditional approach to energy efficiency is based on measuring every attribute, controlling for every variable, and modeling the physics of each building in an attempt to attribute savings to individual measures and, from that, model the overall impact on the greater system. Besides being very costly and complicated, it's also very poor at predicting system-level outcomes. Lots of precision, very little accuracy.

Aggregated NMEC is the opposite approach. It measures everything at the meter and generates a statistical, probabilistic model of the overall system that is an excellent predictor of outcomes at the portfolio or aggregate level, but trades off the ability to attribute savings to individual measures or to know with certainty the impact on individual buildings.
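
To make the meter-based idea concrete, here is a minimal sketch of how portfolio-level metered savings might be computed. It is a toy illustration under assumed data and a simple degree-day regression, not the measurement methodology referenced in the CPUC guidance; the function names and sample figures are invented for illustration.

```python
# Toy illustration of aggregated, meter-based savings (NMEC-style).
# NOT an approved measurement methodology -- just the core idea: fit a
# simple weather-normalized baseline per meter, predict counterfactual
# usage in the reporting period, and aggregate avoided energy use.
import numpy as np

def fit_baseline(hdd, usage):
    """Ordinary least squares: usage ~ intercept + slope * heating degree days."""
    X = np.column_stack([np.ones_like(hdd, dtype=float), hdd])
    coef, *_ = np.linalg.lstsq(X, usage, rcond=None)
    return coef  # [intercept, slope]

def metered_savings(coef, hdd_report, usage_report):
    """Counterfactual (baseline prediction) minus actual metered usage."""
    counterfactual = coef[0] + coef[1] * hdd_report
    return counterfactual - usage_report

# Hypothetical portfolio: per meter, (baseline HDD, baseline kWh, reporting HDD, reporting kWh)
portfolio = [
    (np.array([20.0, 25.0, 30.0, 15.0]), np.array([600.0, 700.0, 820.0, 520.0]),
     np.array([22.0, 28.0]),             np.array([560.0, 650.0])),
    (np.array([18.0, 24.0, 29.0, 16.0]), np.array([480.0, 590.0, 700.0, 450.0]),
     np.array([21.0, 27.0]),             np.array([500.0, 620.0])),
]

total = sum(metered_savings(fit_baseline(b_hdd, b_kwh), r_hdd, r_kwh).sum()
            for b_hdd, b_kwh, r_hdd, r_kwh in portfolio)
print(f"Portfolio avoided energy use: {total:.0f} kWh")
```

The point of the sketch is that the only inputs are weather and metered consumption; nothing about individual measures enters the calculation.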

Moving to aggregated NMEC enables performance-based regulation and accurate forecasting, and creates the confidence to bring energy efficiency into DER procurements and manage risk to attract project finance -- at drastically lower costs to all parties.

That's a big deal.


Matt Chester on February 7, 2019

I'm intrigued, Matt -- but I'll admit I'm a bit lost as well! When you say that the traditional approach to energy efficiency (measuring each attribute and controlling every variable) is precise but not accurate, can you explain why that's the case? I understand why doing that would be precise in measuring current energy efficiency, but why exactly does it perform poorly in predicting?

Matt Golden on February 13, 2019

Why is it the case? It's false precision. The overall system is much more complex than our models, with all sorts of exogenous factors that we either do not measure or can't (like pesky homeowners' behavior).

Models are terrible predictors for individual assets. It is common to have realization rates in the 50% range or worse (70% would be good, BTW!) -- meaning we only get 50% to 70% of what the model predicts. In California, the residential AHUP program has a 29% realization rate for gas savings and a 21% realization rate for electric. Ouch!

Worse than that, there is a ton of variance. Even if we were right on average (which we are not), there are tons of winners and losers... a huge spread.

See an example of the distribution of savings in a real portfolio of residential home performance projects: https://www.dropbox.com/s/ctrn37zg4u38tmy/Screenshot%202019-02-13%2006.06.59.png?dl=0
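
For a numerical illustration of realization rates and the spread described above (made-up values, not the portfolio in the linked screenshot), one might compute:

```python
# Hypothetical projects: predicted vs. metered kWh savings.
# Values are invented to illustrate realization rate and spread -- they are
# not the portfolio shown in the linked screenshot.
import numpy as np

predicted = np.array([1200, 900, 1500, 800, 2000, 1100])
metered   = np.array([ 700, 950,  400, 900,  600, 1300])

rates = metered / predicted                      # per-project realization rates
portfolio_rate = metered.sum() / predicted.sum() # portfolio-level realization rate

print(f"Per-project realization rates: {np.round(rates, 2)}")
print(f"Mean: {rates.mean():.2f}, std dev: {rates.std(ddof=1):.2f}")
print(f"Portfolio realization rate: {portfolio_rate:.2f}")
```

Even when the portfolio-level realization rate lands in the 50% to 70% range, the individual projects scatter well above and below it.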

A model is not science, even building science. An energy model is a hypothesis. Science is when you test that hypothesis against actual data in an experiment.

We need to focus on actual empirical data, and stop drinking our own building-science Kool-Aid and believing our models are the truth.

Bob Meinetz on February 13, 2019

Matt, models are never the truth. CPUC's new system will still use models, but they're based upon real-time empirical input from smart meters. There will still be winners and losers.

Whether the new system will be an improvement is hypothetical too - it's very possible to draw faulty conclusions from misapplied or generalized statistics. For example: a "population-level" approach that doesn't account for standard deviation will have a ton of variance too - likely more. Ideally, the system would still collect data from individual buildings, and even customers, and target the efficiency programs that have the best chance of benefiting each one. But that gets expensive too.
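
As a neutral way to quantify the two kinds of spread being discussed in this thread, the sketch below (simulated numbers, not real program data) compares the standard deviation across individual buildings with the standard error on the portfolio average; which of the two matters depends on whether building-level or portfolio-level accuracy is the goal.

```python
# Hypothetical illustration of the two kinds of spread being debated here:
# per-building savings vary a lot (large standard deviation), while the
# estimate of the portfolio average tightens as more buildings are pooled
# (standard error ~ sigma / sqrt(N)). Numbers are simulated, not real data.
import numpy as np

rng = np.random.default_rng(0)
n_buildings = 2000
# Simulated per-building annual savings (kWh): mean 500, lots of scatter,
# including "losers" whose usage actually goes up.
savings = rng.normal(loc=500, scale=800, size=n_buildings)

per_building_std = savings.std(ddof=1)            # spread across buildings
std_error_of_mean = per_building_std / np.sqrt(n_buildings)

print(f"Mean savings per building : {savings.mean():.0f} kWh")
print(f"Std dev across buildings  : {per_building_std:.0f} kWh")
print(f"Std error of the mean     : {std_error_of_mean:.0f} kWh")
print(f"Share of 'losers' (negative savings): {(savings < 0).mean():.0%}")
```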

The new system will undoubtedly cost less for utilities. Whether those savings are passed on to customers remains to be seen.
