Substation Digital Twin via IEC 61850 and Machine Learning: Exclusive Interview with Dr. Andre Naumann of Fraunhofer IFF
Dr. Andre Naumann
- Sep 24, 2019 7:55 pm GMT
As October’s IEC 61850 Global 2019 Conference approaches, the eyes of the utility industry are on the speakers who will share insights from the sector’s leading minds in digital utility practice. Dr. Andre Naumann is one such expert, and he will be delivering a much-anticipated presentation entitled ‘Substation Digital Twin: Leveraging IEC 61850 and machine learning to achieve advanced monitoring and simulation of substation systems.’
As a Group Leader of Energy Systems at Fraunhofer IFF, Dr. Naumann is a trusted and key voice in this field. With advancing technology in artificial intelligence, data collection, flexible energy systems, and more, this marriage of IEC 61850 with machine learning is quite an exciting topic.
Ahead of the IEC 61850 Global 2019 Conference, Dr. Naumann was gracious enough to share with me some of his thoughts and insights in a preview of his presentation, so those in the Energy Central community could learn and consider attending live:
Matt Chester: Before diving into your presentation on the substation digital twin for the IEC 61850 conference, can you provide a bit of background on yourself and your experience? How did you get involved in this topic, and why do you find it particularly important and compelling?
Dr. Andre Naumann: My interest in electrical energy systems grew during my studies. Most of my fellow students focused on automation and microelectronics. I was more interested in the energy sector, with the "real big machines” in focus. So, I came to the field of power networks during my studies and my time as a PhD student. Nevertheless, I always had some background in communication systems, starting in my school days and also as a hobby. My PhD supervisor saw that power networks and communication systems were my favorite fields. That was quite fitting, since several research projects in that field took place during my PhD studies. I was able to dive quite deep into the topics of IEC 61850 and IEC 61970 (CIM) and developed software and test frameworks that were used in the EU project "web2energy." This is also how I know Christoph Brunner. Since then, several more projects with IEC 61850 in scope have taken place, while other standards like IEC 61970, OCPP, and ISO 15118 have complemented the portfolio.
Today, things are moving more and more toward automation and user guidance, especially since the system is becoming much more complex and much more dynamic. Situations that no control center operator will be able to handle manually require assistance from sophisticated systems. These systems will rely on data from digital twins and use artificial intelligence algorithms. In my research, together with my colleagues, I develop solutions to get these technologies running in power systems. This research is also driven by a personal interest in one day having a perfectly stable, efficient, and automated power system. Imagine a power system where you add a new component simply by plugging it in at home, like a PV system or an EV charger, and the system automatically knows exactly what this component can do and adapts everything else accordingly. We call this concept "plug&run," analogous to "plug&play" from PC hardware installation. With the increasing number of decentralized components, the desire for this will grow. Of course, operators will still be needed in such a world, but they will have many more assisting systems, while the final decision is still left to the operator.
MC: Some of the needs you’ve noted as pushing you toward this topic include energy systems that are becoming increasingly complex and the need for higher levels of grid security. How do the substation digital twin and machine learning feed into those goals?
AN: For a detailed system analysis, you need appropriate models. Today, these models run in the control center, using simulators that return results on congestion analyses, possible line overloading, and so on. With increasing complexity, these models become bigger and bigger; otherwise, you lose accuracy. The idea is to outsource some calculations to the substation and perform a decentralized, local computation. This computation covers only a particular region: the substation and its neighboring substations. Relevant key parameters can be pre-calculated this way and then transmitted to the control center.
This approach also has other benefits. If communication is lost and measures need to be taken, the substation can decide on its own how to keep the system stable. Additionally, latencies can be reduced when events occur, since decisions can be made locally. The prerequisite is a very detailed model of the substation and its surrounding part of the network. Using AI, the substation can be taught to behave correctly in various situations. Depending on measured parameters, the substation can detect whether it is in a normal situation or one that requires a reaction, and the AI can also indicate which measure would be appropriate.
One of the main challenges here is creating training data for the AI. That's the reason a combination of AI and digital twin (DT) comes into play. Using the data a DT-based simulation can deliver, you can teach your AI how to react in situations that haven't been encountered in real operation. The other way around: for a digital twin to work correctly, continuous model tuning is performed, using measurement values to update the model parameters. One way to perform such tuning is based on AI. So, DTs and AI complement each other.
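The loop Dr. Naumann describes — a digital twin simulating scenarios that real operation has never produced, and an AI learning from that simulated data to classify the substation's situation — can be illustrated with a deliberately minimal sketch. Everything below (the voltage model, the thresholds, the "training" method) is an invented toy assumption, not anything from the actual Fraunhofer IFF system:

```python
import random

# Toy "digital twin": a crude voltage model for a single feeder.
# The nominal voltage and sensitivity are illustrative assumptions.
def dt_voltage(load_mw, nominal_kv=110.0, sensitivity=0.8):
    """Simulated bus voltage, dropping as feeder load rises."""
    return nominal_kv - sensitivity * load_mw

def generate_training_data(n=1000, seed=42):
    """Use the DT to simulate many scenarios, including overloads
    that may never have been encountered in real operation."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        load = rng.uniform(0.0, 25.0)  # MW; range includes overload cases
        v = dt_voltage(load)
        # Ground-truth label comes from the simulation, not from field data.
        label = "needs_reaction" if v < 99.0 else "normal"
        samples.append((v, label))
    return samples

def learn_threshold(samples):
    """'Train' a minimal classifier: find the voltage threshold that
    separates the two classes in the simulated data."""
    normal_vs = [v for v, lbl in samples if lbl == "normal"]
    bad_vs = [v for v, lbl in samples if lbl == "needs_reaction"]
    return (min(normal_vs) + max(bad_vs)) / 2.0

samples = generate_training_data()
threshold = learn_threshold(samples)

def classify(measured_kv):
    """Substation-local decision: normal, or a situation needing reaction."""
    return "needs_reaction" if measured_kv < threshold else "normal"
```

A real system would of course use far richer models and learning methods; the point of the sketch is only the data flow: DT simulation produces labeled scenarios, the learned classifier then runs locally in the substation.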
MC: What are some of the challenges you’ve run into with utilities trying to implement artificial intelligence as a tool for substation systems?
AN: For most people, AI is a black box: some data goes in, some result comes out, and the results are often quite good (depending on training data), but no one knows exactly why. Especially in the field of power supply, this is not a feasible path, because decisions must be traceable; that's one of the basic paradigms. Additionally, substations today face less complex tasks: protection devices work with characteristic curves or arrays, and automation tasks are defined using logic blocks. Historically, station controllers also do not provide optimal frameworks for AI implementation, although the computing power might be sufficient. So, the mix of black-box characteristics and the not-so-urgent need for AI is one aspect. Another is the lack of experience with AI applications in substations, both in real systems and in lab test systems. No one can really say at the moment how stably and safely these systems will work, and as long as there is no sufficient level of reliability, adoption won't take place.
A third aspect is the necessity of information exchange between substations and the control center. For a DT-based system to work properly, the DT models in the substation and the control center must be continuously synchronized, so that in case of an event both models know what the other components are doing (especially in case of communication loss). The same is true for parts of the AI. At the moment, there are no standardized models allowing such detailed data exchange, and no rules describing which data really needs to be transferred and which is only needed locally.
These are the aspects we are addressing. The typical black-box image of AI is only partly true: AI today can be integrated using methods that clearly show why decisions were made, and it can incorporate algorithms that are clearly based on analytical methods and predefined characteristics. Gaining experience with AI in substations, and making clear statements about its reliability and safety, is the current focus of our research projects. One example is the German project "HyLITE," together with Fraunhofer IOSB-AST, Technical University Ilmenau, and SIEMENS, in which we use a substation digital twin combined with a control center and investigate the possibilities and limits of AI. The rules for data exchange for AI and DTs are also in the scope of this project, which addresses the IEC 61850 standard as well.
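The traceability requirement Dr. Naumann mentions — decisions that show why they were made, built on predefined characteristics rather than a black box — can be sketched as a rule-based decision model whose output carries its own explanation. The rules, thresholds, and measures below are invented placeholders, not any utility's actual logic:

```python
# Traceable (non-black-box) decision sketch: every classification
# returns the rule that produced it, so an operator can audit it.
# Thresholds and measures are hypothetical examples.

RULES = [
    # (human-readable description, condition, resulting measure)
    ("line loading > 95 %", lambda m: m["loading_pct"] > 95.0, "shed flexible load"),
    ("bus voltage < 0.94 pu", lambda m: m["voltage_pu"] < 0.94, "raise tap changer"),
    ("frequency deviation > 0.2 Hz", lambda m: abs(m["freq_hz"] - 50.0) > 0.2, "activate reserve"),
]

def decide(measurements):
    """Return (measure, explanation); the explanation names the fired rule."""
    fired = [(desc, measure) for desc, cond, measure in RULES if cond(measurements)]
    if not fired:
        return "no action", "all rules within normal band"
    desc, measure = fired[0]  # first matching rule wins in this simple sketch
    return measure, f"rule fired: {desc}"

action, why = decide({"loading_pct": 97.5, "voltage_pu": 0.99, "freq_hz": 50.0})
```

A learned model (e.g., a decision tree) can be exported into exactly this kind of rule list, which is one way the "clearly show why decisions have been made" requirement is commonly met.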
MC: How far along would you say digital twin implementation is for the typical utility? Is this technology being made a priority, or do you find you still need to convince decision makers that this is necessary to invest in today?
AN: Some basic approaches to DTs are slowly arriving in some control centers. This can also be seen with the vendors of control center systems, who promote the benefits a DT can bring. DTs are mainly used there for product lifecycle management and are also moving in the direction of system state calculation. The hurdle for utilities should not be too big here, since most already have detailed system models and simulations running that can be extended into DTs. Surely some models need to be more detailed, especially if dynamic behavior is the focus. On the substation side, no real solutions are running at the moment, since none are available yet. So, on the one hand, much more development must be done here; on the other, research and development only works if requirements, experience, and feedback come from utilities. In that respect, a lot of convincing of decision makers is still necessary. From my perspective, DTs won't be that important over the next five years, but looking 20 years into the future, this will be one of the basic technologies for handling system complexity. At Fraunhofer IFF, we're always happy when a utility brings us a specific question or use case that might be interesting for DT or AI solutions.
MC: The IEC 61850 conference is going to be big for you as a presenter of these important topics, but I’m sure you’re also eager to attend and learn from others. Are there any particular topics you’re eager to hear more about that may be outside of your specific expertise?
AN: There are several aspects I'm interested in. The first is the progress in the unification of standards like IEC 61850 and IEC 61970. This issue is interesting for me because I got in contact with both standards and developed the first platform handling both. The solution built at that time was a kludge, parametrized manually for the specific use case in scope. A more standardized way enabling automatic translation is something I'm still missing. Although this is not my project focus at the moment, it is of personal interest and will be a very important piece of future energy systems.
Another issue is, of course, what I mentioned previously: how far are we in the realization of the "plug&run" concept, the vision of adding or exchanging a device in the system and letting it announce what it is and which features it has, so the complete system knows about it and behaves accordingly? It would be interesting for me to hear what’s left to do. Maybe there is something there we can also focus on in our research.
And the third thing, which is part of the issue above: how far are we really, in practice, with model-based IEC 61850 integration using SCL and enabling multi-vendor systems?
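The model-based integration mentioned here rests on SCL (System Configuration Language), the XML configuration language defined in IEC 61850-6, in which a device describes its logical devices and logical nodes — the self-description that the "plug&run" vision builds on. As a rough illustration, a hand-written, heavily simplified SCL fragment can be parsed to list what a device announces about itself; the fragment and node names below are illustrative, not a complete or valid configuration:

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written SCL fragment (illustrative only; real SCL
# files carry far more structure: DataTypeTemplates, Communication, etc.).
SCL = """<SCL xmlns="http://www.iec.ch/61850/2003/SCL">
  <IED name="PROT1">
    <AccessPoint name="AP1">
      <Server>
        <LDevice inst="LD0">
          <LN0 lnClass="LLN0" inst="" lnType="LLN0_T"/>
          <LN lnClass="PTOC" inst="1" lnType="PTOC_T"/>
          <LN lnClass="XCBR" inst="1" lnType="XCBR_T"/>
        </LDevice>
      </Server>
    </AccessPoint>
  </IED>
</SCL>"""

NS = {"scl": "http://www.iec.ch/61850/2003/SCL"}

def list_logical_nodes(scl_text):
    """Return (IED name, LDevice inst, lnClass) for every logical node,
    i.e., the capabilities a device declares in its configuration."""
    root = ET.fromstring(scl_text)
    nodes = []
    for ied in root.findall("scl:IED", NS):
        for ld in ied.iter("{http://www.iec.ch/61850/2003/SCL}LDevice"):
            for ln in ld:
                ln_class = ln.get("lnClass")
                if ln_class:
                    nodes.append((ied.get("name"), ld.get("inst"), ln_class))
    return nodes
```

In this fragment, the IED "PROT1" declares an overcurrent protection function (PTOC) and a circuit breaker (XCBR); a multi-vendor engineering tool works from exactly this kind of declared model rather than from vendor-specific formats.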
One more thing, which is relevant for some of our other projects focusing on intelligent grids: what is the current state of standardization with respect to the new components to be actively integrated into the power system, like HVDC systems, controllable loads both large and small, and aggregated loads like electric vehicles and batteries?
If you’re interested in hearing more about Dr. Naumann’s insights into machine learning, digital twin, and IEC 61850, be sure to check out his presentation at the IEC 61850 Global 2019 conference, taking place from October 14 to 18 in London. You can check out the agenda and register for the conference here.