Prof. Dr.-Ing. Luigi Vanfretti
iTesla Workshop, Brussels, Belgium, Nov. 4-5, 2015
iTesla RaPId - Rapid Parameter Identification: A Toolbox for Model Validation and Identification
[email protected], Professor, Docent, Electric Power Systems Dept., KTH, Stockholm, Sweden
[email protected], Advisor, Research and Development Division, Statnett SF, Oslo, Norway
Outline
• Recap
– The iTesla Power Systems Modelica Library (iPSL)
– Where is Modelica used in iTesla? And where does RaPId fit?
• Why Model Validation
– Motivation
– Model validation loop
– Validation levels
– Requirements for a model validation SW tool
• iTesla RaPId
– Model validation software architecture based on Modelica tools and FMI technologies
– RaPId intro and demos
• Conclusions and Recommendations
Modeling, Simulation Tools and Model Validation
Current/new tools will need to perform simulations:
• Of complex hybrid model components and networks with a very large number of continuous and discrete states.
• Models need to be shared, and simulation results need to be consistent across different tools and simulation platforms…
Assume:
• Even if models could be “systematically shared at the equation level”, and simulations were “consistent across different SW platforms”, we would still need to validate each new model (new components) and calibrate the model to match reality.
iPSL:iTesla Power Systems Modelica Library
• The iTesla Power Systems library was developed using domain-specific software tools (e.g. PSS/E, Eurostag, PSAT and others) as a reference.
• The library has been used with several Modelica-compliant software tools: OpenModelica, Dymola, SystemModeler, JModelica, etc.
• Components and systems are validated against proprietary tools and one OSS tool used in power systems (domain-specific).
Model Editing in OpenModelica
Reminder: models are used as a key enabler of the iTesla Toolbox!
Sampling of stochastic variables
Elaboration of starting network states
Impact Analysis (time domain simulations)
Data mining on the results of simulation
Data acquisition and storage
Merging module
Contingency screening (several stages)
Time domain simulations
Computation of security rules
Synthesis of recommendations for the operator
External data (forecasts and snapshots)
Improvements of defence and restoration plans
Offline validation of dynamic models
Data management
Data mining services
Dynamic simulation
Optimizers
Graphical interfaces
Modelica use for time-domain simulation
Why “Model Validation”?
• iTesla tools aim to perform “security assessment” (SA).
• The quality of the models used by off-line and on-line tools will affect the result of any SA computations.
– Good model: the simulated response approximates the “measured response” as closely as possible.
• Validating models helps in obtaining a model with “good sanity” and “reasonable accuracy”:
– Increasing the capability of reproducing actual power system behavior (better predictions).
[Figure: ΔP (pu) vs. time (sec), comparing the measured response and the model response]
US WECC Break-up in 1996
BAD Model for Dynamic Security Assessment!!!
The Model Validation Loop
• The major ingredients of the model validation loop below have been incorporated into the proposed general framework for validation:
• A model can NEVER be accepted as a final and true description of the actual power system.
– It can only be accepted as a suitable (“Good Enough”) description of the system for specific aspects of interest (e.g. small-signal stability (oscillations), etc.)
– Model validation can provide confidence in the fidelity and quality of the models for replicating system dynamics when performing simulations.
– Calibrated models will be needed for systems with more and more dynamic interactions.
[Diagram: the model validation loop]
• (1) Prior knowledge (experience) feeds experiment design (staged tests), parameter tuning, and the choices below.
• (2) Choice of model structure: the assumed model structure(s) (the model of the system during the test) are simulated to produce the system simulated response.
• (3) Criterion of fit: is the model fitting the data? Does it serve the purpose of the model? The chosen criterion of fit (performance indicator) compares the simulated response against the data (system measured response) to validate the model.
• OK: the model meets the performance criteria/indicator.
• Not OK: revise from (1), (2), or (3).
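The validation loop can be sketched in a few lines of Python. This is a minimal, illustrative skeleton only: the candidate model structures, the criterion of fit, and the acceptance threshold are toy placeholders, not part of RaPId.

```python
def validation_loop(measured, candidate_structures, criterion, threshold):
    """Iterate over candidate model structures until one meets the criterion.

    measured: system measured response (list of samples)
    candidate_structures: callables mapping a time index to a simulated sample
    criterion: function(measured, simulated) -> scalar misfit (lower is better)
    threshold: acceptance level for the performance indicator
    """
    for model in candidate_structures:
        simulated = [model(t) for t in range(len(measured))]
        fit = criterion(measured, simulated)
        if fit <= threshold:
            return model, fit      # OK: model meets the criterion
    return None, None              # Not OK: revise knowledge/structures/criterion

# Toy example: "measured" data from a linear trend, two candidate structures.
measured = [2.0 * t for t in range(10)]
candidates = [lambda t: 3.0 * t, lambda t: 2.0 * t]
sse = lambda m, s: sum((a - b) ** 2 for a, b in zip(m, s))
best, fit = validation_loop(measured, candidates, sse, threshold=1e-9)
```

In practice the "simulate" step is a full time-domain run and the revision step may change the experiment design as well as the model structure, but the control flow is the same.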
Different Validation Levels
• Component-level response
– e.g. a generator such as a wind turbine or PV panel
• Cluster-level behavior
– e.g. a generation cluster such as a wind or PV farm
• System-level behavior
– e.g. power system small-signal dynamics (oscillations)
[Diagram: example multi-machine system with generators G1-G9]
What is required from a SW architecture for model validation?
Models
Static Model
Standard Models
Custom Models
Manufacturer Models
System Level Model Validation
Measurements
Static Measurements
Dynamic Measurements
PMU Measurements
DFR Measurements
Other
Measurement, Model and Scenario Harmonization
Dynamic Model
SCADA Measurements, Other EMS Measurements
Static Values:
- Time stamp
- Average measurement values of P, Q and V
- Sampled every 5-10 sec
Time Series:
- GPS time-stamped measurements
- Time-stamped voltage and current phasor measurements
Time Series with a single time stamp:
- Time stamp in the initial sample; the sampling frequency is used to determine the time stamps of the other points
- Three-phase (ABC) voltage and current measurements
- Other measurements available: frequency, harmonics, THD, etc.
Time Series from other devices (FNET FDRs or similar):
- GPS time-stamped measurements
- Single-phase voltage phasor measurement, frequency, etc.
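As an illustration only, the harmonization step can be thought of as mapping each measurement source into one common time-series record. The class and field names below are hypothetical and do not reflect iTesla's actual data schema.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class HarmonizedSeries:
    """Common record that the different measurement sources map into."""
    source: str            # e.g. "PMU", "DFR", "SCADA" (illustrative labels)
    t0: float              # time stamp of the first sample (s)
    fs: float              # sampling frequency (Hz)
    quantity: str          # e.g. "V_phasor_mag", "P", "frequency"
    values: List[float]

    def timestamps(self) -> List[float]:
        # For sources that carry only the initial time stamp, the sampling
        # frequency determines the time stamps of the remaining points.
        return [self.t0 + k / self.fs for k in range(len(self.values))]

pmu = HarmonizedSeries("PMU", t0=0.0, fs=50.0, quantity="V_phasor_mag",
                       values=[1.0, 1.01, 0.99])
```

Once all sources are expressed in the same record, downstream simulation and fitness computations can treat them uniformly.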
Scenario
Initialization
State Estimator Snapshot
DynamicSimulation
Limited visibility into custom or manufacturer models will by itself limit the methodologies that can be used for model validation.
• Support “harmonized” dynamic models
• Process measurements using different DSP techniques
• Perform simulation of the model
• Provide optimization facilities for estimating and calibrating model parameters
• Provide user interaction
Coupling Models with Simulation & Optimization: FMI and FMUs
• FMI stands for Functional Mock-up Interface:
– FMI is a tool-independent standard to support both model exchange and co-simulation of dynamic models using a combination of XML files and C code; it originated in the automotive industry.
• FMU stands for Functional Mock-up Unit:
– An FMU is a model which has been compiled according to the FMI standard definition.
• What are FMUs used for?
– Model Exchange
• Generate C code of a model as an input/output block that can be utilized by other modeling and simulation environments.
– FMUs of a complete model can be generated in one environment and then shared with another environment.
• The key idea to understand here is that the model is not locked into a specific simulation environment!
• We use FMI technologies to build RaPId.
The FMI Standard is now supported by 40 different simulation tools.
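To make the model-exchange idea concrete, here is a minimal sketch in which a hand-written Python class stands in for a compiled FMU. A real workflow would instead load an actual FMU through an FMI library (for example FMPy or PyFMI), so everything below is illustrative.

```python
class MockFMU:
    """Stand-in for an FMU: an input/output block with parameters and state.

    Not a real FMU -- it only mimics the set-parameter/step/read protocol
    that the FMI standard defines for model exchange.
    """
    def __init__(self, gain):
        self.gain = gain    # tunable model parameter
        self.x = 0.0        # one continuous state

    def do_step(self, u, dt):
        # Toy first-order lag dx/dt = gain*u - x, integrated with explicit Euler.
        self.x += dt * (self.gain * u - self.x)
        return self.x

# Any environment that speaks the same protocol can drive the block:
# the model is not locked into a specific simulation tool.
def simulate(fmu, inputs, dt):
    return [fmu.do_step(u, dt) for u in inputs]

trace = simulate(MockFMU(gain=2.0), inputs=[1.0] * 5, dt=0.1)
```

The point of the sketch is the interface, not the model: because the block exposes only parameters, a step function, and outputs, the same "FMU" can be embedded in Simulink, a Modelica tool, or a plain script.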
User Target (server/PC)
Model Validation Software
iTesla WP2 Inputs to WP3: Measurements & Models
Mockup SW Architecture: Proof of concept of using MATLAB+FMI
EMTP-RV and/or other HB model simulation traces and simulation configuration
PMU and other available HB measurements
SCADA/EMS Snapshots + Operator Actions
MATLAB/Simulink (used for simulation of the Modelica model in FMU format)
FMI Toolbox for MATLAB (with Modelica model)
Model Validation Tasks:
Parameter tuning, model optimization, etc.
User Interaction
.mat and .xml files
HARMONIZED MODELICA MODEL: Modelica Dynamic Model Definition for Phasor Time Domain Simulation
Data Conditioning
iTesla Data Manager
Internet or LAN.mo files
.mat and .xml files
FMU compiled by another tool
FMU
Proof-of-Concept Implementation: The RaPId Mock-Up Software
• RaPId is our proof-of-concept implementation (prototype) of a software tool for model estimation and validation. The tool provides a framework for model identification/validation, mainly parameter identification.
• RaPId is based on Modelica and FMI; it is applicable to other systems, not only power systems!
• A Modelica model is fed through a Functional Mock-up Unit (i.e. FMU) to Simulink.
• The model is simulated and its outputs are compared against measurements.
• RaPId tunes the parameters of the model while minimizing a fitness criterion between the outputs of the simulation and the experimental measurements of the same outputs provided by the user.
• RaPId was developed in MATLAB.
– The MATLAB code acts as a wrapper to provide interaction with several other programs (which may not need to be coded in MATLAB).
• Advanced users can simply use MATLAB scripts instead of the graphical interface.
• Plug-in Architecture:
– A completely extensible and open architecture allows advanced users to add:
• Identification methods
• Optimization methods
• Specific objective functions
• Solvers (numerical integration routines)
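The plug-in idea can be illustrated with a minimal registry. The names, the decorator, and the layout below are hypothetical and do not mirror RaPId's actual MATLAB plug-in mechanism; they only show how a core tool can stay ignorant of concrete implementations.

```python
# Minimal plug-in registry sketch: users register optimization methods or
# objective functions by name, and the tool looks them up at run time.
PLUGINS = {"optimizer": {}, "objective": {}}

def register(kind, name):
    """Decorator that files a function under PLUGINS[kind][name]."""
    def wrap(fn):
        PLUGINS[kind][name] = fn
        return fn
    return wrap

@register("objective", "sse")
def sum_squared_error(measured, simulated):
    # One example objective: sum of squared errors between the traces.
    return sum((m - s) ** 2 for m, s in zip(measured, simulated))

# The core tool only knows the registry, not the concrete implementations.
objective = PLUGINS["objective"]["sse"]
```

Adding a new identification method or solver then means registering one more function, with no change to the core loop.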
Options and Settings
Algorithm Choice
Results and Plots
Simulink Container
Output measurement data
Input measurement data
What does RaPId do?
1. Output (and optionally input) measurements are provided to RaPId by the user.
2. At initialization, a set of parameters is pre-configured (or generated randomly by RaPId).
3. The model is simulated with the parameter values given by RaPId.
4. The outputs of the model are recorded and compared to the user-provided measurements.
5. A fitness function is computed to judge how close the measured data and simulated data are to each other.
2'. Using the results from (5), a new set of parameters is computed by RaPId, and the loop returns to step (3).
Simulations continue until a minimum fitness or a maximum number of iterations (simulation runs) is reached.
[Diagram: Simulink container with the Modelica FMU model; measured output y_meas and simulated output y_sim plotted over time t]
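A minimal end-to-end sketch of this loop, with a toy first-order model and an exhaustive parameter sweep standing in for RaPId's FMU simulation and optimization methods. All names and numerical choices here are illustrative, not RaPId's actual algorithms.

```python
def simulate_model(gain, inputs, dt=0.1):
    """Stand-in for the simulation step: run the model with given parameters."""
    x, out = 0.0, []
    for u in inputs:
        x += dt * (gain * u - x)     # toy first-order lag with a tunable gain
        out.append(x)
    return out

def fitness(y_meas, y_sim):
    """Fitness step: sum of squared errors between measured and simulated."""
    return sum((m - s) ** 2 for m, s in zip(y_meas, y_sim))

def identify(y_meas, inputs, candidates):
    """Loop: try parameter sets, keep the one with the lowest fitness."""
    best_p, best_f = None, float("inf")
    for p in candidates:             # "new set of parameters" each iteration
        f = fitness(y_meas, simulate_model(p, inputs))
        if f < best_f:
            best_p, best_f = p, f
    return best_p, best_f

# Synthetic "measurements" generated from a known gain, so the true
# parameter value is recoverable by the sweep.
u = [1.0] * 20
y_meas = simulate_model(2.0, u)
p_hat, f_hat = identify(y_meas, u, [k * 0.25 for k in range(21)])
```

RaPId replaces the grid sweep with proper optimization methods and the toy model with the Simulink-hosted FMU, but the measure/simulate/score/update cycle is the same.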
RaPId! Now Available as OSS!
• Download at: https://github.com/SmarTS-Lab/iTesla_RaPId
Get it while it’s hot!
Video Demo!
Validating the Excitation System Model of the Mostar Power Plant!
Stay for the workshop to work with this example!
More On-line Video Demos!
GUI example: https://www.youtube.com/watch?v=e7OkVEtcz6A
CLI example: https://www.youtube.com/watch?v=4qrPASIWdiY
Stay for the workshop to work with RaPId!
Modelica and FMI: Modern computer technologies opening new opportunities
• Validating power system models requires developing new methods and new tools:
– The tools for model validation can be built independently of any specific power system simulator, thanks to the development of the Modelica library, which allows the models to be run with different tools and using FMUs.
– Model validation tools developed in this approach will provide additional flexibility to couple, in a modular fashion, simulation, optimization and signal processing tools.
iTesla Dynamic Database
Power System Model Definition in
Modelica
Model Validation Tools
(RaPId)
Model validation results, including validation metrics and parameters, are sent back to the dynamic database, which updates these specifications.
Simulation
Signal Processing
Optimization
New Method Implemented to Deal with a Difficult Validation Problem
Conclusions and Looking Forward
• Modeling power system components with Modelica (as compared with domain-specific tools) is very attractive:
– Formal mathematical description of the model (equations)
– Allows model exchange between Modelica tools, with consistent (unambiguous) simulation results
• The FMI Standard makes it possible to take advantage of Modelica models for:
– Using Modelica models in different simulation environments
– Coupling general-purpose tools to the model/simulation (the case of RaPId)
• There are several challenges for modeling and validating “large-scale” power systems using Modelica-based tools:
– A well-populated library of typical components (and for different time scales)
– Support/linkage with the industry-specific data exchange paradigm (Common Information Model - CIM)
• Developing Modelica-driven model validation for large-scale power systems is a more complex challenge than the case of RaPId.
• However, the results obtained so far by Tractebel are encouraging, and are a good starting point to build CIM-compliant tools for model validation.
• We have released RaPId as Free and Open Source Software, and the iTesla Power Systems Modelica library will be released shortly.
Get it while it’s hot!
Questions?
[email protected]@statnett.no
Thank you!
25
RaPId: Now Available as OSS! https://github.com/SmarTS-Lab/iTesla_RaPId