BIOMASS End-to-End Mission Performance Simulator

Slide 1: BIOMASS End-to-End Mission Performance Simulator
Paco López-Dekker, Francesco De Zan, Thomas Börner, Marwan Younis, Kostas Papathanassiou (DLR); Tomás Guardabrazo (DEIMOS); Valerie Bourlon, Sophie Ramongassie, Nicolas Taveneau (TAS-F); Lars Ulander, Daniel Murdin (FOI); Neil Rogers, Shaun Quegan (U. Sheffield) and Raffaella Franco (ESA)
Microwaves and Radar Institute, German Aerospace Center (DLR)

Slide 2: Project Context and Objectives
- BEES: BIOMASS End-to-End (mission performance) Simulator
- ESA-funded project in the context of the BIOMASS EE-7 Phase-A study
- Provides a tool to evaluate the expected end-to-end performance of the mission:
  - realistic, distributed scenes
  - system residual errors (noise, ambiguities, instrument stability, channel imbalances)
  - ionospheric disturbances (Faraday rotation and scintillation)
  - processing: L0, L1, L1b, ionospheric error correction and L2 retrieval
- The focus is on including all the main effects and disturbances; BEES is not a detailed instrument simulator

Slides 3-4: (figures only, no text on these slides)

Slide 5: BEES Modules
Engineering modules:
- Geometry Module (GM): provides a common geometry to all modules [DEIMOS]
- Observing System Simulator (OSS-A and OSS-B) [A: DLR; B: Thales Alenia Space]
- Product Generation Modules [DLR]: PGM-L1a and PGM-L1b
Scientific modules:
- Scene Generation Module (SGM) [DLR + U. Chalmers]
- Ionospheric modules [U. of Sheffield]: Ionospheric Generation Module (IGM) and Ionospheric Correction Module (ICM)
- Level-2 Retrieval Module (L2RM) [FOI]
- Performance Evaluation Modules [DLR]: PEM-L1b and PEM-L2

Slide 6: BEES Block Diagram
- OpenSF simulation control: OpenSF drives the E2ES, including the user interface, the execution of Monte Carlo runs, etc.

Slide 7: BEES diagram: OSS
- Three sub-modules: a dummy Radar Parameter Generator (RPG), the System Errors and Sensitivity Module (SES) and the Impulse Response Function (IRF) module
- IRF strategy: the IRF models the SAR system plus the processing, which avoids the generation of raw data (a minimal sketch of this approach follows Slide 8)
- SES strategy: model the residual errors
- Two OSS versions, corresponding to the two industry Phase-A studies

Slide 8: SGM: Scene Definition
- Inputs: 1. forest type (out of a predefined list); 2. mean biomass level (at the hectare level). From these, a spatial distribution of single trees is generated, each with an individual (top) height / biomass tag. (figure: example tree distributions from 0 t/ha to 500 t/ha, each with its Clark-Evans index)
- Output on 100 x 100 m cells: 1. biomass (t/ha); 2. tree height (h100), passed to the forward model
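A minimal sketch of the IRF strategy on Slide 7: model the SAR system plus the processing as an impulse response and convolve it with the complex scene, instead of generating and focusing raw data. The separable sinc-shaped response, the function names (sar_irf, focus_via_irf) and the resolution/spacing numbers are illustrative assumptions, not the actual OSS implementation.

import numpy as np

def sar_irf(n, res, spacing):
    # 1-D impulse response of the focused system: a sinc whose
    # main-lobe width is set by the (illustrative) resolution "res".
    x = (np.arange(n) - n // 2) * spacing
    return np.sinc(x / res)

def focus_via_irf(reflectivity, rg_res, az_res, rg_spacing, az_spacing):
    # Model "SAR system + processing" directly: convolve the complex
    # scene reflectivity with a separable 2-D IRF (circularly, via FFT),
    # so no raw-data generation or focusing step is needed.
    irf_az = sar_irf(reflectivity.shape[0], az_res, az_spacing)
    irf_rg = sar_irf(reflectivity.shape[1], rg_res, rg_spacing)
    irf_2d = np.outer(irf_az, irf_rg)
    return np.fft.ifft2(np.fft.fft2(reflectivity) *
                        np.fft.fft2(np.fft.ifftshift(irf_2d)))

# Example: speckle-like complex reflectivity on a 512 x 512 grid
rng = np.random.default_rng(0)
scene = (rng.standard_normal((512, 512)) +
         1j * rng.standard_normal((512, 512))) / np.sqrt(2)
slc = focus_via_irf(scene, rg_res=25.0, az_res=8.0,
                    rg_spacing=12.5, az_spacing=4.0)

Residual system errors can then be layered on top of this ideal response (noise, ambiguities, drifts), as in the PGM steps of Slide 13, rather than by re-simulating the acquisition chain.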
Slide 9: SGM output (ground truth)
(figure panels: biomass and tree height, H100)

Slide 10: Input to PGM: PolInSAR covariance matrices
(figure panels: HH, HV and VV)

Slide 11: Input to PGM: PolInSAR covariance matrices
(figure panels: interferometric pairs HH1-HH2, HV1-HV2 and VV1-VV2)

Slide 12: BEES Block Diagram: PGM
(figure only)

Slide 13: Review of the PGM algorithm
Macro steps (each step takes its inputs from the SGM, OSS, GM, IM or ICM, as indicated in the block diagram):
1. Generation of the interferometric/polarimetric channels for the scatterer (correlated) and for the noise (uncorrelated)
2. Spectral shift modulation (geometric decorrelation, part I)
3. 2-D convolution
4. Addition of the ionospheric phase screen (scintillations) and of the Faraday rotation (an illustrative Faraday-rotation sketch follows Slide 14)
5. Spectral shift demodulation (geometric decorrelation, part II)
6. Ambiguity stacking
7. Additional system disturbances (cross-talk, phase and gain drifts)
8. L1b product generation (multilooking)
9. L1a product generation

Slide 14: Multichannel signal simulation
- N independent complex channels are converted into N correlated complex channels by a channel linear combination; spatial convolutions then impose the desired spectral properties on each complex channel (a minimal sketch of the channel mixing step follows this slide)
- (figure: tree height, HH-HH coherence and HH SLC)
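A minimal numpy sketch of the channel linear combination on Slide 14: independent circular-Gaussian channels are mixed with the Cholesky factor of a target coherence matrix, so that the output channels are correlated as prescribed. The 0.7 coherence, the 1000 x 1000 scene size and the function name correlated_channels are arbitrary illustration choices, and the spatial-convolution step that shapes the spectra is omitted here.

import numpy as np

def correlated_channels(coherence, shape, rng):
    # Mix N independent complex circular-Gaussian channels with the
    # Cholesky factor of the target coherence (normalized covariance)
    # matrix, so the output channels have exactly that coherence.
    n_ch = coherence.shape[0]
    indep = (rng.standard_normal((n_ch,) + shape) +
             1j * rng.standard_normal((n_ch,) + shape)) / np.sqrt(2)
    L = np.linalg.cholesky(coherence)
    return np.tensordot(L, indep, axes=1)

rng = np.random.default_rng(1)
# Illustrative two-acquisition case with a 0.7 interferometric coherence
coh = np.array([[1.0, 0.7],
                [0.7, 1.0]], dtype=complex)
ch = correlated_channels(coh, (1000, 1000), rng)
num = np.mean(ch[0] * np.conj(ch[1]))
den = np.sqrt(np.mean(np.abs(ch[0])**2) * np.mean(np.abs(ch[1])**2))
print(abs(num / den))  # sample coherence, close to 0.7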
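Step 4 of the PGM algorithm (Slide 13) adds Faraday rotation to the polarimetric channels. As an illustration only, the sketch below applies the usual one-way Faraday rotation model M = R(Omega) S R(Omega) to a 2 x 2 scattering matrix; the 10-degree angle and the function name apply_faraday are assumptions, not the PGM implementation.

import numpy as np

def apply_faraday(S, omega):
    # Common distortion model for a one-way Faraday rotation angle
    # "omega" (radians): M = R(omega) S R(omega), with S the 2x2
    # scattering matrix in the (H, V) basis.
    R = np.array([[np.cos(omega),  np.sin(omega)],
                  [-np.sin(omega), np.cos(omega)]])
    return R @ S @ R

# Example: a purely HH scatterer seen through 10 degrees of Faraday rotation
S = np.array([[1.0 + 0.0j, 0.0],
              [0.0,        0.0]])
M = apply_faraday(S, np.deg2rad(10.0))
print(M)  # part of the energy leaks into the cross-polarized channels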
Slide 15: Introduction of the ionospheric distortion
- (figure: acquisition geometry with the ionosphere modelled as a layer) It is the aperture angle, not the aperture length, that really matters: the same part of the ionospheric layer modifies one part of the raw data for Target 1 but a different part for Target 2.
- The ionospheric distortion therefore cannot be applied directly to the raw data, because the raw-data distortion is target dependent.
- For an orbit at the ionosphere height the distortions can be applied directly to the raw data, which is why the distortion is introduced on an equivalent aperture along a lower (virtual) orbit at that height.

Slide 16: BEES Block Diagram (ionosphere)
- The simulation of the ionosphere is divided into two steps: first the spectral coefficients describing the state of the ionosphere are generated, and then random realizations are generated for a given spectrum.
- A dedicated block (the ICM) applies the ionospheric correction (Faraday rotation and shifts only).

Slide 17: Level-2 Retrieval
(figure only)

Slide 18: L2 retrieved heights (H100)
- (figure panels: SGM ground truth vs. L2 retrieval)
- A range-dependent H100 bias is visible: software bug or realistic feature?

Slide 19: L2 retrieved biomass
- (figure panels: SGM ground truth vs. L2 retrieval)

Slide 20: Performance Evaluation (L1b)
- L1b performance is expressed in terms of element-wise covariance-matrix errors: bias and standard deviation.
- In the example: significant coherence loss due to the spectral shift.

Slides 21-22: Performance Evaluation (L2)
- L2 performance is expressed in terms of biomass and tree-height errors: bias and standard deviation, with error statistics versus range and biomass level.
- In the example: does the height error lead to the biomass error?

Slide 23: Monte Carlo (multiple runs of BEES)
- Monte Carlo simulations are implemented by OpenSF: BEES is run repeatedly, perturbing (if necessary) some input parameters.
- Perturbation approach: the random realizations are implemented by the modules themselves (OpenSF can provide a varying seed for independent realizations). This gives control of the randomization to the module developers, in order to ensure physical correctness.
- Most of this randomness is introduced by the IGM and the PGM.

Slide 24: (figure only)

Slide 25: Validation: challenges and strategy
- BEES is a complex software tool comprising modules developed by different teams in heterogeneous environments. How do we know that the outputs are correct? We are developing the tool precisely because we do not know (exactly) what we will get.
- We are simulating a random process: speckle, random noise, random hardware disturbances and random realizations of the ionosphere.
- Validating the software therefore requires approaches that resemble the post-launch validation/calibration of a real system: homogeneous scenes and point targets.
- The validation needs to check whether the resulting statistics for some canonical cases agree with theory.

Slide 26: Example: NESZ validation
- The NESZ is range dependent, and the threshold is designed for a given probability of test failure (a sketch of such a test is given at the end of this document).
- (figure: estimated NESZ versus range around the nominal NESZ value, with test-failure and test-success regions)

Slide 27: Example: PGM L1b verification, probabilistic threshold
- Due to the random nature of speckle, the estimated covariance matrices will not be identical to the true ones, even when all error sources are turned off.
- We can, however, evaluate the likelihood of a certain output given the input in probabilistic terms (e.g. using confidence intervals).
- The test is done on the complex coherences, i.e. the normalized elements of the sample covariance matrix: using a probability threshold P_th, the deviation of the estimated coherence from the input one can be bounded by a value t, where the threshold is a function of the desired error t, the input coherence gamma and the number of looks L (a simulation sketch of this check is also given at the end of this document).

Slide 28: PGM L1b verification: caveat!
- The assumption that the estimate is unbiased does not hold for high coherences and a low number of looks, so for a given coherence one has to make sure that enough looks are taken into account.
- (figure: coherence histograms from 10^5 simulations, for gamma = 0.5 and for gamma = 0.95 with L = 30)
- To validate the simulator we need (to simulate) large, homogeneous scenes. Sound familiar?

Slide 29: Project Status / Outlook
- Software almost completed: the full handling of ambiguities is still missing and some ionospheric features/possibilities are pending.
- Validation and debugging are ongoing: distinguishing between bugs and features is not easy!
- Mission performance assessment: once BEES is validated, it will be used to assess the mission performance for both Phase-A designs; hundreds of test cases, each requiring N Monte Carlo repetitions, add up to weeks of simulation time.
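One possible shape for the NESZ validation test on Slide 26, sketched under assumptions: the noise power estimated from a simulated noise-only block is compared against a two-sided acceptance band around the nominal NESZ, with the band derived from the chi-squared statistics of the averaged power for a chosen failure probability. The -27 dB nominal value, the sample count, the 1e-3 failure probability and the function name nesz_band are illustrative, not the actual PEM-L1b settings.

import numpy as np
from scipy.stats import chi2

def nesz_band(nesz_nominal_db, n_samples, p_fail):
    # Two-sided acceptance band for the estimated noise power when
    # n_samples independent complex noise samples are averaged:
    # P_hat ~ P * chi2(2 n_samples) / (2 n_samples).
    p_lin = 10.0 ** (nesz_nominal_db / 10.0)
    lo = p_lin * chi2.ppf(p_fail / 2, 2 * n_samples) / (2 * n_samples)
    hi = p_lin * chi2.ppf(1 - p_fail / 2, 2 * n_samples) / (2 * n_samples)
    return 10.0 * np.log10(lo), 10.0 * np.log10(hi)

rng = np.random.default_rng(3)
n_az, nesz_db = 2000, -27.0                 # illustrative numbers only
sigma_n = np.sqrt(10.0 ** (nesz_db / 10.0) / 2.0)
noise = sigma_n * (rng.standard_normal(n_az) + 1j * rng.standard_normal(n_az))
nesz_est_db = 10.0 * np.log10(np.mean(np.abs(noise) ** 2))
lo_db, hi_db = nesz_band(nesz_db, n_az, p_fail=1e-3)
print(nesz_est_db, (lo_db, hi_db), lo_db <= nesz_est_db <= hi_db)  # test success/failure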
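And a simulation sketch for the probabilistic coherence check of Slides 27-28: the bias and a deviation bound of the L-look sample coherence magnitude are estimated by Monte Carlo for a few (coherence, looks) pairs. The pairs, the 50 000 trials and the 1e-3 failure probability are arbitrary choices, used only to illustrate the bias caveat and how a threshold t(gamma, L, P_th) can be tabulated.

import numpy as np

def sample_coherence_mag(gamma, looks, n_trials, rng):
    # L-look sample coherence magnitude of two correlated
    # circular-Gaussian channels with true coherence "gamma".
    shape = (n_trials, looks)
    z1 = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    n = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    z2 = gamma * z1 + np.sqrt(1.0 - gamma**2) * n
    num = np.sum(z1 * np.conj(z2), axis=1)
    den = np.sqrt(np.sum(np.abs(z1)**2, axis=1) * np.sum(np.abs(z2)**2, axis=1))
    return np.abs(num / den)

rng = np.random.default_rng(2)
for gamma, looks in [(0.5, 30), (0.95, 30), (0.95, 4)]:
    g_hat = sample_coherence_mag(gamma, looks, 50_000, rng)
    bias = g_hat.mean() - gamma                       # estimator bias
    t = np.quantile(np.abs(g_hat - gamma), 1 - 1e-3)  # deviation bound for P_th = 1e-3
    print(f"gamma={gamma}, L={looks}: bias={bias:+.3f}, t={t:.3f}")

The (0.95, 4) case shows the positive bias that motivates the caveat on Slide 28: for high coherence and few looks the unbiasedness assumption breaks down, so the validation scenes must be large and homogeneous enough to provide sufficient looks.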