ALGORITHMS

The GFM flood products are based on an ensemble approach that integrates three robust, cutting-edge algorithms developed independently by three leading research teams.
The motivation for choosing such a methodology is to substantially improve the accuracy of the derived Sentinel-1 flood and water extent maps and to build a high degree of redundancy into the production service.

The (internal) availability of three separate flood and water extent maps makes it possible to readily identify the shortcomings that any single algorithm may suffer from in specific conditions or parts of the world, owing to well-known factors such as topography or environmental conditions.

For these very reasons, users have access to consensus flood maps in which two or even all three algorithms agree, giving them the extra confidence needed to use the fully automatic Sentinel-1 flood data product instead of manually derived flood maps. Accordingly, the implemented quality assurance procedures (see INSERT REFERENCE) make it possible to differentiate between classification errors that can be attributed to shortcomings of individual algorithms and errors that are inherent to the SAR sensing instruments and their difficulty in capturing the appearance or disappearance of surface water in particular situations.
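
As a minimal illustration of the consensus idea, the sketch below applies a simple majority vote to three co-registered binary flood masks. The masks and the two-out-of-three rule are assumptions for illustration only, not the service's actual aggregation logic.

```python
import numpy as np

# Hypothetical binary flood masks (1 = flood, 0 = no flood) produced by the
# three independent algorithms on the same grid (values are illustrative).
mask_a = np.array([[1, 0, 1], [0, 1, 0]])
mask_b = np.array([[1, 1, 1], [0, 0, 0]])
mask_c = np.array([[1, 0, 0], [0, 1, 0]])

# Number of algorithms flagging each pixel as flooded.
votes = mask_a + mask_b + mask_c

# Consensus flood map: a pixel is retained when at least two algorithms agree.
consensus_flood = votes >= 2

# Pixels where all three algorithms agree carry the highest confidence.
full_agreement = votes == 3

print(consensus_flood)
print(full_agreement)
```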



The three scientific algorithms rely on different data processing architectures, namely single-image, dual-image, and data cube processing architectures. The latter is based on the "data cube" concept, whereby SAR images are geocoded, gridded and stored as analysis ready data (ARD) in an existing spatio-temporal SAR data cube. In a data cube, where the temporal and spatial dimensions are treated alike, each Sentinel-1 image can be compared with the entire backscatter history, so that different sorts of change detection algorithms can be implemented in a rather straightforward manner. Importantly, the entire backscatter time series can be analysed for each pixel, and model training and calibration may therefore be carried out systematically on a per-pixel basis.

As laid out in the tender, the advantages of working with data cubes are: (a) algorithms are better able to handle land surface heterogeneity; (b) uncertainties can be better specified; (c) regions where open water cannot be detected for physical reasons (e.g. dense vegetation, urban areas, deserts) can be determined a priori; and (d) historic water extent maps can be derived, essentially as a by-product of the model calibration, which may serve as a reference for distinguishing between floods and the normal seasonal water extent.

Fulfilling the requirements of the tender, the consortium will build the Sentinel-1 flood monitoring service on top of, and in synergy with, an existing data cube processing infrastructure which already serves an operational service, namely the Sentinel-1 surface soil moisture data service that is part of the Copernicus Global Land Service (CGLS).
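
The sketch below illustrates the kind of per-pixel change detection that a data cube enables: per-pixel statistics are derived from the full backscatter history and a new scene is flagged where backscatter drops well below that history. The NumPy array standing in for the cube, the simulated flooded patch, and the threshold of k standard deviations are assumptions for illustration, not the operational GFM algorithms.

```python
import numpy as np

# Hypothetical backscatter time series (in dB) as a stand-in for a SAR data
# cube: dimensions are (time, y, x), so the full history is available per pixel.
rng = np.random.default_rng(0)
cube = rng.normal(loc=-10.0, scale=1.0, size=(100, 4, 4))

# Per-pixel statistics of the backscatter history, computed along the time axis.
hist_mean = cube.mean(axis=0)
hist_std = cube.std(axis=0)

# A newly acquired scene on the same grid; open water typically lowers
# backscatter, so we simulate a strong drop over a small flooded patch.
new_scene = hist_mean.copy()
new_scene[1:3, 1:3] -= 8.0

# Simple change detection: flag pixels whose backscatter falls more than
# k standard deviations below their own historical mean.
k = 3.0
flood_candidates = new_scene < (hist_mean - k * hist_std)

print(flood_candidates)
```

Because the statistics are computed independently for every pixel, the same comparison naturally adapts to local surface conditions, which is the main point of calibrating against the full backscatter history.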