ALGORITHMS
GFM flood products are based on an ensemble approach integrating three robust, cutting-edge algorithms developed independently by three leading research teams; a pixel is accepted as flooded when at least two of the algorithms classify it as water.
The motivation for choosing such a methodology is to substantially improve the robustness and accuracy of the derived Sentinel-1 flood and water extent maps and to build a high degree of redundancy into the production service.
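As a concrete illustration of the 2-of-3 voting rule, the following minimal Python sketch combines three co-registered binary water masks (1 = water, 0 = land), one per algorithm, into a consensus map. The function and array names are hypothetical and do not reflect the actual GFM production interfaces.

<syntaxhighlight lang="python">
import numpy as np

def consensus_flood_map(mask_a, mask_b, mask_c):
    """Accept a pixel as water when at least two of the three
    independently derived binary masks classify it as water."""
    votes = (mask_a.astype(np.uint8)
             + mask_b.astype(np.uint8)
             + mask_c.astype(np.uint8))
    return (votes >= 2).astype(np.uint8)

# Toy example: three 2x2 classifications of the same scene.
a = np.array([[1, 0], [1, 1]])
b = np.array([[1, 1], [0, 1]])
c = np.array([[0, 0], [1, 1]])
print(consensus_flood_map(a, b, c))  # -> [[1 0]
                                     #     [1 1]]
</syntaxhighlight>

Because the rule only requires agreement between any two masks, a misclassification by a single algorithm in a given scene does not propagate into the consensus product.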

The (internal) availability of three separate flood and water extent maps makes it possible to readily identify the shortcomings that any single algorithm may suffer from under specific conditions or in particular parts of the world, owing to well-known factors such as topography or environmental conditions. Within the quality assurance procedure, the ensemble approach therefore allows classification errors attributable to shortcomings of individual algorithms to be distinguished from errors inherent to the SAR sensing instruments and their difficulty in capturing the appearance or disappearance of surface water in particular situations. The user, on the other hand, receives only “consensus flood maps” where two or even all three cutting-edge algorithms agree, giving them the extra confidence needed to use the fully automatic Sentinel-1 flood data product instead of manually derived flood maps.


The three scientific algorithms rest on different data processing architectures, namely single-image, dual-image, and data cube processing. The latter is based on the “data cube” concept, whereby SAR images are geocoded, gridded and stored as analysis-ready data (ARD) in an existing spatio-temporal SAR data cube. By using a data cube, where the temporal and spatial dimensions are treated alike, each Sentinel-1 image can be compared with the entire backscatter history, allowing different sorts of change detection algorithms to be implemented in a rather straightforward manner. Importantly, the entire backscatter time series can be analysed for each pixel, so model training and calibration can be carried out systematically on a per-pixel basis. As laid out in the tender, the advantages of working with data cubes are: (a) algorithms are better able to handle land surface heterogeneity; (b) uncertainties can be better specified; (c) regions where open water cannot be detected for physical reasons (e.g. dense vegetation, urban areas, deserts) can be determined a priori; and (d) historic water extent maps can be derived, essentially as a by-product of the model calibration, which may serve as a reference for distinguishing between floods and the normal seasonal water extent.

Fulfilling the requirements of the tender, the consortium will build the Sentinel-1 flood monitoring service on top of, and in synergy with, an existing data cube processing infrastructure which already serves an operational service, namely the Sentinel-1 surface soil moisture data service that is part of the Copernicus Global Land Service (CGLS).
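To make the per-pixel time-series idea concrete, the sketch below assumes that the backscatter history of a small tile has been read from the data cube as a dense (time, y, x) array of dB values, and applies a simple illustrative change detection rule. The thresholds, function names, and the a-priori masking rule are assumptions made for this example only, not the consortium's actual algorithms.

<syntaxhighlight lang="python">
import numpy as np

def detect_water(new_db, history_db, drop_db=3.0):
    """Flag pixels whose backscatter (dB) falls well below the pixel-wise
    historical mean; calm open water reflects the radar signal away from
    the sensor and therefore appears dark in SAR imagery."""
    hist_mean = history_db.mean(axis=0)  # per-pixel mean over time
    hist_std = history_db.std(axis=0)    # per-pixel variability
    # Illustrative rule: require a drop exceeding both a fixed dB margin
    # and the pixel's own historical variability.
    threshold = hist_mean - np.maximum(drop_db, 2.0 * hist_std)
    return (new_db < threshold).astype(np.uint8)

def no_sensitivity_mask(history_db, floor_db=-18.0):
    """A-priori mask of pixels where a water-induced drop cannot be seen
    because the historical backscatter is already near the water level
    (e.g. deserts, radar shadow); cf. advantage (c) above."""
    return (history_db.mean(axis=0) < floor_db).astype(np.uint8)

# Toy usage: 100 historical acquisitions over a 2x2 pixel grid.
rng = np.random.default_rng(0)
history = rng.normal(-8.0, 0.5, size=(100, 2, 2))  # synthetic dB history
new_image = history.mean(axis=0).copy()
new_image[0, 0] = -16.0                            # one pixel turns dark
print(detect_water(new_image, history))            # flags pixel (0, 0)
</syntaxhighlight>

Because the statistics are computed per pixel from the full backscatter history, a rule of this kind adapts automatically to heterogeneous land cover (advantage (a)) and yields a per-pixel measure of uncertainty via the historical variability (advantage (b)).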