We have a Sentinel-1 GPT processing workflow that applies the following steps to each raw image:
- Read
- Apply-Orbit-File
- ThermalNoiseRemoval
- Calibration
- Terrain-Correction
- Subset
- LinearToFromdB
- Write [to NetCDF4-BEAM]
GPT Graph: sentinel1_gpt_EU01.xml (5.1 KB)
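For context, each scene is processed by running the graph through SNAP's `gpt` command line. A minimal batch driver might look like the sketch below; note that the `-Pinput` parameter name and the output paths are assumptions on my part, since the real parameter names live inside sentinel1_gpt_EU01.xml:

```python
import subprocess
from pathlib import Path

GRAPH = "sentinel1_gpt_EU01.xml"

def build_gpt_cmd(graph: str, zip_path: Path, out_path: Path,
                  fmt: str = "NetCDF4-BEAM") -> list:
    """Assemble the SNAP gpt command for one Sentinel-1 scene.

    NOTE: '-Pinput' is a placeholder -- the real parameter name is
    whatever the Read node in the graph expects.
    """
    return [
        "gpt", graph,
        f"-Pinput={zip_path}",
        "-t", str(out_path),   # target file
        "-f", fmt,             # output format: NetCDF4-BEAM or NetCDF4-CF
    ]

def process_scene(zip_path: Path, out_dir: Path) -> Path:
    """Run the graph on one zipped scene and return the output path."""
    out_path = out_dir / (zip_path.stem + ".nc")
    subprocess.run(build_gpt_cmd(GRAPH, zip_path, out_path), check=True)
    return out_path
```

The `check=True` means a non-zero gpt exit code raises, but in our failing cases gpt exits cleanly and still writes a broken file, which is why the problem only surfaces downstream.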
This workflow almost always works and produces a valid NetCDF. However, in seemingly random cases it produces an invalid NetCDF that cannot be read by other NetCDF readers. In other software I get errors like this: “There was an error opening this dataset at 4775020970896772037 file length = 4775020970896772037 Unexpected end of file”
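To catch these broken outputs before downstream tools choke on them, a quick sanity check is possible: NetCDF4 files are HDF5 containers, so the first eight bytes must be the HDF5 signature, and a file that a NetCDF library refuses to open is invalid. A stdlib-only sketch (the signature check alone is weak, since a file truncated at the end still carries a valid signature, so actually opening the file is the stronger test):

```python
from pathlib import Path

# Every HDF5/NetCDF4 file starts with this 8-byte signature.
HDF5_MAGIC = b"\x89HDF\r\n\x1a\n"

def has_hdf5_signature(path: Path) -> bool:
    """First-pass check: does the file start with the HDF5 magic bytes?"""
    with open(path, "rb") as f:
        return f.read(8) == HDF5_MAGIC

def can_open_as_netcdf(path: Path) -> bool:
    """Stronger check: actually try to open the file (needs netCDF4)."""
    try:
        import netCDF4
    except ImportError:
        # netCDF4 not installed -- fall back to the weak signature check.
        return has_hdf5_signature(path)
    try:
        with netCDF4.Dataset(str(path)) as ds:
            return len(ds.variables) > 0
    except OSError:
        # Corrupt/truncated files raise OSError on open.
        return False
```

Running `can_open_as_netcdf` on each product right after gpt finishes would at least flag the bad scenes automatically, even if it doesn't explain them.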
An example of an image that fails is:
S1A_IW_GRDH_1SDV_20211220T231702_20211220T231730_041099_04E1F9_46EE.zip
When I load the raw Sentinel-1 image into SNAP Desktop, it shows data inside the subset area. I also manually rebuilt the GPT processing steps in SNAP Desktop, and it likewise produced an invalid NetCDF4-BEAM file for this image. Strangely, when I change the output format to NetCDF4-CF, it produces a valid file.
Questions: Why does this fail for some Sentinel-1 images? Will changing the export format to NetCDF4-CF produce a more robust workflow? I recall having issues with NetCDF4-CF a year or two ago, which is why we ended up using NetCDF4-BEAM in the first place.
Note: I also run workflows for Sentinel-2 (producing NetCDF4-BEAM files) and RCM (producing NetCDF4-CF files), and this only seems to be a problem for Sentinel-1 imagery.
Using SNAP 8.0.9.