Well, I need to reduce the file size, so I usually work with up to 250 m resolution. It is definitely coarser, but the spatial structure I want to see is still resolved. Consider also that if you want to plot a JPEG or something, even if you have 10 m resolution, the dpi you define when saving the plot will most likely not be able to resolve the nominal spatial resolution.
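To make that last point concrete, here is a back-of-the-envelope check; all numbers below are illustrative assumptions, not values from this thread:

```python
# Back-of-the-envelope check of how much ground resolution a saved
# figure can actually show. All numbers are illustrative assumptions.

def plotted_resolution_m(scene_width_m, fig_width_inch, dpi):
    """Ground metres represented by one pixel of the saved image."""
    pixels_across = fig_width_inch * dpi
    return scene_width_m / pixels_across

# A 100 km wide scene in a 4-inch-wide panel saved at 300 dpi:
print(round(plotted_resolution_m(100000.0, 4, 300), 1))  # -> 83.3
# i.e. ~83 m per plotted pixel, far coarser than a 10 m nominal resolution.
```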
Ah I see, understandable. Unfortunately I don't know then; I don't use incidenceAngleForSigma0 in my pipeline, and I'm not sure whether that alone could affect it this badly.
When I try to make a mosaic, I get an error:
mosaic_pro = GPF.createProduct('SAR-Mosaic', mosaicking, test_mosaic)
RuntimeError: org.esa.snap.core.gpf.OperatorException: Product 'S1A_IW_GRDH_1SDV_20170106T2236_mosaic_' has no geo-coding.
Does anyone know how to fix this error? This is my code:
mosaicking = HashMap()
mosaicking.put('pixelSize', 10.0)
mosaicking.put('resamplingMethod', 'BILINEAR_INTERPOLATION')
mosaicking.put('sourceBands', 'Sigma0_VV')
mosaic_output = mosaic_path + substring + "_mosaic_"
mosaic_pro = GPF.createProduct('SAR-Mosaic', mosaicking, test_mosaic)
ProductIO.writeProduct(mosaic_pro, mosaic_output, 'BEAM-DIMAP')
mosaikings.append(mosaic_pro)
mosaic_pro.dispose()
The problem is the test_mosaic. How do you create it?
The error message says that it does not have a geo-coding. Probably it is not well defined.
It is not a normal S1 product; the _mosaic_ name extension identifies this. So probably the problem is one step before the mosaicking.
I read the products after Terrain Correction. This is the full code I made:
mosaic_files = "D:\\SENTINEL\\Results\\6.TerrainCorrection\\*.dim"
files = glob.glob(mosaic_files)
mosaikings = []
test_mosaic = []
for mosaic in files:
    mosaic_path = "D:\\SENTINEL\\Results\\7.Mosaic\\"
    mosaic_data = ProductIO.readProduct(mosaic)
    mosaikings.append(mosaic_data)
    names = mosaic_data.getName()
    substring = names[:-48]
    if substring in mosaikings:
        print substring
    else:
        test_mosaic.append(substring)
    for test_mosaic in mosaikings:
        mosaicking = HashMap()
        mosaicking.put('pixelSize', 10.0)
        mosaicking.put('resamplingMethod', 'BILINEAR_INTERPOLATION')
        mosaicking.put('sourceBands', 'Sigma0_VV')
        mosaic_output = mosaic_path + substring + "_mosaic_"
        mosaic_pro = GPF.createProduct('SAR-Mosaic', mosaicking, test_mosaic)
        ProductIO.writeProduct(mosaic_pro, mosaic_output, 'BEAM-DIMAP')
        mosaikings.append(mosaic_pro)
        mosaic_pro.dispose()
print '7.FINISH MOSAICKING DATA...........................DONE'
It really seems that one of the products in "D:\SENTINEL\Results\6.TerrainCorrection\*.dim" has no geo-coding.
Maybe something went wrong during the terrain correction. Have you opened S1A_IW_GRDH_1SDV_20170106T2236_mosaic_ in SNAP Desktop and checked if it is geo-coded? You can also try to exclude it and see what happens.
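You could also do that check from the script rather than in SNAP Desktop, by filtering the inputs before calling the SAR-Mosaic operator. A minimal sketch (the filtering logic is plain Python; with snappy, the list elements would be products returned by ProductIO.readProduct, and getSceneGeoCoding() returns None when the geo-coding is absent, which is exactly what the OperatorException complains about; older SNAP versions call it getGeoCoding()):

```python
# Sketch: split the inputs into usable products and rejected ones before
# calling SAR-Mosaic. With snappy, 'products' would hold objects returned
# by ProductIO.readProduct(path); getSceneGeoCoding() returns None for a
# product that has no geo-coding.

def split_by_geocoding(products):
    """Return (usable, rejected_names): products with/without a geo-coding."""
    usable, rejected_names = [], []
    for p in products:
        if p.getSceneGeoCoding() is None:
            rejected_names.append(p.getName())
        else:
            usable.append(p)
    return usable, rejected_names
```

Printing rejected_names then tells you immediately which input to look at in SNAP Desktop.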
Hi everyone, it's been some time since my last efforts. I am trying again to build a SAR mosaic. Four Sentinel-1 files are preprocessed with thermal and border noise removal, radiometric calibration, speckle filtering, and terrain correction, and saved as GeoTIFFs with a spatial resolution of 150 m, using snappy. Their file size at this stage is around 100 MB each; gdalinfo shows that float32 is being used. Those files will be used as input for the mosaic. So far so good.
I have the following situation:
When using SNAP Desktop and saving it as GeoTIFF, the mosaic looks fine, with a file size of 1.7 GB (BTW, isn't that a bit large?).
Now I am trying to use snappy. First attempt with GeoTIFF:
java.lang.IllegalStateException: File size too big. TIFF file size is limited to  bytes!
That is odd; why, with snappy, does the size seem to be more than 4 GB?
Next attempt with GeoTIFF-BigTIFF:
After 3h the file size is more than 8G and it seems that the script is hanging…
Attempt to save it as NetCDF-CF produces the following error:
RuntimeError: java.lang.IllegalArgumentException: Variable size in bytes 8545341456 may not exceed 4294967292
Does the ProductIO.writeProduct method somehow implicitly assume float64 or something? If this is the problem, is there a way to set a different data type? And is there any workaround to save it in GeoTIFF format using snappy?
only BEAM-DIMAP Format
Maybe your input products use compression? 1.7 GB for a mosaic does not seem too big to me.
But the difference between SNAP Desktop and snappy is strange. I think somehow the configuration must be different.
As far as I can see the operator uses the data type of the source bands for the target bands. This should not make the difference.
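One way to verify this from the script is to list the band data types and compare the implied uncompressed size with the numbers in the error messages. A sketch (the arithmetic is plain Python; with snappy you would read the dimensions from product.getSceneRasterWidth()/getSceneRasterHeight() and the type string via ProductData.getTypeString(band.getDataType())):

```python
# Sketch: estimate the uncompressed target size to see whether float32 or
# float64 explains the numbers in the error messages. The byte sizes per
# type are standard; the 46224 x 46224 example below is an illustrative
# guess, not a dimension taken from this thread.

BYTES_PER_TYPE = {'int16': 2, 'int32': 4, 'float32': 4, 'float64': 8}

def uncompressed_bytes(width, height, band_types):
    """Total raw bytes for all bands of a width x height product."""
    return sum(width * height * BYTES_PER_TYPE[t] for t in band_types)

# The NetCDF error reported 8545341456 bytes; one float32 band of roughly
# 46000 x 46000 pixels already lands in that range, so the size alone does
# not prove a silent promotion to float64.
print(uncompressed_bytes(46224, 46224, ['float32']))
```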
The reason GeoTIFF-BigTIFF hangs after a while might be your memory settings.
Please have a look at the following thread.
You can use NetCDF4-CF as output format. This uses compression by default and does not have the 4294967292 limit.
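If you want the script to fall back automatically, you could pick the writer format from the estimated size. A sketch; the format names are the SNAP writer names already mentioned in this thread, and the threshold is the limit quoted in the NetCDF-CF error above:

```python
# Sketch: choose a SNAP writer format that can hold the product.
# 'NetCDF4-CF' (compressed, without the per-variable byte limit) for big
# products, plain 'GeoTIFF' otherwise. The threshold mirrors the limit
# from the NetCDF-CF error message.

NETCDF_CF_LIMIT = 4294967292

def pick_format(uncompressed_size_bytes):
    if uncompressed_size_bytes > NETCDF_CF_LIMIT:
        return 'NetCDF4-CF'
    return 'GeoTIFF'

# usage with snappy (not executed here):
#   ProductIO.writeProduct(mosaic_pro, out_path, pick_format(size))
print(pick_format(8545341456))  # -> NetCDF4-CF (the size from the error)
```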
For GeoTIFF-BigTIFF you can configure the compression:
You can add those properties to the etc/snap.properties file in the installation directory.
Regarding the memory settings: since the beginning my snappy.ini has been configured with
java_max_mem = 8G
I am not facing any memory issues so far, and I have tested that with several SAR features.
Besides that, I do not get any error report, and when monitoring my memory I do not see swapping or anything like that.
My input products are ESA's SAFE folders directly. I have no idea if they use any kind of compression…
A potential workaround would then be to save the mosaic in BEAM-DIMAP format and then call pconvert with a system command to save it as GeoTIFF?
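A sketch of that workaround; the '-f tif' and '-o' flags, and the assumption that pconvert is on the PATH, are mine, so check pconvert's own help output in your SNAP bin directory for the options your version actually supports:

```python
# Sketch of the suggested workaround: write BEAM-DIMAP from snappy, then
# shell out to SNAP's pconvert tool to convert the .dim to GeoTIFF.
# Flag names and the pconvert location are assumptions -- verify them
# against your SNAP installation before relying on this.
import subprocess

def pconvert_cmd(dim_path, out_dir, pconvert='pconvert'):
    """Build the pconvert command line (kept separate so it is testable)."""
    return [pconvert, '-f', 'tif', '-o', out_dir, dim_path]

def dimap_to_geotiff(dim_path, out_dir):
    """Run the conversion; returns pconvert's exit code."""
    return subprocess.call(pconvert_cmd(dim_path, out_dir))
```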
If it works, it sounds like a feasible workaround.
But actually it should be possible to write directly to Big-GeoTiff.
But I have no clue what goes wrong here.
Did you find a solution? I am facing the same problem. I am using GeoTIFF-BigTIFF as the output format with snappy, and the file that is created has 0 bytes while the script hangs forever. I changed the compression in etc/snap.properties:
but still not working. Any tips?
Are you using the command line or do you use the GUI?
What are your memory settings?
I wrote a .py script importing snappy.
Do you need more info?
And have you changed the tile cache size?
I left it as it was:
snap.jai.tileCacheSize = 1024
Should I change it?
Yes, try with 7000 or 8000 if you have 11 GB of memory available.
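For reference, the two settings together would then look like this (a sketch; the 8000 value assumes roughly 11 GB of RAM, as discussed above):

```ini
# in snappy.ini
java_max_mem = 8G

# in etc/snap.properties
snap.jai.tileCacheSize = 8000
```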
I changed it, but I still have the same problem. The weird thing is that I am testing two different polygons for the creation of the subset.
When I use the small area, the script is running even if I use:
OutputType = [".tif", "GeoTIFF-BigTIFF"]
ProductIO.writeProduct(pca, newFile, OutputType[1])
So GeoTIFF-BigTIFF is working properly as a parameter for my output product.
When I use the larger area, the script hangs and the product has 0 bytes.
I can’t understand why this is happening. Could it be a memory setting?
How big is the difference of your polygons in terms of pixels?
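To quantify that difference: with snappy, product.getSceneRasterWidth() * product.getSceneRasterHeight() gives the pixel count of the subset product directly, or you can estimate it from the polygon extent. A sketch with placeholder numbers (the extents below are illustrative assumptions, not values from this thread):

```python
# Rough pixel-count estimate for a rectangular subset, to compare the
# two polygons. Assumes a metric projection; the extents and pixel size
# are placeholder numbers.

def subset_pixels(width_m, height_m, pixel_size_m):
    return int(width_m / pixel_size_m) * int(height_m / pixel_size_m)

small = subset_pixels(20000, 20000, 10.0)    # 20 km x 20 km at 10 m
large = subset_pixels(250000, 250000, 10.0)  # 250 km x 250 km at 10 m
print(small, large)  # -> 4000000 625000000
```

A two-orders-of-magnitude jump in pixel count like this would easily explain why one polygon writes fine and the other exhausts memory.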
Do you really need GeoTiff format?
You could try with another format.
Are you only using subset or do you have other processing steps before?