Subset Sentinel-1 products

Hello colleagues,
I am currently processing my data for image fusion of Sentinel-1 and Sentinel-2, following the tutorial provided by Mr. ABraun (S1TBX Synergetic use of S1 (SAR) and S2 (optical) data Tutorial).
The problem is that I cannot get the two images to match, even though both are in the same UTM coordinate system.
In addition, the geo-coordinates of the four corners used for the subsetting process are the same as well.
My subset extent is square, but after terrain correction (TC) the S-1 product becomes rectangular and rotated to one side.

Any clarification is welcome.

Radar data is rotated during Range-Doppler terrain correction, so what seems to be the actual extent before the correction can differ slightly afterwards. It is therefore good to make the subset larger than expected, because the extent of the S2 image cannot be directly applied to the non-geocoded S1 data.
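For illustration, a generous pre-TC subset in snappy could be sketched roughly like this (product name, output name and WKT polygon are placeholders, not values from this thread):

```python
# Minimal sketch with snappy (SNAP's Python API); all paths and the WKT
# polygon are placeholders chosen for illustration.
from snappy import ProductIO, GPF, HashMap

s1 = ProductIO.readProduct('S1A_IW_GRDH_example.zip')  # hypothetical S1 GRD product

# Use a subset region noticeably larger than the S2 footprint, because the S1
# scene is still in radar geometry and will be rotated by Range-Doppler
# Terrain Correction.
params = HashMap()
params.put('geoRegion', 'POLYGON((30.0 10.0, 30.6 10.0, 30.6 10.6, 30.0 10.6, 30.0 10.0))')
params.put('copyMetadata', 'true')

s1_subset = GPF.createProduct('Subset', params, s1)
ProductIO.writeProduct(s1_subset, 'S1_subset_pre_TC', 'BEAM-DIMAP')
```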


Hmmm, OK.
I would have subset the S1 product after terrain correction, but it is difficult to run the workflow from your tutorial (Calibration > Speckle Filtering > TC) because it fails in the speckle filtering step with a "GC overhead limit exceeded" error…
What do you think about subsetting S1 both before and after TC, so that the S1 area ends up matching the S2 image exactly? I ran it that way without problems.

In case you are not sure about the coverage, make a first subset before terrain correction which contains more than the S2 extent. After terrain correction and collocation you can make a second subset which only contains the parts covered by both images.

The GC overhead limit error indicates insufficient RAM, so it may help to make the first subset as early as possible to reduce the data size and computing time.
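A rough sketch of that two-step idea with snappy might look like the following. Operator names follow the SNAP/S1TBX operators, but the product paths, WKT polygons and filter settings are placeholder assumptions, and newer SNAP versions name the Collocate inputs "reference"/"secondary" instead of "master"/"slave":

```python
from snappy import ProductIO, GPF, HashMap

def run_op(name, source, **kwargs):
    # Small convenience wrapper around GPF.createProduct (not part of SNAP itself).
    params = HashMap()
    for key, value in kwargs.items():
        params.put(key, value)
    return GPF.createProduct(name, params, source)

s1 = ProductIO.readProduct('S1_GRD_example.zip')        # placeholder path
s2 = ProductIO.readProduct('S2_resampled_example.dim')  # placeholder path

# 1) First subset in radar geometry, larger than the S2 extent (placeholder WKT).
s1_sub = run_op('Subset', s1,
                geoRegion='POLYGON((30.0 10.0, 30.6 10.0, 30.6 10.6, 30.0 10.6, 30.0 10.0))',
                copyMetadata='true')

# 2) Calibration -> speckle filtering -> terrain correction on the reduced scene.
s1_cal = run_op('Calibration', s1_sub, outputSigmaBand='true')
s1_spk = run_op('Speckle-Filter', s1_cal, filter='Lee', filterSizeX='5', filterSizeY='5')
s1_tc  = run_op('Terrain-Correction', s1_spk, demName='SRTM 3Sec')

# 3) Collocate with S2, then cut the second subset to the common coverage.
coll_params = HashMap()
coll_params.put('resamplingType', 'NEAREST_NEIGHBOUR')
sources = HashMap()
sources.put('master', s2)
sources.put('slave', s1_tc)
collocated = GPF.createProduct('Collocate', coll_params, sources)

final = run_op('Subset', collocated,
               geoRegion='POLYGON((30.1 10.1, 30.5 10.1, 30.5 10.5, 30.1 10.5, 30.1 10.1))',
               copyMetadata='true')
ProductIO.writeProduct(final, 'S1_S2_stack', 'BEAM-DIMAP')
```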

Thanks, Mr. ABraun.
My PC has 16 GB of RAM, and I always go to Options > Performance to increase the cache size / compute the benchmark value… However, I still get the error when applying the Calibration > Speckle Filtering > TC workflow, which fails in the speckle filtering step as I mentioned above.

Have you checked whether it is the same for all filter types? Maybe you can test another one.
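Just as an illustration, changing the filter type only means changing the "filter" parameter of the Speckle-Filter operator; the input path and the chosen filter below are placeholders:

```python
from snappy import ProductIO, GPF, HashMap

calibrated = ProductIO.readProduct('S1_calibrated_subset.dim')  # placeholder path

params = HashMap()
params.put('filter', 'Refined Lee')  # other names as listed in the operator UI, e.g. 'Boxcar', 'Median', 'Frost', 'Gamma Map', 'Lee', 'Lee Sigma', 'IDAN'
filtered = GPF.createProduct('Speckle-Filter', params, calibrated)
ProductIO.writeProduct(filtered, 'S1_speckle_filtered', 'BEAM-DIMAP')
```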

Thank you, Mr. ABraun.
Yes, I am now trying the other speckle filters one after another, but I get the error "Cannot construct DataBuffer"…

This error means there is not enough memory; according to the SNAP help, "Either your system does not have enough memory (RAM) or the configuration for SNAP is not sufficient."

Here in the forum I read how to configure the VM parameters in the post "Basic error with Snappy (Python beginner)"… It discusses the configuration for snappy and editing the snap.properties file in the etc folder. I actually tried editing snap.conf and changing the Xmx value from 11 to 13 GB, but the file cannot be saved; I get an "access denied" warning.
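For reference, the line in question in <SNAP install dir>/etc/snap.conf looks roughly like this; the exact option string varies with the SNAP version, and only the -J-Xmx value is being changed:

```
# illustrative excerpt from etc/snap.conf; other options omitted,
# the exact contents depend on the SNAP version
default_options="--branding snap --locale en_GB -J-Xms256m -J-Xmx13G"
```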

Any advice is welcome.

You need to open the file with administrator privileges in order to save it.

How do I do this, please? :thinking:

OK, Mr. ABraun.
I changed snap.conf from Xmx11G to Xmx13G, but I still get the message "GC overhead limit exceeded" with all speckle filters, even though my PC has 16 GB of RAM and an Intel® Core™ i7 processor with 8 cores/threads.

How large is your image? 16 GB should actually be sufficient for speckle filtering…

Hello Mr. ABraun,
The Sentinel-1 GRD product is about 1 GB, and the error messages ("GC overhead limit exceeded" / "Cannot construct DataBuffer") persist with no solution.

You can also try running the benchmarking tool, which adjusts the memory settings for you.

Tools > Options > Performance > Processing: select the Speckle Filter operator there and click Compute.

Thanks, Mr. ABraun.
But how can I unlock the VM Parameters field to edit the maximum Xmx of RAM used by SNAP (Screenshot 1)? And what about the S1TBX option "Use FileCache in readers to conserve memory" (Screenshot 2), which I saw recommended in posts on other websites?


This option approximates the settings which are best for your machine; therefore, you cannot edit it yourself.