Snap2stamps package: a free tool to automate the SNAP-StaMPS Workflow

Good. Now select exactly these two products and load them into the table of Menu > Radar > Coregistration > S1 TOPS Coregistration > S1 Back Geocoding.

In the second tab, select SRTM 1Sec (AutoDownload) as DEM and start the process with Run. Please share a screenshot of either the result (an RGB of the stack of the two intensities) or the error message.

So the coregistration works; I wonder why you said it does not work when you perform it manually:

Please create a RGB image as described in this tutorial: Sentinel-1 TOPS interferometry (page 9) and share it.

If it looks alright, please apply ESD to the coregistered product

I mean it does not work when I run python coreg_ifg_topsar.py project.conf using this graph: coreg_ifg_computation.xml (5.9 KB).

The xmls cannot be called directly; they only serve as templates which are modified and executed by the python scripts.
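To illustrate the mechanism, here is a minimal sketch (not the actual snap2stamps code; the placeholder token names MASTER, SLAVE and OUTPUTFILE are hypothetical): the scripts read a template graph, substitute the concrete file paths, write a temporary graph, and hand it to SNAP's gpt.

```python
# Sketch of how snap2stamps-style scripts use xml graphs as templates.
# Placeholder names (MASTER, SLAVE, OUTPUTFILE) are illustrative only,
# not the actual tokens used in the snap2stamps templates.
import subprocess
import tempfile

def build_graph(template_path, substitutions):
    """Read a template graph and replace placeholder tokens with real paths."""
    with open(template_path) as f:
        graph = f.read()
    for token, value in substitutions.items():
        graph = graph.replace(token, value)
    return graph

def run_graph(template_path, substitutions, gpt="gpt"):
    """Write the filled-in graph to a temp file and execute it with gpt."""
    graph = build_graph(template_path, substitutions)
    with tempfile.NamedTemporaryFile("w", suffix=".xml", delete=False) as tmp:
        tmp.write(graph)
    subprocess.call([gpt, tmp.name])
```

This is why editing the xmls by hand and then calling them directly fails: without the substitution step, the placeholders never become valid paths.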

How exactly did you run slaves_prep.py, splitting_slaves.py and coreg_ifg_topsar.py before you encountered the error? Please post the commands you executed.

I created the master and slaves folders, then split the master image and put it in the master directory and the slave images in the slaves directory, and set these paths in coreg_ifg_computation.xml (5.9 KB), coreg_ifg_topsar.py (4.9 KB) and project.conf (712 Bytes). Then I executed these commands in order:

conda activate py27

python slaves_prep.py project.conf

python splitting_slaves.py project.conf

python coreg_ifg_topsar.py project.conf

now I see.

Please download the python scripts and xml files again and do not modify them. This is all done automatically by python. The only thing that has to be set by you is the project.conf file, which looks alright already*. Then you run the scripts as you have written above.

*CACHE=32G is too high when your computer has 32 GB RAM. If this is the case, try 24G instead.
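To make the rule of thumb concrete, here is a small sketch (the 75% threshold and the helper names are my own assumptions, not part of snap2stamps) that checks whether a CACHE line leaves enough headroom for the machine's RAM:

```python
# Sanity check for the CACHE setting in project.conf.
# The 75% rule of thumb and the function names are illustrative assumptions.
import re

def parse_cache_gb(line):
    """Extract the cache size in GB from a line like 'CACHE=32G'."""
    match = re.match(r"CACHE\s*=\s*(\d+)G", line.strip())
    if not match:
        raise ValueError("not a CACHE line: %r" % line)
    return int(match.group(1))

def recommended_cache_gb(total_ram_gb):
    """Leave headroom for the OS: use roughly 75% of physical RAM."""
    return int(total_ram_gb * 0.75)

def cache_is_safe(line, total_ram_gb):
    return parse_cache_gb(line) <= recommended_cache_gb(total_ram_gb)
```

With 32 GB of RAM this flags CACHE=32G as too high and accepts 24G, matching the advice above.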

Okay. Thank you very much for your help. I will follow these instructions and share the result with you.

The instructions you mentioned process the whole subswath separately and take a lot of time. :neutral_face: SNAP 8 is not really intuitive and user-friendly.

If they contain 3 bursts, the coreg xmls need to be updated, especially the one ending with subset.

You mean updated from your Github page, right?
Not manually, just to make it clear.

Maybe the coordinates in the config file are not correct.

I checked, the coordinates look alright and should reduce the bursts to around 3

No, I am sure the coordinates in the config file project.conf (712 Bytes) are correct.

Do you have any other ideas? :roll_eyes:

Please first test and confirm our suggestions from above.

How can I update the coreg xmls?

First, please visually check the intensity of all split products for valid data (just as above)

Then make sure that you use the latest versions provided on GitHub (you can keep the project.conf file)

Lastly, you change

<useSuppliedShifts>false</useSuppliedShifts>

to

<useSuppliedRangeShift>false</useSuppliedRangeShift>
<useSuppliedAzimuthShift>false</useSuppliedAzimuthShift>

in these files:

  • coreg_ifg_computation.xml
  • coreg_ifg_computation_subset.xml

These are the only adjustments which are necessary. No user parameters have to be entered.
Then save these xmls, maybe delete the created folders ifg and coreg, and run the python script for coregistration again.
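The edit above can also be scripted rather than done by hand. A minimal sketch (the file names and substitution strings are exactly those from the post above; the script itself is my own, assuming the xmls sit in the current directory):

```python
# Replace the deprecated <useSuppliedShifts> tag with the two new
# range/azimuth tags in the coregistration graph templates.
OLD = "<useSuppliedShifts>false</useSuppliedShifts>"
NEW = ("<useSuppliedRangeShift>false</useSuppliedRangeShift>\n"
       "<useSuppliedAzimuthShift>false</useSuppliedAzimuthShift>")

def patch_graph(path):
    """Patch one graph file in place; return True if a change was made."""
    with open(path) as f:
        text = f.read()
    if OLD not in text:
        return False  # already patched, or a different template version
    with open(path, "w") as f:
        f.write(text.replace(OLD, NEW))
    return True

if __name__ == "__main__":
    for name in ["coreg_ifg_computation.xml",
                 "coreg_ifg_computation_subset.xml"]:
        print(name, "patched" if patch_graph(name) else "unchanged")
```

Running it a second time does nothing, so it is safe to re-run after re-downloading the templates.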


Dear ABraun,
Many thanks for the invaluable answers and tips from you and your colleagues, especially @mdelgado and @marpet. The problem was finally resolved; after updating the XML files, the Coreg script is working successfully.
Sincerely,


very good, congratulations on not giving up!

Hello dear ABraun,
I’m going to process two swaths (IW2 and IW3) in snap2stamps. I wanted to know how I can do this, and which of the two I should write in the project.conf file: IW2 or IW3?
Thanks

Currently, snap2stamps only supports the processing of a single sub-swath. You can probably process each one separately and then merge the points in the end. Do you really need such a large area?