Snap2stamps package: a free tool to automate the SNAP-StaMPS Workflow

Hi @andretheronsa,

Indeed, your suggestions regarding snap2stamps touch on points I wanted to update for the next release.
Regarding the data in the project directory… called target.dim, I think it is a bug of the StaMPS export operator, which I will handle (by removing the file after exporting) in the next version of the scripts.

I have one question regarding your master processing: does that method extract only the full bursts matching (intersecting) the WKT polygon? In fact, the WKT polygon can be handled directly in my scripts (as is done for the coregistration/interferogram step), but at the time I created them I doubted whether the full burst could be extracted using a polygon (otherwise you get issues in the coregistration step).

I would love to try your approach for the master processing and include it in the next version (indeed it is a big improvement in this regard). So thank you very much for your suggestion.

Regarding Octave instead of Matlab, that represents a big challenge indeed! But the mt_prep file does not use Matlab; it uses binaries which should be compiled once you get the StaMPS package. Is that correct, or am I wrong?

Let me know; perhaps we could discuss improving the scripts' functionality…


Thanks for your replies and work!

Noted about the target.dim

Yes, so far in my testing that method extracts the full bursts intersecting the WKT. It does not clip the bursts, and it also does not pull in any bursts outside the WKT, so I think it works well. So yes, the polygon=(geojson x,y etc.) approach in your other scripts should work for the master script too. If this is included in the next version it will simplify your code by completely eliminating the master-format confusion, as well as making it automated.

An Octave-StaMPS thread somewhere would be nice - my institute is not interested in paying for Matlab.
Since this community is all about open source, and my PhD is all about open-source PSI processing, I am trying REALLY hard to get Octave to work with StaMPS. Earlier in the year I managed to get to step 5 in Octave without needing to change too much: just going over the scripts and fixing one or two lines in each after googling the known Matlab/Octave inconsistencies. One important lesson was that a bug in Octave versions prior to v4.4.1 caused some of the needed StaMPS scripts to fail.

I THINK you are wrong about the binaries, unless I am doing something wrong?
Reading the mt_prep_snap script, it is a csh script that calls a Matlab function at line 146…

I would like to help where I can. I need this workflow automated and robust for my PhD in any case.


Thanks again for your comments.

I am eager to find anything that could make StaMPS work in a non-Matlab environment, so let me know if you are thinking of distributing your code or need somebody to discuss or test it.

Regarding mt_prep, you are right; I did not see that in addition to calamp and mt_extract_cands, it also calls Matlab to run ps_parms_initial.m.

Thanks for your comments and good luck with the PhD!


Maybe this is relevant to someone here:

If the S1 SLC data are already extracted*, the first two scripts have to be modified a bit, because they only search for zip files. I made some changes so they also work with unzipped S1 files:
@mdelgado If you don't mind, I would share them here. But if you are trying to avoid several versions floating around (as was the case for the updated StaMPS scripts here, which surely caused some confusion), I can also provide them to you and you can decide whether they can be integrated somehow. Currently, they only work with manifest.safe files (not both zip and safe, which would be the ideal solution but would require some conditional logic).

*In my case, the S1 SLC data are processed on the RSS Cloud Toolbox, which has direct access to the archived data via a folder connection (/eodata/Sentinel-1/SAR/SLC/). This allows fast transfer of the data, but they are already unzipped.
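Not part of the original scripts, but a minimal sketch of how the product search could cover both cases (the function name and the pattern details are hypothetical; the actual snap2stamps code may structure this differently):

```python
import glob
import os

def find_s1_products(data_dir):
    """Collect Sentinel-1 IW SLC products from data_dir, whether zipped
    or already extracted as .SAFE directories with a manifest.safe."""
    zips = glob.glob(os.path.join(data_dir, "S1*_IW_SLC*.zip"))
    safes = [d for d in glob.glob(os.path.join(data_dir, "S1*_IW_SLC*.SAFE"))
             if os.path.isfile(os.path.join(d, "manifest.safe"))]
    return sorted(zips + safes)
```

SNAP can generally read either the zip file or the manifest.safe inside a .SAFE folder, so the rest of the script could simply branch on the path's extension when building the read node.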


Actually, I also have another version ( :wink: ) that I made while using the VMs provided by the ESA Research and Service Support within their RSS CloudToolbox service, but I wanted to include it in the next release.
If you agree, I would prefer to disseminate it with the official release, but you can share yours in the meanwhile.

That's even better :+1:
I now remember that you once contacted me about the Toolbox, right?

I am currently testing StaMPS 4.0b6.
mt_prep_snap worked great, and using it in the RSS Toolbox makes many things easier. Currently I get an error at step 2 - do you have an idea what the reason could be?

My guess is the version… you should use the latest development version available on GitHub (https://github.com/dbekaert/StaMPS). As mentioned in the past, this solves issues during step 1 that affect step 2. And your log seems to show exactly that.

Normally that happens when your ifg contains borders with no data or zeros… have you checked that?

By the way, it was probably me who replied about the VM, yes. :slight_smile:
The world is very small…


Thank you for the response.
Although I downloaded the latest release 4.1b (https://github.com/dbekaert/StaMPS/releases/tag/v4.1-beta) and sourced it correctly, it still says

STAMPS: ########################################
STAMPS: ####### StaMPS/MTI Version 4.0b6 #######
STAMPS: #######  Beta version, Jun 2018  #######
STAMPS: ########################################

I also saw no borders in the interferograms.

Edit: I might have had an old path in my bashrc; I will have to check this.

Have you checked it already? And? Did you solve the step 2 error?

I accidentally removed a folder, and now the virtual machine no longer starts because the folder was referenced in my bashrc :joy:
I have contacted the RSS support team, who can hopefully reset my bashrc file.

I removed the old StaMPS installation (also from my PATH), deleted all temporary files, compiled StaMPS 4.1b from scratch, sourced it, and ran mt_prep_snap again.

Still, I get this error at step 2

Is the version number okay? The temporary Matlab path also points to the new folder.

Well… it does not show version 4.1b, so it seems it is still loading some old version…

Let me check with my Matlab to see the logging…
Indeed it starts with:

addpath('/application2/software/StaMPS-master/matlab/',path)
stamps(1,1)

STAMPS: ########################################
STAMPS: ####### StaMPS/MTI Version 4.0b6 #######
STAMPS: ####### Beta version, Jun 2018 #######
STAMPS: ########################################

But please run stamps(1,1) again with the new scripts, as it is in step 1 that the software takes care of the issue that appears in step 2.

Try it and let me know.

Thank you for the response and for testing on your side.

I ran everything from scratch, including mt_prep_snap and stamps(1,1), and it seems fine until this step.
I deleted all previous versions of StaMPS, as well as links. After sourcing the CONFIG file, the new scripts are callable from the shell.
All data is correctly exported, the preparation shows no zero-amplitude pixels and reasonable mean amplitudes, in MATLAB the 'insar_processor' is snap, and the initial baselines make sense.
But the error remains.

I’ll try a different case area.

It seems to work now. My previous area was near the coast, so the SRTM coverage might have been insufficient.
I am at step 2 now and will report when I have completed the processing with StaMPS v4.1b.


Regarding ps_load_initial_gamma, I do not remember having modified it, so it should work as it is. Let me know if this is not the case and I will share my mt_prep_gamma and ps_load scripts.

Regarding DEMs, the use of a more accurate and up-to-date DEM is indeed better, but it becomes more important when working with higher-resolution SAR (X-band stripmap, for example). There are many DEMs that could be used, e.g. SRTM DEM (3 or 1 arc second), ALOS DEM, TanDEM-X, etc., but always in GeoTIFF to be usable in SNAP.

Can we add an aoiWKT to the splitting-slaves graph?
Currently the slaves are split but all the bursts are kept. Bursts are only extracted during coregistration, where only the bursts overlapping with the master are kept. This creates unnecessary storage and processing requirements.
The splitting-slaves graph could then in fact be used to split the master too, according to the requirements of the project.conf file.

Yes, why not!
Have you used it successfully in any test before?
I tried to use it just yesterday and got an error.
So, if you provide me with a working example I will include it in the next release.

One question: what happens if the AOI falls between two subswaths?


I quickly cloned your package to https://github.com/andretheronsa/snap2stamps, made the changes I think are needed, and tested it once: it seems to work! It splits MUCH more quickly for small study areas, since it only processes the required bursts. I haven't done the next steps yet, so I am not 100% sure that having fewer bursts wouldn't break them. I assume that as long as you have more than one burst it will work (having only one would make ESD coregistration fail, right?)

TOPSAR-Split only uses the AOI to select bursts, so I don't think it matters if it is out of bounds or in another swath; it only looks for intersecting bursts in the current subswath.

Summary of changes (only small changes were needed):
splitting_slaves.py now reads the polygon data in too (from line 46), using the code from coreg_ifg_topsar.py:

  if "LONMIN" in line:
  	LONMIN = line.split('=')[1].strip()
            if "LATMIN" in line:
                    LATMIN = line.split('=')[1].strip()
            if "LONMAX" in line:
                    LONMAX = line.split('=')[1].strip()
            if "LATMAX" in line:
                    LATMAX = line.split('=')[1].strip()

polygon='POLYGON (('+LONMIN+' '+LATMIN+','+LONMAX+' '+LATMIN+','+LONMAX+' '+LATMAX+','+LONMIN+' '+LATMAX+','+LONMIN+' '+LATMIN+'))'

slave_assemble_split_apply_orbit.xml and slave_split_apply_orbit.xml get the polygon added to the node id=TOPSAR-Split (1/2) (in this example I had to add spaces inside the angle brackets for them to display on this forum):

< wktAOI > POLYGON < / wktAOI >

This way the Python script reads the project.conf polygon and passes it to the graph, so only the relevant bursts are extracted.
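As a sketch of what that read-and-substitute step could look like (this is not the exact snap2stamps code; the function and file names are illustrative, assuming the graph template carries a literal POLYGON placeholder inside the wktAOI tag, as described above):

```python
def insert_polygon(template_path, output_path, polygon):
    """Read a SNAP graph XML template and replace the first POLYGON
    placeholder (e.g. inside <wktAOI>...</wktAOI>) with the WKT string
    built from the project.conf bounding box."""
    with open(template_path) as f:
        graph = f.read()
    # Swap the literal placeholder for the real WKT; count=1 keeps any
    # later occurrences (including the one inside the WKT itself) intact.
    graph = graph.replace("POLYGON", polygon, 1)
    with open(output_path, "w") as f:
        f.write(graph)
```

The resulting graph file can then be handed to gpt the same way the other snap2stamps graphs are.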

I think the master can then also be prepared with the splitting-slaves scripts, but I am not sure how you would incorporate this.

I just tested it once with an AOI that overlaps 3 bursts; no time for proper testing, sorry, I am on my way to a holiday break. Enjoy!


Yes, indeed this is exactly what I did, but I still got a SNAP error on the TOPSAR-Split operator. I will give it a try on another machine.

Thanks

I'd like to share what I found relating to using SNAP and StaMPS. I think these two articles are good answers to many questions. I don't know whether they have been posted before, but anyhow, I'd like to post them once more:

Foumelis_et_al_IGARSS2018_SNAP-StaMPS[42].pdf (591.3 KB)