Snap2stamps package: a free tool to automate the SNAP-StaMPS Workflow

I just completed all scripts and they work like a charm. No problems with mt_prep_gamma_snap (provided by Katherine), and all of the data was included. This makes it a lot easier to prepare Sentinel-1 data.

I just have one question: what is the best approach if a) I add a new scene to the slaves folder, or b) I don't want all processed data to be exported?

2 Likes

Hi @ABraun,

Glad to see that snap2stamps works fine for you as well!
Regarding the mt_prep_gamma_snap run you made, could you tell me which values were reported as amplitude average per scene? If the script uses an s instead of an f for the format type, it will return values of around 0.xxxxx instead of much higher ones, and you may then get errors later in the StaMPS processing.
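
As a toy illustration (this is not the calamp source, just the idea): SNAP exports the stacks as float data, so the format flag must tell the script to read floats; reading the same bytes as short integers gives a meaningless mean amplitude.

    import numpy as np

    # synthetic complex float samples standing in for one SNAP-exported SLC line
    slc = (np.random.rand(1000) * 50 + 1j * np.random.rand(1000) * 50).astype(np.complex64)
    raw = slc.tobytes()

    # 'f': read the bytes as the complex floats they really are
    mean_f = np.abs(np.frombuffer(raw, dtype=np.complex64)).mean()
    # 's': misread the same bytes as short integers -> nonsense average
    mean_s = np.abs(np.frombuffer(raw, dtype=np.int16).astype(float)).mean()
    print(mean_f, mean_s)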

Regarding your question, in both cases you should keep in the search folders only the data that you want to process or export:

  1. The splitting starts by reading the data in the slaves folder, but the later steps read all the other folders.
  2. The StaMPS export reads the ifg and coreg folders.

For now, the best option is to move the already processed data out of those folders (see the sketch below). But this indeed strengthens the need for an overwrite option (true or false) in the processing; I will take care of it.
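
Something like this is what I mean by moving the data aside (just a rough sketch on my side: the folders listed are the ones the scripts scan, but the project path, the dates and the processed/ parking folder are only placeholders):

    import os
    import shutil

    PROJECT = "/path/to/PROJECTFOLDER"          # same value as in project.conf
    DONE = os.path.join(PROJECT, "processed")   # hypothetical parking folder
    os.makedirs(DONE, exist_ok=True)

    already_done = ["20150101", "20150113"]     # dates you do not want reprocessed or exported
    for folder in ("slaves", "split", "coreg", "ifg"):
        root = os.path.join(PROJECT, folder)
        if not os.path.isdir(root):
            continue
        for name in os.listdir(root):
            if any(date in name for date in already_done):
                # park the scene outside the folders read by the scripts
                shutil.move(os.path.join(root, name), os.path.join(DONE, folder + "_" + name))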

Thanks for the comments.

Regarding the mt_prep_gamma_snap run you made, could you tell me which values were reported as amplitude average per scene? If the script uses an s instead of an f for the format type, it will return values of around 0.xxxxx instead of much higher ones, and you may then get errors later in the StaMPS processing.

Thank you for pointing it out. I had seen it in the other topic but overlooked it.
With mt_prep_gamma_snap I get mean amplitudes of 30-50, while with mt_prep_gamma I get values between 24000 and 26000. I am currently testing how this difference affects the later steps.

In that case it should be OK. The values you got are in the expected range for C-band (though of course it varies from site to site).
Hence, I expect you will manage to get through the entire StaMPS processing.
Good luck!

I now compared it:

  • mt_prep_gamma_snap: mean amplitudes of 30-50, but only 22 PS remaining (something seems wrong)
  • mt_prep_gamma: mean amplitudes between 24000 and 26000 with ~17,000 PS remaining, but the geocoding was lost (no lat/lon values displayed when checking with getparm)

Did you try Katherine's script or the one I uploaded in the other thread?
I put it here again just in case. mt_prep_gamma_snap (6.4 KB)
This works perfectly for me.

Let me know

2 Likes

I didn’t see this one, thank you very much! I’ll try it.
Is it normal that I get these values for getparm?

            ref_centre_lonlat: [0 0]
            ref_lat: [-Inf Inf]
            ref_lon: [-Inf Inf]

It is normal, as these are the default values for those variables. It means that you have not set a reference point, so StaMPS takes the entire scene and uses its average value as the reference (in fact, I was still wondering what you meant by “no lat/lon values displayed when checking with getparm”).

I am still surprised by the 22 PS given amplitude values in the 30-50 range. I wonder whether the processing of some slave went wrong. Could you check in calamp.out? The average values per slave are listed there, and maybe some of them are close to 0… not sure.
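
If you want a quick check, something like this would flag suspicious slaves (assuming calamp.out has one line per scene with the rslc path followed by its mean amplitude, which is how I remember it; adjust if your file looks different):

    # scan calamp.out for scenes whose mean amplitude is close to zero
    with open("calamp.out") as f:
        for line in f:
            parts = line.split()
            if len(parts) < 2:
                continue
            scene, amp = parts[0], float(parts[-1])
            if amp < 1.0:          # a near-zero average looks wrong for this data
                print("suspicious scene:", scene, amp)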

Anyway, please let me know which output you get using this latter script.

Thank you. I checked again: all values (n=36) lie between 28 and 56. Before weeding there are around 17,000 PS, but after weeding only 20 remain. As before, the geocoding is correct, but the script seems to miss the correct PS candidates.

Maybe I should just try another area.

Have you already used the mt_prep_gamma_snap that I provided here?

Yes, I replaced it.

And you already ran mt_prep_gamma_snap and got through StaMPS step 4?
That was really fast, then!

How big is your AOI?
If you want, we can take a look at it together… it is always easier to pinpoint where the problem is.

Yes, some steps are quite fast; I wondered about that too.
The AOI is roughly 10x10 km, but some of it is covered by water and might not be usable at all.

Thank you for the offer! I think I will first just test another area to see if it turns out differently. It could be that the area I selected is not suitable for PSI. But I'll let you know, of course.

Thank you again for your support!

1 Like

Maybe the weeding thresholds you used are too strict.
Or, as you said, that area is not the best one for getting PS points.
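
Just as a toy illustration of what the threshold does (the real weeding happens inside StaMPS, where, if I remember correctly, the value is the weed_standard_dev parameter set with setparm): relaxing the maximum allowed phase-noise standard deviation keeps more candidates.

    import numpy as np

    np.random.seed(0)
    # fake per-candidate phase-noise standard deviations, only for the illustration
    noise_std = np.abs(np.random.normal(1.2, 0.4, 17000))

    for threshold in (1.0, 1.2, 1.4):
        kept = int((noise_std < threshold).sum())
        print("threshold", threshold, "->", kept, "candidates kept")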

You were right. Increasing the standard_deviation threshold to 1.4 resulted in 4000 PS, and the plots look reasonable. I'll investigate this some more.

1 Like

Cool! I am glad that my advice is still useful! :sunny:

It would be nice if you could share some plots with us,
just to have a sample of the first PSI results obtained with the snap2stamps scripts*! :wink:
Looking forward to it

*the first time not counting myself

I will, of course!

1 Like

I now keep getting an error with slaves_split:
Error: [NodeId: TOPSAR-Split] -1

I looked at splitgraph2run.xml and found that the output raster was defined as

D:\S1_Mexico/split/20150802/20150802_D:\S1_Mexico\split\20141006\20141006_IW1.dim.dim

What format does PROJECTFOLDER= in project.conf need if I am running Windows? I can't remember what I inserted the first time, when it worked.

1 Like

Hi @ABraun!

The PROJECTFOLDER should contain the full path under which the slaves folder can be found.
My guess is that if step 1 of the slave preparation worked, the problem is probably not the PROJECTFOLDER.
Could you provide more info about this? You had run step 1, right?
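
If it helps, here is a rough sketch (not the actual snap2stamps code) of how I would normalise the PROJECTFOLDER value read from project.conf on Windows, so that the paths built for the split/ outputs stay consistent:

    import os

    def read_project_folder(conf_file="project.conf"):
        # project.conf holds simple KEY=VALUE lines, e.g. PROJECTFOLDER=D:\S1_Mexico
        with open(conf_file) as f:
            for line in f:
                if line.strip().startswith("PROJECTFOLDER="):
                    raw = line.split("=", 1)[1].strip()
                    # absolute path, consistent separators, no trailing slash
                    return os.path.normpath(os.path.abspath(raw))
        raise ValueError("PROJECTFOLDER not found in " + conf_file)

    project = read_project_folder()
    split_dir = os.path.join(project, "split")   # e.g. D:\S1_Mexico\split
    print(project, split_dir)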

Was your first try also run on Windows? I have only tested on several Linux distros.

Let me know if this helps.

It was the same machine, and as far as I know I entered a full project folder path in both cases. Step 1 was successful: all data was sorted according to acquisition date. I am currently repeating it from scratch, and now it works again…
Strange, but as long as it runs, fine :slight_smile:

1 Like