Workflow between SNAP and StaMPS

I agree. Once you have all images in one coregistered stack, you can open the InSAR Stack tool and export the list of images, including their respective perpendicular and temporal baselines.

That image is from a scientific article; however, a similar figure can be built in MATLAB or in Excel.

Thank you so much, mdelgado. It's working for me.


Could you tell me how to add the paths, please?

Actually, modifying the StaMPS_CONFIG script and sourcing it before processing should do the job. Make sure that you add all paths correctly. Instructions are given here (especially point 2.2): https://gitlab.com/Rexthor/gis-blog/-/blob/master/StaMPS/1_stamps_setup.md
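For orientation, here is a minimal sketch of the kind of entries this involves; the install locations below are placeholders, and the exact variable names should be checked against the StaMPS_CONFIG.bash that ships with your StaMPS version:

# excerpt of a StaMPS_CONFIG.bash -- placeholder paths, adapt to your system
export STAMPS="/home/username/StaMPS"
export TRIANGLE_BIN="/home/username/triangle/bin"
export SNAPHU_BIN="/home/username/snaphu/bin"
# make the StaMPS scripts and helper binaries available on the PATH
export PATH="$STAMPS/bin:$TRIANGLE_BIN:$SNAPHU_BIN:$PATH"

Source this file in every new terminal (source StaMPS_CONFIG.bash) before running mt_prep_snap or starting MATLAB.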


If you install all the software as root (using sudo), I think there is no need to add any paths in StaMPS_CONFIG.bash. The only thing I added to my .bashrc file is the path to the StaMPS bin directory, like below.

Open the .bashrc file and add:
export PATH=/home/username/stamps/bin:$PATH

Then run:
source .bashrc

After that, try running mt_prep_snap from any path; if the command is found, you are ready to use StaMPS.
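Put together, a minimal sketch of this setup and check (the StaMPS location is a placeholder):

# append the StaMPS bin directory to the PATH (placeholder location)
echo 'export PATH="/home/username/stamps/bin:$PATH"' >> ~/.bashrc
# reload the shell configuration
source ~/.bashrc
# verify that the StaMPS scripts are now found
which mt_prep_snap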

Thank you so much, ABraun. This solved my recent error.

Actually, the mt_prep_snap command is no longer needed since StaMPS 4.1, but I'm glad it solved the error.


Thank you very much. My problem has been solved and your suggestions are very useful.


Hello, one question.
I completed an analysis with StaMPS some months ago (step stamps(8,8) already finished), and now I would like to reopen the whole project to run a series of commands (basically ps_plot('vs'), among others) that I did not run at the time, out of ignorance of the technique. However, it is not clear to me whether a file can be loaded directly (and if so, which of all the generated files), or whether I have to repeat the entire analysis from stamps(1,1).

So my question is: how do I load a past job with the final stamps(8,8) results into MATLAB, so that I do not have to repeat the entire analysis process? Thank you very much for the answers.

Source the config file again in the command line, then navigate to the working directory where the files were processed and start MATLAB again.
You should get all parameters when you enter getparm and be able to continue working with the data.
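A minimal sketch of that sequence (the config and project paths are placeholders for your own setup):

# re-load the StaMPS environment (placeholder path)
source /home/username/StaMPS/StaMPS_CONFIG.bash
# go back to the folder where the StaMPS steps were run
cd /home/username/INSAR_project
# start MATLAB there; inside MATLAB, getparm lists the stored parameters
# and commands such as ps_plot('v-do') work on the existing results
matlab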

Thank you so much, ABraun.


Is it possible to convert it into a CSV file, to be able to identify which pixels have the highest standard deviation?

Maybe something similar to what is used to export to the StaMPS-Visualizer?

I’m only aware of the CSV export of the StaMPS visualizer you mentioned.

Hundreds of topics in one thread is probably too much, especially for something that started in 2016. I am sure that people have a hard time reading it and finding something useful in it.

Maybe we should lock this thread (and the StaMPS-Visualizer and SNAP-StaMPS Workflow ones) and ask people to create new threads for specific questions.
Perhaps we can create a pinned thread with pointers to the pertinent topics within these two that may be relevant.

That is why we have this topic: StaMPS - Detailled instructions. It should be pinned, actually.
It summarizes the current findings (including the one you mentioned) and is where people could start.

Great… perhaps then we can lock this thread…

Hi @ABraun and @mdelgado,
I ran the whole process, and I am unsure whether this message, which appears for all the images (27 images spanning 2017/12/03 to 2019/01/27), is an error or a good result:

Number of pixels with zero amplitude = 0

I have checked in SNAP and everything seems correct. However, within StaMPS the results I get do not seem realistic to me:

As you can see, the standard deviation is too high.

good job so far!

Please use ps_plot('v-do', 'ts') to plot the time series for some selected areas. This will give you a good impression of the temporal variability of the results and whether they follow a trend or not.

I thought the message "Number of pixels with zero amplitude = 0" was a mistake.

So what does it mean when it is 0 for all of them, while in other jobs there are such pixels?

Even so, I do not believe that my results are correct, since I have obtained values greater than 15 mm for a single year on infrastructure such as buildings.
As you can see, I have values such as 383 mm/year.

Actually, the opposite is the case: too many pixels with zero amplitude indicate an error or invalid data.

In your screenshot, you show a single point with no neighbors. It is unlikely that such an isolated scatterer has a reliable value. PSI is most effective when there is a dense network of PS. Also, the temporal variation of the single interferograms is quite high (and random), so the sum is not a realistic representation.
Again, please check the time-series plot for your areas of interest and feel free to share them here. It will help us interpret whether the values are feasible.
