StaMPS-Visualizer, SNAP-StaMPS Workflow

May I ask how you calculated the mm/year value (color coding of your maps in R) from the single displacement values?

Dear @ABraun,

the colour coding represents the mean velocity, which is calculated during StaMPS processing. In fact, it is the same information you see in a ps_plot, just not aggregated. The value is exported from StaMPS and included in the csv table; it is not calculated afterwards from the time series. I admit that I tried to figure out how this value is calculated and came very close, but never got exactly the same result… I looked into the Matlab scripts of StaMPS, but due to time constraints I had to accept the value without knowing the exact way it is computed.
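If it helps, a plain per-point linear fit of cumulative displacement against time gets close to the exported value, but it is only an approximation, not the exact StaMPS computation:

```r
# Rough approximation only (made-up numbers), NOT the exact StaMPS computation:
# regress one point's cumulative displacement [mm] on time [years] and take
# the slope of that line as the mean velocity [mm/yr].
t_days <- c(0, 12, 24, 36, 48, 60)              # days since the first image
disp   <- c(0, -0.8, -1.5, -2.6, -3.1, -4.0)    # cumulative displacement [mm]
fit    <- lm(disp ~ I(t_days / 365.25))
unname(coef(fit)[2])                            # about -24 mm/yr for these numbers
```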
The colour coding in StaMPS-Visualizer is simply the min/max range stretched linearly onto a Matlab-like colour ramp from the colorbrewer package… Maybe an optional histogram stretch would be interesting.
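Roughly like this (a minimal sketch, not the Visualizer's actual code; the palette choice is just illustrative):

```r
# Stretch the mean velocities linearly between their min and max onto a
# ColorBrewer ramp (example values; palette name is illustrative).
library(RColorBrewer)

vel  <- c(-12.4, -3.1, 0.2, 4.8, 9.5)                 # example mean velocities [mm/yr]
pal  <- colorRampPalette(rev(brewer.pal(11, "RdYlBu")))(100)
idx  <- round((vel - min(vel)) / (max(vel) - min(vel)) * 99) + 1
cols <- pal[idx]                                      # one hex colour per point
```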
Anyway, if you figure out how the mean velocity is calculated exactly, I would very much appreciate knowing about it!

3 Likes

thank you for your response!

Which column in the csv export is it? I found that the first two were lon/lat, then there is a NaN, followed by an encoding of the dates. The numbers come in intervals of 12, so I assume they somehow represent the number of days since a given reference date?

If you could clarify those numbers I would be really grateful.

The csv table is not that intuitive; I tried to make it as compact as possible…

[1,1:2] = centre of the selection (lon, lat)
[1,3] = NA
[1,4:ncol] = the image dates in days since year 0; the intervals do not have to be equal, it depends on the acquisition dates.

[1,3] holds an NA value because the values beneath it are not part of the time series; they are the mean velocity values used for colour coding!

[2:nrow, 1:2] = measurement point lon lat
[2:nrow, 3] = mean velocity
[2:nrow, 4:ncol] = cumulative displacement values
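A minimal sketch of reading this layout in R (the file name is hypothetical):

```r
# Read the export and pull out the pieces described above.
d <- read.csv("stamps_export.csv", header = FALSE)

dates    <- as.numeric(d[1, 4:ncol(d)])   # image dates in days since year 0
points   <- d[-1, 1:2]                    # measurement point lon/lat
mean_vel <- d[-1, 3]                      # mean velocity, used for colour coding
disp_ts  <- d[-1, 4:ncol(d)]              # cumulative displacement per date
```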

7 Likes

thank you for the clarification.
[1,4:ncol] is 736865, 736877, 7366889… in my case. If I go back that many days I end up in the Quaternary :smile:
Did I misunderstand something or are my numbers somehow wrong?

7366889: this number cannot be right… are you sure about it?

the first two numbers do make sense to me; quick and dirty:
736865/365 = 2018.808, which could be legit…

today it is day 737285 :wink: https://www.epochconverter.com/seconds-days-since-y0
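If you want the exact calendar dates (leap days included), the Matlab day numbers convert easily in R, for example:

```r
# Matlab datenums count days since year 0; Matlab's datenum for 1970-01-01
# is 719529, so subtract that offset and use the Unix epoch as the origin.
matlab_datenum_to_date <- function(dn) as.Date(dn - 719529, origin = "1970-01-01")

matlab_datenum_to_date(c(736865, 736877, 736889))
# "2017-06-19" "2017-07-01" "2017-07-13"
```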

4 Likes

I guess I shouldn’t do PS InSAR if I cannot even type a 6-digit number :see_no_evil:

Now everything is clear. Thank you for your help and patience.

1 Like

@ABraun, you are always welcome! From your screenshot it looks like you are working in an urban area; SBAS would be very interesting for you, give it a try :slight_smile:

1 Like

thank you for the suggestion! I would really like to test it.
Is mt_prep enough for the newly created directory, or do I need mt_prep_gamma(_snap) again?

In both cases, the PS and the SBAS workflow, the first command is mt_prep_gamma_snap.

thank you. I was just confused by the instructions in the StaMPS manual.

@ABraun, just note that with SNAP you can currently only do StaMPS PSI.
In the future, I guess that either using my scripts and/or SNAP, we will be able to run StaMPS SBAS with SNAP as the InSAR processor.

You should run the mt_prep script once all the single-master interferograms (for PSI) or multi-master interferograms (for SBAS) are done, and for the moment SNAP only produces single-master ones. There are always workarounds to solve that, but for the moment they still have to be implemented.

3 Likes

I am aware of that, but thank you for pointing it out. There is some testing going on at the moment; let @thho know if you are interested in participating.

I would love to… @thho, is there anything I can do related to the SBAS test that @ABraun mentioned?

Dear @thho, I used 24 TerraSAR-X images of Barcelona (provided here http://www.intelligence-airbusds.com/en/8262-sample-imagery) and was able to fully process them in Matlab. However, when I start the export to csv, I get an error:

Error using horzcat
Dimensions of arrays being concatenated are not consistent.

Error in Barcelona_2000m_export (line 14)
export_res = [lon2 lat2 disp ts];

Do you have an idea what might have caused it?

Did you by chance also use the snap2stamps package for the PSI data preparation? The current version needs adaptation for SAR Stripmap datasets, as it was initially designed for Sentinel-1 TOPSAR, but it is easy to adapt.

Keep in mind that StaMPS does not correct the PS geolocation, so you will probably get shifted PS, mainly for PS on high buildings… you will probably see this in the results, right?

2 Likes

I haven’t adapted snap2stamps for TSX so far, as the preprocessing is considerably easier (no orbit files, no debursting, no sub-swaths…). It was basically just coregistration, subsetting and export.
Therefore I see no need for automation.

I thought about the shift of the PS, but as the StaMPS Visualizer hasn’t worked so far, I haven’t had the chance to see the PS against an aerial image basemap.

1 Like

In fact, as you said, there are fewer steps for Stripmap, but the graphs may need to be adapted. I will probably do that in the coming weeks.

Dear @ABraun,

this is a common error that occurs in regions with a high density of measurement points. I was not able to figure out in detail why it happens. Anyway, I wrote a workaround to avoid it, which is included in the current version of the visualizer and described in the manual within the application; you can find both here:

stamps_visualizer_installation_guide.html (2.8 MB)

stamps_visualizer_20.tar.gz (488.6 KB)

@mdelgado, regarding SBAS processing: I will probably work on this project next month in order to present a stable workflow. I will present it here, but maybe it will take more time; I cannot guarantee it :confused:

5 Likes

thank you very much for providing the latest version.
In my case, the error already occurs in Matlab before the csv is exported. But never mind, the point density is very high in my case. I will try to reduce it in the weeding step.