I need to create a 4-year PSI time series, but my computer can only handle 31 Sentinel-1 images at a time (approximately 1 year of observation). So I was thinking of creating 4 PSI processing runs (one for each year) and then extracting the information for the same point in every result. That way I will have 4 graphs that I can then merge. The question is: how can I get the graph of the same point based on its coordinates, rather than by clicking the exact same point in the graph every time (which is challenging)?
Or is there a better way to approach this problem?
Thanks in advance!
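One way to avoid clicking the same point by hand is a nearest-neighbor lookup by coordinates in each yearly result. The sketch below assumes each PSI run has been exported to a CSV with `lon` and `lat` columns plus displacement columns; the file names and column layout are hypothetical, not a StaMPS export format.

```python
import csv
import math

def nearest_ps(csv_path, lon0, lat0):
    """Return the row of the persistent scatterer closest to (lon0, lat0).

    Assumes a CSV export with 'lon' and 'lat' columns (hypothetical layout).
    """
    best_row, best_d2 = None, float("inf")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Squared equirectangular distance is enough to rank nearby points
            dlon = (float(row["lon"]) - lon0) * math.cos(math.radians(lat0))
            dlat = float(row["lat"]) - lat0
            d2 = dlon * dlon + dlat * dlat
            if d2 < best_d2:
                best_row, best_d2 = row, d2
    return best_row

# Merging the four yearly runs for one point (file names made up):
# series = [nearest_ps(f"psi_year{y}.csv", -3.70, 40.42) for y in (1, 2, 3, 4)]
```

Note that the "same" scatterer may not be detected in every yearly run, so it is worth checking that the returned point is within a few metres of the query coordinates before merging the series.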
If it is a stable scatterer, you could maybe use only 15 images per year (24 days between pairs), but this would still lead to 60 images over 4 years. The snap2stamps package allows you to subset the area of interest to a large degree, as described here: Snap2stamps package: a free tool to automate the SNAP-StaMPS Workflow - #158 by andretheronsa
But don’t make it too small, otherwise the unwrapping produces weird results.
Thanks ABraun for your answer! I think that if I upgrade to 128 GB of RAM it would be possible to analyze 15 images/year for 4 years. I also see a command in the StaMPS manual to plot a region, so making that region small may be what I need.
The bottleneck is the coregistration process, where you have to coregister all images to the master. That is where you need as much RAM as the combined disk space of the images. Is there a workaround for this step that consumes less RAM? Is it possible to subset the images before coregistration? I understand that it is not, but maybe someone knows how.
The co-registrations are usually done in batch, so only one co-registration runs at a time. You certainly won’t need as much RAM as all your input data put together…
Thanks for the knowledge, mengdahl! So I need to learn how to batch-coregister in Windows with SNAP, because without batching it really consumes a lot of RAM.
You can start with the batch-processing tool to test your graphs out - this should be explained in the tutorial - @ABraun might know which one.
Unfortunately the batch processing tool will also consume too much memory if you throw too many images at it. For large jobs it’s better to script everything, for example with snapista in Python, which will then call gpt separately for each coregistration and not run into memory problems.
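A minimal dry-run sketch of that idea: launch `gpt` once per slave image so only one coregistration is in memory at a time. The graph file, master/slave parameter names, and file names below are assumptions for illustration, not snapista itself.

```python
import subprocess
from pathlib import Path

GPT = "gpt"                  # SNAP's graph processing tool (assumed on PATH)
GRAPH = "coreg_graph.xml"    # hypothetical graph using ${master}/${slave}/${target}
MASTER = "S1_master.zip"     # hypothetical master image

def build_gpt_command(slave, outdir="coreg"):
    """Build one gpt invocation that coregisters a single slave to the master."""
    target = str(Path(outdir) / (Path(slave).stem + "_coreg.dim"))
    return [GPT, GRAPH,
            f"-Pmaster={MASTER}",
            f"-Pslave={slave}",
            f"-Ptarget={target}"]

def coregister_all(slaves, dry_run=True):
    """Run one gpt process per slave, sequentially, so RAM use stays bounded."""
    for slave in slaves:
        cmd = build_gpt_command(slave)
        if dry_run:
            print(" ".join(cmd))  # inspect the commands before running for real
        else:
            # Each gpt process releases all of its memory when it exits,
            # which is what keeps the total footprint low.
            subprocess.run(cmd, check=True)
```

Because each coregistration is a separate OS process, a crash on one image also does not take down the whole batch; you can rerun just the failed pair.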
The pre-processing for PSI can be done with snap2stamps (StaMPS - Detailed instructions), and it processes all files one by one (including the coregistration), which is already quite memory-efficient.
Thanks Mengdahl and ABraun! I am on my way to test all of this and I will report my outcomes soon.
Hi! I just want to thank ABraun and mengdahl, because after updating the snap2stamps scripts from Python 2.7 to Python 3 everything worked correctly. You just saved me 800 dollars in RAM sticks. Much appreciated!!