I’m pretty new to SNAP and want to automatically process a lot of Sentinel-1 data.
My inputs are over 50 S1A_IW_SLC__1SDV…SAFE products.
I’ve already built a graph that reads one of these files, splits it into single bursts, and does some processing. It also writes the individual bursts to GeoTIFFs (the graph contains 27 Write operators).
Is there a way to create a suitable script (I’m on Linux) that takes, say, an input folder and the graph I built, and writes all of the results into an output directory?
When I try to use the built-in Batch Processing tool, I run into problems with the output names, which causes files to be overwritten.
Hopefully someone can guide me a little bit with my issue, thanks in advance!
An example is given here: https://senbox.atlassian.net/wiki/spaces/SNAP/pages/70503475/Bulk+Processing+with+GPT
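Following the approach on that wiki page, a minimal bash loop can be sketched as below. It assumes you have edited your graph XML so the Read node uses a `${input}` variable and every Write node builds its path from an `${outdir}` variable (both passed in via `gpt -P…`); the function name `batch_gpt` and all paths in the example call are placeholders, not anything SNAP-specific.

```shell
#!/bin/bash
# Sketch: run one gpt invocation per .SAFE product in a folder.
# Assumption: the graph XML references ${input} and ${outdir},
# which gpt substitutes via the -P<name>=<value> options.

batch_gpt () {
  local indir="$1" outdir="$2" graph="$3"
  local gpt="${GPT:-gpt}"                  # path to SNAP's gpt binary
  shopt -s nullglob                        # skip loop if no products match
  for f in "$indir"/S1A_IW_SLC__1SDV*.SAFE; do
    local name
    name=$(basename "$f" .SAFE)            # unique stem per product
    mkdir -p "$outdir/$name"               # one sub-folder per scene, so the
                                           # 27 Write operators never collide
    "$gpt" "$graph" -Pinput="$f/manifest.safe" -Poutdir="$outdir/$name"
  done
}

# Example call (placeholder paths):
# batch_gpt ~/s1/in ~/s1/out ~/graphs/split_bursts.xml
```

Writing each scene into its own sub-folder named after the product is one simple way around the overwriting problem you hit with the Batch Processing tool, since the per-burst file names inside the graph can then stay identical across scenes.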