I planned to run just the basic steps to get backscatter from Sentinel-1 GRD products and use it in machine learning together with data from other satellites. But SNAP takes too long to process the data.
I am just doing Apply Orbit File - Calibration - Terrain Correction - Linear to dB, and Write.
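(For reference, the linear-to-dB step at the end of this chain is just a 10·log10 scaling of the calibrated backscatter; a minimal NumPy sketch with made-up values:)

```python
import numpy as np

# Hypothetical calibrated backscatter values in linear power units
sigma0_linear = np.array([1.0, 0.1, 0.01])

# SNAP's linear-to-dB step applies this same 10*log10 conversion per pixel
sigma0_db = 10.0 * np.log10(sigma0_linear)
# -> 0, -10, -20 dB
```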
And for about 800 files, SNAP takes 2 days to process. For the 4 years of data I intend to use, that works out to around 1.5-2 months, even using graphs and batch processing. Any suggestions on how to speed up the process, or any thoughts about faster software, please? I have been reading about Sentinel-1 for about 1.5 months as an undergraduate student, so I think I am kind of an experienced newbie. Also, is snappy somehow faster than the graphical interface?
I would really appreciate your contributions.
Some ideas:
- Is creating a subset feasible? That drastically reduces processing time.
- Do you really need all available data over 4 years? I suspect there is a lot of redundancy.
- Maybe restricting your analysis to data from one track (relative orbit) would be sufficient (this also reduces inconsistencies in incidence angle and look direction).
- ASF lets you order radiometrically terrain-corrected (RTC) analysis-ready data (ARD) on demand: Webinar: Sentinel-1 On-Demand RTC Processing: Generating Analysis-Ready SAR Data with ASF Vertex | Earthdata, and Sentinel 1 On-Demand RTC Processing: Generating Analysis Ready SAR Data with ASF Vertex - YouTube (but free access is also limited to certain volumes).
- Under certain conditions (residency) you can apply for a virtual machine with high computing capacity provided by Copernicus RUS: How does RUS work? – Research and User Support. Sentinel data can be accessed directly from a network drive there, which additionally speeds up processing.
- If none of the above works, Google Earth Engine might be worth a try, maybe even a Google virtual machine.
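To illustrate the subset idea: SNAP's command-line Graph Processing Tool (gpt) runs an XML graph per scene, and a Subset node placed before Terrain-Correction cuts the work down to your area of interest. Below is a sketch that writes such a graph from Python; the AOI polygon and file names are placeholders, and the empty `<parameters/>` blocks rely on operator defaults, which you would want to check in SNAP for your use case.

```python
import xml.etree.ElementTree as ET

# Sketch of a SNAP gpt graph matching the chain in the question, plus a
# Subset node (placeholder AOI). Run per scene, e.g.:
#   gpt chain.xml -Pinput=S1A_...zip -Poutput=S1A_..._db.dim
GRAPH = """<graph id="S1-GRD-backscatter">
  <version>1.0</version>
  <node id="Read">
    <operator>Read</operator>
    <sources/>
    <parameters><file>${input}</file></parameters>
  </node>
  <node id="Orbit">
    <operator>Apply-Orbit-File</operator>
    <sources><sourceProduct refid="Read"/></sources>
    <parameters/>
  </node>
  <node id="Cal">
    <operator>Calibration</operator>
    <sources><sourceProduct refid="Orbit"/></sources>
    <parameters/>
  </node>
  <node id="Sub">
    <operator>Subset</operator>
    <sources><sourceProduct refid="Cal"/></sources>
    <parameters>
      <!-- Placeholder WGS84 polygon: replace with your AOI -->
      <geoRegion>POLYGON((30 36, 31 36, 31 37, 30 37, 30 36))</geoRegion>
    </parameters>
  </node>
  <node id="TC">
    <operator>Terrain-Correction</operator>
    <sources><sourceProduct refid="Sub"/></sources>
    <parameters/>
  </node>
  <node id="dB">
    <operator>LinearToFromdB</operator>
    <sources><sourceProduct refid="TC"/></sources>
    <parameters/>
  </node>
  <node id="Write">
    <operator>Write</operator>
    <sources><sourceProduct refid="dB"/></sources>
    <parameters><file>${output}</file></parameters>
  </node>
</graph>"""

with open("chain.xml", "w") as f:
    f.write(GRAPH)
```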
Your processing time is only a few minutes per image, not "slow" by reasonable standards. If you need to process the data yourself, you will need to spread the computing across a number of machines. There are tools both inside and outside SNAP that will help you achieve this. If you have IT support available to help you, even better.
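One low-tech way to spread the work, sketched here with hypothetical scene names and a hypothetical graph file, is to split the scene list into per-machine chunks and generate one gpt call per scene for each worker to run:

```python
def partition(files, n_workers):
    """Round-robin split of the scene list across n_workers machines."""
    return [files[i::n_workers] for i in range(n_workers)]

def gpt_commands(files, graph="chain.xml"):
    """Build one SNAP gpt call per scene (paths are placeholders)."""
    return [
        f"gpt {graph} -Pinput={f} -Poutput={f.replace('.zip', '_db.dim')}"
        for f in files
    ]

# Hypothetical scene list; in practice, glob the downloaded GRD zips
scenes = [f"S1A_scene_{i:03d}.zip" for i in range(5)]
for worker_id, chunk in enumerate(partition(scenes, 2)):
    print(f"worker {worker_id}: {len(chunk)} scenes")
```

Each worker then runs its command list sequentially (or via GNU parallel); gpt's `-P` flags substitute the `${input}`/`${output}` placeholders in the graph.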
Thank you for your reply. I will look into the webinars. I am only using data from south USD, and according to my professor we need all the data for this project. The thing is, I am somehow responsible for all of the Sentinel-1 part of the project. If I can't manage the other options, I will ask about Google Earth Engine; I hope you can help there as well. Have a healthy day.
You are right, the processing speed is reasonable, but because of the volume the total time adds up, and I don't have access to the university's computer center, so I have to make do with the department's PC. Also, I am just trying to speed up the process; sorry if I offended you.
No offense taken. Sentinel products are large, and processing them quasi-instantaneously is usually not possible.
BTW, do you have access to a group of PCs in your department? If you do, it could be possible to set up remote execution on that group of PCs.
You are right, but unfortunately only one big one is accessible to me.
If you have tons of RAM, setting up a RAM disk can help, but realistically, with thousands of images to process, it is going to take quite a long time.
Google Earth Engine already has ALL Sentinel-1 GRD products processed to geocoded, calibrated backscatter (10 m pixel spacing). On Copernicus DIAS instances (cloud compute coupled to the Sentinel archive) you can order Processing-as-a-Service with unit prices per GRD (around 0.3-0.5 Euro per GRD scene) to get the same. I don't know where USD is, but chances are that not all DIAS instances have it in their archive.
If you only have a single workstation (and no money to spend) your best bet is GEE.
Thank you so much. The thing I was required to prepare is already prepared, wow, thanks! I think I will try low-pass filtering and data combining. Your response was heaven-sent. Thank you.