S-1 TOPS ESD Coregistration with snappy

Hello step-forum community,

I want to replicate the RUS webinar "Rapid Landslide Detection with Sentinel-1 - HAZA07" using snappy. With the script below I’m trying to apply the S-1 TOPS ESD Coregistration step (22:17 - 27:46). When I run the script, Python gets stuck on writing the output. No error messages are returned; instead, I’m forced to close/reset the console after waiting an hour. In contrast, the SNAP GUI completes coregistration in a few seconds.

I’m new to snappy, so I could be using incorrect parameters for the snappy operators, or the order of my workflow could be wrong.

What do you think is happening?

Thanks for your help!

import os
import snappy
from glob import iglob
from os.path import join

# Read
product_path = '/home/huw/Documents/insar/iceland/original/'
input_S1_files = sorted(list(iglob(join(product_path, '**', '*S1*.zip'),
                                   recursive=True)))
pre_landslide_s1 = snappy.ProductIO.readProduct(input_S1_files[0])
post_landslide_s1 = snappy.ProductIO.readProduct(input_S1_files[1])

def topsar_split(product_s1):
    parameters = snappy.HashMap()
    parameters.put('subswath', 'IW2')
    parameters.put('selectedPolarisations', 'VV')
    parameters.put('firstBurstIndex', 6)
    parameters.put('lastBurstIndex', 7)
    topsar = snappy.GPF.createProduct('TOPSAR-Split', parameters, product_s1)
    return topsar

pre_landslide_topsar = topsar_split(pre_landslide_s1)
post_landslide_topsar = topsar_split(post_landslide_s1)

def apply_orbital_file(product_topsar):
    parameters = snappy.HashMap()
    parameters.put('orbitType', 'Sentinel Precise (Auto Download)')
    parameters.put('polyDegree', 3)
    orbital = snappy.GPF.createProduct('Apply-Orbit-File', parameters, product_topsar)
    return orbital

pre_landslide_orbital = apply_orbital_file(pre_landslide_topsar)
post_landslide_orbital = apply_orbital_file(post_landslide_topsar)

# Back-Geocoding
parameters = snappy.HashMap()
parameters.put('demName', 'GETASSE30 (Auto Download)')
parameters.put('demResamplingMethod', 'BILINEAR_INTERPOLATION')
parameters.put('resamplingType', 'BILINEAR_INTERPOLATION')
parameters.put('maskOutAreaWithoutElevation', True)
parameters.put('outputDerampDemodPhase', True)
landslide_geocoding = snappy.GPF.createProduct('Back-Geocoding', parameters,
                                               [pre_landslide_orbital, post_landslide_orbital])

# Enhanced-Spectral-Diversity
parameters = snappy.HashMap()
landslide_esd = snappy.GPF.createProduct('Enhanced-Spectral-Diversity', parameters,
                                         landslide_geocoding)

# Write
output_path = join(product_path, 'landslide_esd')
snappy.ProductIO.writeProduct(landslide_esd, output_path, 'BEAM-DIMAP')

Do you save the processed data to disk when you do it in the SNAP GUI?
I don’t think that everything is processed within seconds.

Yes, when I use SNAP GUI I save it to my disk.
I can’t work out why there is no output when I try to run the script above in python.

SNAP and snappy generally don’t do any calculations until the values are needed, so the GUI will indicate that a file has been saved long before all the data are actually written (if you try to open the output file too soon, you will find output data missing). If the snappy process is much slower than the GUI, you may be using different memory settings or asking for more RAM than the system can provide. Compare the SNAP GUI setting with java_max_mem in snappy.ini (on this 16GB system, both are set to 11G, so running both at the same time could cause one to be very slow). The Windows Task Manager Performance panel should provide some information on memory usage and disk I/O.
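This pull-processing behaviour can be pictured with an ordinary Python generator (a snappy-free sketch just to illustrate the deferred-computation idea; none of the names below are SNAP API):

```python
def build_chain(n):
    """Builds a 'processing graph' but does no work yet, like GPF.createProduct()."""
    for i in range(n):
        yield i * i  # only computed when a consumer pulls values

chain = build_chain(10**6)                     # returns instantly; nothing computed yet
first_three = [next(chain) for _ in range(3)]  # the work happens here, on demand
print(first_three)                             # [0, 1, 4]
```

In the same way, the createProduct() calls return immediately and the real computation only starts when writeProduct() pulls the data through the chain.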

Thanks for your response.

I hope RAM allocation isn’t the problem. When I first installed snappy I followed this very helpful SNAP Wiki and another useful tutorial. The system I’m using has 64GB of RAM, and I configured the memory allocation based on the linked tutorials. Maybe I did something wrong here?

- java_max_mem: 51G (80% of 64GB)
- jvm_maxmem = '51G' (80% of 64GB)
- 80% of the java_max_mem value

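For anyone wanting to double-check the same thing, this is how the value can be read back out of snappy.ini (a minimal sketch; it assumes the setting lives in the file’s [DEFAULT] section, which is how my snappy.ini is laid out):

```python
import configparser

def read_java_max_mem(ini_text):
    """Parse snappy.ini content and return the java_max_mem value (e.g. '51G')."""
    cp = configparser.ConfigParser()
    cp.read_string(ini_text)
    # keys under [DEFAULT] are exposed via defaults()
    return cp.defaults().get('java_max_mem')

# hypothetical snappy.ini content, matching the settings listed above
sample = "[DEFAULT]\njava_max_mem: 51G\n"
print(read_java_max_mem(sample))  # 51G
```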
After configuration, I tested snappy by following this RUS webinar where I processed Sentinel-1 SAR data with no RAM problems. (I hope RUS make more snappy-python tutorials, they are helpful!).

I don’t have the SNAP GUI open when I run snappy in Python. In the SNAP GUI, Help > About SNAP shows Memory: 39410 MiB. Is this the memory allocation for the SNAP GUI? How do I change it?

Do you think I still have a RAM allocation problem or is something wrong with my input parameters for the snappy operators?

I’d also like to add that when I run my script now, a 2.5 MB file is written while the code runs. However, after waiting an hour the code is still running and the output file is still only 2.5 MB. Memory usage increased by 5%.

I wonder if Python is stalling somewhere, maybe in an Auto Download? Sometimes you get a corrupt download that causes problems later, so you might check the already-downloaded files. I generally prefer to do all downloads before starting batch processing: one less failure mode to worry about.
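As a quick way to check the already-downloaded scenes before processing, you could CRC-test each archive. A sketch using only the standard library (the glob pattern mirrors the one in the script above; `check_s1_zips` is just an illustrative helper name):

```python
import zipfile
from glob import iglob
from os.path import join

def check_s1_zips(product_path):
    """Split Sentinel-1 zip archives under product_path into (ok, bad) lists."""
    ok, bad = [], []
    for path in sorted(iglob(join(product_path, '**', '*S1*.zip'), recursive=True)):
        try:
            with zipfile.ZipFile(path) as zf:
                # testzip() returns the first member with a bad CRC, or None
                corrupt_member = zf.testzip()
            (bad if corrupt_member else ok).append(path)
        except zipfile.BadZipFile:
            # truncated/interrupted downloads usually fail here
            bad.append(path)
    return ok, bad
```

This won’t catch every problem (a file can be a valid zip but still contain the wrong data), but it reliably flags truncated downloads before SNAP trips over them.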