Trouble writing calibrated RS2 quad-pol product (some bands appear empty)

Hello,

I am setting up a fully polarimetric RADARSAT-2 image analysis workflow through snappy. The first step I'm implementing is radiometric calibration, and this step seems to work fine.

When trying to write the target product to BEAM-DIMAP, the following error messages are thrown:

```
javax.imageio.IIOException: I/O error reading image metadata!
java.lang.IllegalArgumentException: Empty region!
```

This only happens when more than one I/Q pair is passed to the calibration. Furthermore, the product that gets saved (despite the errors) shows ~130 MB per i or q band, which would suggest data is present, but when opened in SNAP every other band is blank (i_HV, q_HV, i_VV, q_VV in the example below).

Does anybody know what I might be missing?

SDB

#### Simplified code:

```python
import os

import snappy
from snappy import ProductIO
from snappy import GPF
from snappy import jpy
from snappy import ProductUtils

GPF.getDefaultInstance().getOperatorSpiRegistry().loadOperatorSpis()
HashMap = jpy.get_type('java.util.HashMap')

# Read the source product (product_path is defined earlier in my script)
source_product = ProductIO.readProduct(product_path)

# Radar -> Radiometric -> Calibrate, keeping the complex output
parameters = HashMap()
parameters.put('sourceBands', 'i_HH,q_HH,i_HV,q_HV,i_VH,q_VH,i_VV,q_VV')
parameters.put('outputImageInComplex', True)

target_product = GPF.createProduct('Calibration', parameters, source_product)

# Write the target product (image_name is defined earlier in my script)
ProductIO.writeProduct(target_product, image_name + '_cal.dim', 'BEAM-DIMAP')
```
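
Incidentally, a quick way to check whether a given band of the calibrated product actually holds data, independently of the writer, might be something like the following (just a sketch; the band name and the single-line read are only an example):

```python
import numpy as np

# Read one line of pixels from one of the calibrated bands and look at the
# values; the band name 'i_HV' is only an example.
band = target_product.getBand('i_HV')
width = target_product.getSceneRasterWidth()
line = np.zeros(width, dtype=np.float32)
band.readPixels(0, 0, width, 1, line)
print('i_HV, line 0: min=%g, max=%g' % (line.min(), line.max()))
```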

For the sake of completeness, here is how I solved my problem.

It turned out to be a memory issue, which was resolved by using the SNAP performance optimisation tool (I was directed there by this interesting forum post: GPT batch processing abnormality). It is located under SNAP -> Preferences -> Performance. Running "Compute" on the system parameters adjusted the Java heap limits (from the default -Xmx5120m -Xms256m to -Xmx1468m -Xms1468m on my system), and the product writer can now complete its task.
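
To double-check which limits the JVM started by snappy actually picks up, the standard Java Runtime can be queried through jpy (a small sketch; java.lang.Runtime is plain Java, nothing SNAP-specific):

```python
from snappy import jpy

# Ask the running JVM for its effective maximum heap size, to verify which
# -Xmx value actually took effect.
Runtime = jpy.get_type('java.lang.Runtime')
max_heap_mb = Runtime.getRuntime().maxMemory() / (1024.0 * 1024.0)
print('JVM max heap: %.0f MB' % max_heap_mb)
```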

I hope this can help anyone who faces a similar issue in the future!

SDB

Are these numbers correct? You have decreased the heap space size and now it works?

Hi marpet,

Yes. These parameters enabled me to write the backscattering matrix in BEAM-DIMAP. I’m now facing memory issues when writing the covariance matrix, but I hope using a subset of my image will be enough to resolve this.
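
For reference, the subsetting step I have in mind would look roughly like this, using the Subset operator on the calibrated product from the script above (the pixel region is only an example):

```python
from snappy import GPF, jpy

HashMap = jpy.get_type('java.util.HashMap')

# Spatial subset of the calibrated product before computing the covariance
# matrix; 'region' is x,y,width,height in pixels and is only an example.
parameters = HashMap()
parameters.put('copyMetadata', True)
parameters.put('region', '0,0,2000,2000')

subset_product = GPF.createProduct('Subset', parameters, target_product)
```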

SDB

Follow-up: the memory problems persist on both a Mac (8 GB) and a Windows 7 PC (16 GB).
On the 8 GB Mac, I can successfully write a product to BEAM-DIMAP when it is expressed as a scattering matrix (4 I/Q pairs: 8 files), but writing a polarimetric matrix product (9 files) fails with one of the following errors:

```
RuntimeError: org.esa.snap.core.gpf.OperatorException: Cannot construct DataBuffer
RuntimeError: org.esa.snap.core.gpf.OperatorException: Java heap space
```
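
For context, the polarimetric matrix product is generated roughly like this (operator and parameter names quoted from memory, so please double-check them):

```python
from snappy import GPF, jpy

HashMap = jpy.get_type('java.util.HashMap')

# Build the C3 covariance matrix from the calibrated, complex quad-pol
# product; 'T3' would give the coherency matrix instead.
parameters = HashMap()
parameters.put('matrix', 'C3')

matrix_product = GPF.createProduct('Polarimetric-Matrices', parameters, target_product)
```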

On the 16 GB PC, it is impossible to write to BEAM-DIMAP through snappy, whether the product is simply the original RS2 product or a processed one (calibrated and filtered, for instance). Writing fails with:

```
RuntimeError: java.lang.OutOfMemoryError: Java heap space
```

What I tried:
Explicitly adjusting the Java heap parameters, either through the snap-conf-optimiser tool or in the snappy.ini, jpyconfig.py or snappy-conf.bat files, doesn't solve the problem. Neither does setting disableFileCache and useTileFileCache to true and false (or vice versa) in snap.properties. SNAP, the OS and Python are all 64-bit (as is the processor), and there is sufficient storage space (more than 100 GB on the PC).
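
One more thing on my list is to write through GPF directly instead of ProductIO, with incremental writing switched off. This is only a sketch, assuming the GPF.writeProduct overload below is available in my snappy version (the output name is a placeholder):

```python
from snappy import GPF, jpy

File = jpy.get_type('java.io.File')
ProgressMonitor = jpy.get_type('com.bc.ceres.core.ProgressMonitor')

# Write via GPF with incremental writing disabled; 'target_cal.dim' is a
# placeholder output name.
GPF.writeProduct(target_product,
                 File('target_cal.dim'),
                 'BEAM-DIMAP',
                 False,                 # incremental
                 ProgressMonitor.NULL)  # no progress reporting
```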

Apart from that, I'm pretty lost as to what else I can do! Any ideas?

Thank you in advance for your time,
SDB

PS. I also tried to attach a progress monitor to the writing operator to get more information, but I cannot find the name of the right progress monitor class in snappy.
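
If the class I am after is com.bc.ceres.core.PrintWriterProgressMonitor (I am not sure it is), the attempt would look something like this:

```python
from snappy import jpy

# Candidate progress monitor (class name is a guess): prints progress
# messages to the console while the product is being written.
PrintWriterProgressMonitor = jpy.get_type('com.bc.ceres.core.PrintWriterProgressMonitor')
JavaSystem = jpy.get_type('java.lang.System')
monitor = PrintWriterProgressMonitor(JavaSystem.out)

# It would then replace ProgressMonitor.NULL in the GPF.writeProduct call above.
```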