NullPointerException with Terrain Flattening of TDX

I’m trying to Terrain-Flatten a TDX stripmap image in the GUI, but it fails in various ways. The base image is a stripmap scene of forest in central British Columbia, of which I took a subset (covering about 1/3 of the scene) and which I calibrated to beta nought.

For instance, here are the results of my test runs – I applied Terrain Flattening a couple of times to the same image and it never worked (a rough snappy sketch of run 1 follows the list):

  1. Terrain Flattening with CDEM and bilinear interpolation. Processing completed in 15 seconds, resulting in an image with just a Gamma0 band filled with zeros.
  2. Terrain Flattening with CDEM and bilinear interpolation. Processing completed in 7 seconds, again resulting in a zero-filled Gamma0 band.
  3. Terrain Flattening with a custom DEM and bilinear interpolation. The DEM is a 1 m LiDAR-derived DEM covering only part of the image. Failed after about a minute with java.lang.NullPointerException.
  4. Terrain Flattening with CDEM and bilinear interpolation. Failed after about a minute with java.lang.NullPointerException.
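
For reference, run 1 in headless form would look roughly like the snappy sketch below. The operator alias and the demName/demResamplingMethod parameter names are taken from the Terrain-Flattening operator help; the file paths are placeholders:

```python
# Rough snappy reproduction of run 1 (paths are placeholders).
from snappy import ProductIO, GPF, jpy

HashMap = jpy.get_type('java.util.HashMap')

# Subset of the TDX scene, already calibrated to beta nought
src = ProductIO.readProduct('TDX_subset_beta0.dim')

params = HashMap()
params.put('demName', 'CDEM')
params.put('demResamplingMethod', 'BILINEAR_INTERPOLATION')

gamma0 = GPF.createProduct('Terrain-Flattening', params, src)
ProductIO.writeProduct(gamma0, 'TDX_subset_gamma0', 'BEAM-DIMAP')
```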

My guess is that this is some kind of environment issue, possibly related to memory. I think it’s an environment issue because runs 1, 2 and 4 are exactly the same thing, yet I ran into different problems. Watching the Performance Toolbar, I noticed that the NullPointerException always occurred when memory use was at its highest over the minute the Terrain Flattening task ran; it showed something like 3.5 GB/4.3 GB both times.

I’m running this on 64-bit Windows with 64 GB of RAM.
VM parameters from the SNAP Performance Configuration Optimizer: -Xmx45056m -Xms256m -XX:+AggressiveOpts -Xverify:none -Dnetbeans.mainclass=org.esa.snap.main.Main -Dsun.java2d.noddraw=true -Dsun.awt.nopixfmt=true -Dsun.java2d.dpiaware=false
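
As an aside, one way to double-check what heap a SNAP JVM actually gets (at least on the snappy scripting side – the desktop GUI reads its options from snap.conf in the install directory) is something like:

```python
# Print the max heap visible to the JVM behind snappy. This only checks the
# scripting side, not the desktop GUI's own VM parameters.
from snappy import jpy

Runtime = jpy.get_type('java.lang.Runtime')
print(Runtime.getRuntime().maxMemory() / 1024**3, 'GiB max heap')
```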

The message log: messages.log (75.4 KB)

Anybody got an idea what’s going on here and how to solve it?

I also just tested this on a VV & VH Sentinel-1 IW GRDH image of roughly the same area. Previous processing steps: subset, thermal noise removal, calibration to beta nought (a rough snappy sketch of the whole chain follows the list).

  1. TF with CDEM (NN resampling): results in zero-filled bands after 40s.
  2. TF with 1s SRTM HGT (NN resampling): results in a zero-filled Gamma0_VH band while the Gamma0_VV band is produced without issue. That took 52s.
  3. TF with the custom 1m LiDAR DEM (NN resampling): NullPointerException after about 5 minutes.
  4. TF with CDEM (NN resampling): NullPointerException after 9 minutes.
  5. TF with 1s SRTM HGT (bilinear interpolation): Took 37s and I got the same result as in 2.
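
For reference, the whole chain in snappy terms would look roughly like the sketch below. Operator aliases and parameter names are taken from the S1TBX operator documentation; paths and output names are placeholders:

```python
# Rough snappy sketch of the S1 chain: thermal noise removal ->
# calibration to beta nought -> terrain flattening (paths are placeholders).
from snappy import ProductIO, GPF, jpy

HashMap = jpy.get_type('java.util.HashMap')

grd = ProductIO.readProduct('S1_IW_GRDH_subset.dim')

noise_removed = GPF.createProduct('ThermalNoiseRemoval', HashMap(), grd)

cal_params = HashMap()
cal_params.put('outputBetaBand', True)   # Terrain Flattening expects beta0 input
calibrated = GPF.createProduct('Calibration', cal_params, noise_removed)

tf_params = HashMap()
tf_params.put('demName', 'SRTM 1Sec HGT')
tf_params.put('demResamplingMethod', 'NEAREST_NEIGHBOUR')
flattened = GPF.createProduct('Terrain-Flattening', tf_params, calibrated)

ProductIO.writeProduct(flattened, 'S1_gamma0', 'BEAM-DIMAP')
```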

I can see that I successfully ran test 5 on 2017-04-20 with the same image and got both bands.

(1s SRTM HGT doesn’t work with the TDX image – I got an error that the spatial resolution of the DEM is too low, which totally makes sense as the TDX image has 3 m resolution.)

There is no benefit in using a DEM that is of higher resolution than your target pixel spacing. So if your target is 3m pixel size you can downsample your 1m DEM to 3 meters.
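
For example, something like this with the GDAL Python bindings should do it – file names are placeholders, and 'average' is a reasonable resampling choice when aggregating a DEM:

```python
# Downsample the 1 m LiDAR DEM to the ~3 m target pixel spacing.
from osgeo import gdal

ds = gdal.Warp('lidar_dem_3m.tif', 'lidar_dem_1m.tif',
               xRes=3, yRes=3, resampleAlg='average')
ds = None  # dereference to flush the output to disk
```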

Okay, I can see that I don’t need a higher resolution. But can I expect this to solve my problem? Doesn’t SNAP do this somewhere in the process anyway? It’s not like I’m actually running out of memory – I’m not even using 10% of -Xmx.

I finally managed to check whether SNAP can read the 1m LiDAR DEM. It can. Terrain Correction works perfectly fine with that DEM.

Hi Lucas, I’m having the same problem here. I’m working with COSMO-SkyMed Himage (Stripmap mode) data with ~2.4 m resolution. I had never managed to get Terrain Flattening to work, but now a reviewer has asked me to do it on my data, so I tested it again on a small subset. I only produced a non-blank Gamma0 image when I first applied Multilook to lower my SAR image resolution to approximately 12 m, the same as my DEM.
Also, the algorithm only runs when the Re-grid option is deselected; I don’t yet know the reason for this behaviour.
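
In snappy terms, the workaround would look roughly like the sketch below – the look counts and paths are illustrative, and the reGridMethod parameter name (guessed from the GUI’s Re-grid checkbox) should be double-checked against your SNAP version:

```python
# Hedged sketch: multilook the ~2.4 m Himage product down to ~12 m,
# then terrain-flatten against the external 12 m DEM with Re-grid off.
from snappy import ProductIO, GPF, jpy

HashMap = jpy.get_type('java.util.HashMap')

src = ProductIO.readProduct('CSK_HI_beta0.dim')  # placeholder input

ml_params = HashMap()
ml_params.put('nRgLooks', 5)   # ~2.4 m * 5 looks ~= 12 m, matching the DEM
ml_params.put('nAzLooks', 5)
multilooked = GPF.createProduct('Multilook', ml_params, src)

tf_params = HashMap()
tf_params.put('demName', 'External DEM')
tf_params.put('externalDEMFile', 'dem_12m.tif')  # placeholder path
tf_params.put('reGridMethod', False)  # parameter name assumed from the GUI option
flattened = GPF.createProduct('Terrain-Flattening', tf_params, multilooked)

ProductIO.writeProduct(flattened, 'CSK_gamma0', 'BEAM-DIMAP')
```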