Thanks for your reply.
Yes, it might be something in the processing chain, as it is actually quite difficult to understand which tools should be used in which order. The processing chain also varies between forum topics, so there is a lot of experimenting with what happens when something is changed, and whether it works or not.
The output didn't change with or without debugging; the problematic part of it was:
Wed Apr 12 16:27:38 2017, Calculating Maximum backscatter: d:\file1.dim
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Program%20Files/snap/snap/modules/ext/org.esa.snap.snap-netcdf/org-slf4j/slf4j-simple.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/…/SNAP/modules/ext/org.esa.snap.snap-netcdf/org-slf4j/slf4j-simple.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See SLF4J Error Codes for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
Wed Apr 12 16:27:40 2017 Writing: statsname.tif
java.lang.NullPointerException
java.lang.NullPointerException
java.lang.NullPointerException
java.lang.NullPointerException
java.lang.NullPointerException
Traceback (most recent call last):
  File "s1-spk.py", line 395, in <module>
    statsbackscatter(file1, 'Maximum')
  File "s1-spk.py", line 311, in statsbackscatter
    snappy.ProductIO.writeProduct(targetDB, productpath, 'GeoTIFF-BigTIFF')
RuntimeError: org.esa.snap.core.gpf.OperatorException: java.lang.NullPointerException
This happens inside the function statsbackscatter(file1, stat):
This part works well if I write the speckle-filtered product to disk and open it again (the commented lines between the ###test### markers).
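For completeness, the top of the script has the usual snappy setup, roughly like this (written from memory, so treat the exact lines as an assumption):

import time
import snappy
from snappy import GPF, jpy
HashMap = jpy.get_type('java.util.HashMap')
# some SNAP versions also need the operator SPIs loaded explicitly:
# GPF.getDefaultInstance().getOperatorSpiRegistry().loadOperatorSpis()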
print ("\n", time.asctime(time.localtime()), "\nCalculating", stat, "backscatter", file1)
source1 = snappy.ProductIO.readProduct(file1)
#deburst: merge the TOPSAR bursts into one continuous image
parameters = HashMap()
targetTOPSARDeburst = GPF.createProduct("TOPSAR-Deburst", parameters, source1)
#multi-temporal speckle filter across the stack
parameters = HashMap()
targetSpk = GPF.createProduct("Multi-Temporal-Speckle-Filter", parameters, targetTOPSARDeburst)
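# an empty parameters HashMap means the operator runs with its default settings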
###test###
##Write
#testpath = "spk.dim"
#print (time.asctime(time.localtime()), 'Writing:', testpath)
#snappy.ProductIO.writeProduct(targetSpk, testpath, 'BEAM-DIMAP')
#reopen
#targetSpk = snappy.ProductIO.readProduct(testpath)
###test###
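# note: enabling the write/reopen lines above materializes the product on disk,
# which breaks the lazy GPF chain - that seems to be why the NPE disappears then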
#stats
parameters = HashMap()
parameters.put('statistic', stat)
targetStat = GPF.createProduct("Stack-Averaging", parameters, targetSpk)
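# 'stat' comes from the function argument; per the traceback it is 'Maximum' here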
#multilook
parameters = HashMap()
targetMultilook = GPF.createProduct("Multilook", parameters, targetStat)
#terrain correction
parameters = HashMap()
parameters.put('demName', 'External DEM')
parameters.put('externalDEMFile', 'dtm.tif')
parameters.put('nodataValueAtSea', 'false')
targetTerrain = GPF.createProduct("Terrain-Correction", parameters, targetMultilook)
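# Range-Doppler terrain correction using the external DEM; GPF converts the
# string 'false' for nodataValueAtSea to a boolean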
#linear to dB
parameters = HashMap()
targetDB = GPF.createProduct("LinearToFromdB", parameters, targetTerrain)
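# converts the linear backscatter values to dB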
###Write
productpath = path + statsname
print (time.asctime(time.localtime()), 'Writing:', statsname)
snappy.ProductIO.writeProduct(targetDB, productpath, 'GeoTIFF-BigTIFF')
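# writeProduct pulls the data through the whole chain tile by tile, so a
# NullPointerException from any earlier operator only surfaces here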
# clean memory
source1.dispose()
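One thing I can still try is writing out each intermediate product the same way as in the ###test### block, to see which operator the NullPointerException actually comes from. A sketch (the output file names are made up):

#debug sketch: materialize each step on disk to find the failing operator
for name, product in [('deburst', targetTOPSARDeburst),
                      ('spk', targetSpk),
                      ('stat', targetStat),
                      ('ml', targetMultilook),
                      ('tc', targetTerrain)]:
    snappy.ProductIO.writeProduct(product, name + '.dim', 'BEAM-DIMAP')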
I hope these parts help us find out what is going wrong.