GPT command to export .dim to NetCDF

Hello,
I am trying to use the gpt command in Ubuntu to export a BEAM-DIMAP file to NetCDF, but I cannot get it to work.

My intention is to use the gpt command to get a DEM band and then convert it to a NetCDF file.

I use
./gpt AddElevation /home/user/Documents/S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.zip
and obtain
S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.data
and
S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.dim

However, when I try

./gpt Write -f NetCDF-CF /home/user/Documents/S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.dim

errors occur. Can you help me, please? How should I use this command?

Manu

Can you provide the error text?

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
org.esa.beam.framework.gpf.OperatorException: [output] is null
at org.esa.beam.framework.gpf.internal.OperatorExecutor$GPFImagingListener.errorOccurred(OperatorExecutor.java:385)
at com.sun.media.jai.util.SunTileScheduler.sendExceptionToListener(SunTileScheduler.java:1646)
at com.sun.media.jai.util.SunTileScheduler.scheduleTile(SunTileScheduler.java:921)
at javax.media.jai.OpImage.getTile(OpImage.java:1129)
at com.sun.media.jai.util.RequestJob.compute(SunTileScheduler.java:247)
at com.sun.media.jai.util.WorkerThread.run(SunTileScheduler.java:468)
Caused by: java.lang.IllegalArgumentException: [output] is null
at org.esa.beam.util.Guardian.assertNotNull(Guardian.java:70)
at org.esa.beam.framework.dataio.AbstractProductWriter.writeProductNodes(AbstractProductWriter.java:104)
at org.esa.beam.gpf.operators.standard.WriteOp.writeProductNodes(WriteOp.java:265)
at org.esa.beam.gpf.operators.standard.WriteOp.computeTile(WriteOp.java:276)
at org.esa.beam.framework.gpf.internal.OperatorImage.computeRect(OperatorImage.java:78)
at javax.media.jai.SourcelessOpImage.computeTile(SourcelessOpImage.java:137)
at com.sun.media.jai.util.SunTileScheduler.scheduleTile(SunTileScheduler.java:904)
… 3 more

Error: [output] is null

I think the right way to do it is this:

./gpt Write -Ssource=S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.dim -PformatName=NetCDF-CF -Pfile=S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.nc

Yes, this looks good. But I think the following should work too.

./gpt Write -f NetCDF-CF -t S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.nc S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.dim

Thanks for your answer, but I have tried this command and it does not work. I get the same error as with the first command I used.

The problem with the last command I proposed is that the target file is very big (about 20 GB). Is it normal for it to be so large?

Yes, the size is reasonable. The output is not compressed like the original S1 data.
Try NetCDF4-CF as the format name. This will use compression.
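For example, reusing your earlier Write command (the output file name is just a suggestion):

./gpt Write -Ssource=S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.dim -PformatName=NetCDF4-CF -Pfile=S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.nc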

I have already tried that, but I get a java.lang.OutOfMemoryError: Java heap space. Is there any way to extract only the variables I need and then export them to NetCDF?

Yes, have a look at the Subset operator. There you can specify the variables and the region you are interested in.
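You can list the operator's parameters with the built-in help, which prints all parameter names and descriptions:

./gpt Subset -h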

OK. If I want to extract the "elevation" variable, should I use the following command?

./gpt Subset -Ssource=S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.dim -PtiePointGridNames=elevation

Yes, but you don't need to specify -Ssource; just provide the source file as the last argument. Here is the command line with the format and target file parameters:

./gpt Subset -PtiePointGridNames=elevation -f NetCDF4-CF -t S1A_IW_GRDH_1SDV_20150907T061013_elevation.nc S1A_IW_GRDH_1SDV_20150907T061013_20150907T061038_007605_00A887_68C1.dim

Again I have problems with Java heap space:

Exception in thread "SunTileScheduler0Standard2" java.lang.OutOfMemoryError: Java heap space
Exception in thread "SunTileScheduler0Standard3" java.lang.OutOfMemoryError: Java heap space
Exception in thread "SunTileScheduler0Standard1" java.lang.OutOfMemoryError: Java heap space
Exception in thread "SunTileScheduler0Standard0" java.lang.OutOfMemoryError: Java heap space

Do you know of any way to solve this problem?

You could try creating a subset or reducing the resolution.
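You can also give gpt more memory. Assuming a SNAP-style installation (the path and sizes below are examples, adjust them to your machine), raise the maximum Java heap in <snap-install>/bin/gpt.vmoptions, e.g.

-Xmx8G

gpt also accepts a tile cache size (-c, with K/M/G suffix) and a parallelism level (-q) on the command line, for example:

./gpt Subset -c 2048M -q 4 <your other parameters>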

Thanks. I will try to reduce the resolution.

Hello again. I tried to create a subset, first this way:

gpt Subset -PbandNames='elevation' -PtiePointGridNames='latitude,longitude' -f NetCDF4-CF -PgeoRegion='POLYGON((42.9 -8.5,42.7 -8.5,42.7 -8.7,42.9 -8.7,42.9 -8.5))' -t target.nc target.dim

But the following error occurred:

Error: Cannot construct DataBuffer.

Then I tried the following:

gpt Subset -PbandNames='elevation' -Pregion='6500,2500,8000,8000' -PtiePointGridNames='latitude,longitude' -f NetCDF4-CF -t target.nc target.dim

And that worked.

Is there an error in the first command? Is there another way to do this with -PgeoRegion?

Strange. Your first command looks good and should work.
What kind of product is the target.dim you use as the source? Would it be possible for you to provide it?
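One thing worth double-checking in the meantime: -PgeoRegion expects a WKT geometry with coordinates in (longitude latitude) order. Your polygon starts with 42.9 -8.5, which may be (latitude longitude); with the order swapped (coordinates otherwise kept from your example) the command would read:

gpt Subset -PbandNames='elevation' -PtiePointGridNames='latitude,longitude' -f NetCDF4-CF -PgeoRegion='POLYGON((-8.5 42.9,-8.5 42.7,-8.7 42.7,-8.7 42.9,-8.5 42.9))' -t target.nc target.dim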

It is a BEAM-DIMAP file. It is obtained when I run the following command:

gpt AddElevation S1A_IW_GRDH_1SDV_20150506T064148_20150506T064213_005797_007738_79F2.zip

A target.data directory is also created.

This is the error I get:

edu.ucar.ral.nujan.hdf.HdfException: must call endDefine first
at edu.ucar.ral.nujan.hdf.BaseBlk.throwerr(BaseBlk.java:155)
at edu.ucar.ral.nujan.hdf.HdfGroup.writeDataSub(HdfGroup.java:1026)
at edu.ucar.ral.nujan.hdf.HdfGroup.writeData(HdfGroup.java:987)
at edu.ucar.ral.nujan.netcdf.NhFileWriter.writeTreeDimData(NhFileWriter.java:481)
at edu.ucar.ral.nujan.netcdf.NhFileWriter.endDefine(NhFileWriter.java:446)
at org.esa.snap.dataio.netcdf.nc.N4FileWriteable.create(N4FileWriteable.java:174)
at org.esa.snap.dataio.netcdf.NetCdfWriteProfile.writeProduct(NetCdfWriteProfile.java:50)
at org.esa.snap.dataio.netcdf.DefaultNetCdfWriter.writeProductNodesImpl(DefaultNetCdfWriter.java:62)
at org.esa.snap.core.dataio.AbstractProductWriter.writeProductNodes(AbstractProductWriter.java:109)
at org.esa.snap.core.gpf.common.WriteOp.doExecute(WriteOp.java:287)
at org.esa.snap.core.gpf.internal.OperatorContext.executeOperator(OperatorContext.java:1272)
at org.esa.snap.core.gpf.internal.OperatorImage.computeRect(OperatorImage.java:65)
at javax.media.jai.SourcelessOpImage.computeTile(SourcelessOpImage.java:137)
at com.sun.media.jai.util.SunTileScheduler.scheduleTile(SunTileScheduler.java:904)
at javax.media.jai.OpImage.getTile(OpImage.java:1129)
at com.sun.media.jai.util.RequestJob.compute(SunTileScheduler.java:247)
at com.sun.media.jai.util.WorkerThread.run(SunTileScheduler.java:468)
edu.ucar.ral.nujan.hdf.HdfException: close: the following dataset chunks still need to written:
/elevation chunk indices: [0 0]
/elevation chunk indices: [0 620]
/elevation chunk indices: [0 1240]
/elevation chunk indices: [0 1860]
/elevation chunk indices: [0 2480]
/elevation chunk indices: [0 3100]
/elevation chunk indices: [0 3720]
/elevation chunk indices: [0 4340]
/elevation chunk indices: [0 4960]

/lon chunk indices: [16128 20460]
/lon chunk indices: [16128 21080]
/lon chunk indices: [16128 21700]
/lon chunk indices: [16128 22320]
/lon chunk indices: [16128 22940]
/lon chunk indices: [16128 23560]
/lon chunk indices: [16128 24180]
/lon chunk indices: [16128 24800]
/lon chunk indices: [16128 25420]
/y chunk indices: [0]
/x chunk indices: [0]

at edu.ucar.ral.nujan.hdf.BaseBlk.throwerr(BaseBlk.java:155)
at edu.ucar.ral.nujan.hdf.HdfFileWriter.close(HdfFileWriter.java:531)
at edu.ucar.ral.nujan.netcdf.NhFileWriter.close(NhFileWriter.java:515)
at org.esa.snap.dataio.netcdf.nc.N4FileWriteable.close(N4FileWriteable.java:183)
at org.esa.snap.dataio.netcdf.DefaultNetCdfWriter.close(DefaultNetCdfWriter.java:123)
at org.esa.snap.core.gpf.common.WriteOp.dispose(WriteOp.java:428)
at org.esa.snap.core.gpf.common.WriteOp.writeProduct(WriteOp.java:230)
at org.esa.snap.core.gpf.GPF.writeProduct(GPF.java:404)
at org.esa.snap.core.gpf.main.DefaultCommandLineContext.writeProduct(DefaultCommandLineContext.java:63)
at org.esa.snap.core.gpf.main.CommandLineTool.writeProduct(CommandLineTool.java:501)
at org.esa.snap.core.gpf.main.CommandLineTool.runOperator(CommandLineTool.java:281)
at org.esa.snap.core.gpf.main.CommandLineTool.runGraphOrOperator(CommandLineTool.java:247)
at org.esa.snap.core.gpf.main.CommandLineTool.run(CommandLineTool.java:151)
at org.esa.snap.core.gpf.main.CommandLineTool.run(CommandLineTool.java:123)
at org.esa.snap.core.gpf.main.GPT.run(GPT.java:54)
at org.esa.snap.core.gpf.main.GPT.main(GPT.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.esa.snap.runtime.Launcher.lambda$run$12(Launcher.java:55)
at org.esa.snap.runtime.Engine.runClientCode(Engine.java:186)
at org.esa.snap.runtime.Launcher.run(Launcher.java:51)
at org.esa.snap.runtime.Launcher.main(Launcher.java:31)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.exe4j.runtime.LauncherEngine.launch(LauncherEngine.java:62)
at com.install4j.runtime.launcher.UnixLauncher.main(UnixLauncher.java:57)

org.esa.snap.core.gpf.OperatorException: Cannot construct DataBuffer.
at org.esa.snap.core.gpf.internal.OperatorExecutor$GPFImagingListener.errorOccurred(OperatorExecutor.java:375)
at com.sun.media.jai.util.SunTileScheduler.sendExceptionToListener(SunTileScheduler.java:1646)
at com.sun.media.jai.util.SunTileScheduler.scheduleTile(SunTileScheduler.java:921)
at javax.media.jai.OpImage.getTile(OpImage.java:1129)
at com.sun.media.jai.util.RequestJob.compute(SunTileScheduler.java:247)
at com.sun.media.jai.util.WorkerThread.run(SunTileScheduler.java:468)
Caused by: java.lang.RuntimeException: Cannot construct DataBuffer.
at com.sun.media.jai.util.DataBufferUtils.constructDataBuffer(DataBufferUtils.java:132)
at com.sun.media.jai.util.DataBufferUtils.createDataBufferFloat(DataBufferUtils.java:214)
at javax.media.jai.ComponentSampleModelJAI.createDataBuffer(ComponentSampleModelJAI.java:271)
at javax.media.jai.RasterFactory.createWritableRaster(RasterFactory.java:691)
at javax.media.jai.PlanarImage.createWritableRaster(PlanarImage.java:1982)
at javax.media.jai.PlanarImage.getData(PlanarImage.java:2160)
at javax.media.jai.PlanarImage.getData(PlanarImage.java:2016)
at com.bc.ceres.glevel.MultiLevelImage.getData(MultiLevelImage.java:59)
at org.esa.snap.dataio.netcdf.metadata.profiles.cf.CfTiePointGridPart.encode(CfTiePointGridPart.java:50)
at org.esa.snap.dataio.netcdf.NetCdfWriteProfile.writeProduct(NetCdfWriteProfile.java:52)
at org.esa.snap.dataio.netcdf.DefaultNetCdfWriter.writeProductNodesImpl(DefaultNetCdfWriter.java:62)
at org.esa.snap.core.dataio.AbstractProductWriter.writeProductNodes(AbstractProductWriter.java:109)
at org.esa.snap.core.gpf.common.WriteOp.doExecute(WriteOp.java:287)
at org.esa.snap.core.gpf.internal.OperatorContext.executeOperator(OperatorContext.java:1272)
at org.esa.snap.core.gpf.internal.OperatorImage.computeRect(OperatorImage.java:65)
at javax.media.jai.SourcelessOpImage.computeTile(SourcelessOpImage.java:137)
at com.sun.media.jai.util.SunTileScheduler.scheduleTile(SunTileScheduler.java:904)
… 3 more

Error: Cannot construct DataBuffer.

I think you should first geocode the data with Terrain-Correction before trying to add the elevation.

OK, I am going to try it. Should I run the command like this?

gpt Terrain-Correction -t S1A_IW_GRDH_1SDV_20150506T064148_20150506T064213_005797_007738_79F2.dim S1A_IW_GRDH_1SDV_20150506T064148_20150506T064213_005797_007738_79F2.zip

Thank you very much. I have tried it and now it works.
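Glad it works. For reference, the whole chain might look like this (file names shortened here; adjust paths, the region, and any operator parameters to your case):

gpt Terrain-Correction -t S1A_..._TC.dim S1A_....zip
gpt AddElevation -t S1A_..._TC_elev.dim S1A_..._TC.dim
gpt Subset -PbandNames='elevation' -Pregion='6500,2500,8000,8000' -f NetCDF4-CF -t elevation.nc S1A_..._TC_elev.dim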