HDF error trying to export granule to HDF5 format

I have a Sentinel-1 granule (S1A_S1_SLC__1SSV_20141018T142338_20141018T142403_002885_003443_654) that I opened in SNAP. When I try to export it to HDF5 format, I get the following error:

org.esa.snap.core.dataio.ProductIOException: HDF library error: null
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.createH5G(Hdf5ProductWriter.java:708)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeMetadataElement(Hdf5ProductWriter.java:410)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeMetadataElement(Hdf5ProductWriter.java:419)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeMetadataElement(Hdf5ProductWriter.java:419)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeMetadataElement(Hdf5ProductWriter.java:419)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeMetadataElement(Hdf5ProductWriter.java:419)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeMetadataElement(Hdf5ProductWriter.java:419)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeMetadata(Hdf5ProductWriter.java:404)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeProductNodesImpl(Hdf5ProductWriter.java:145)
at org.esa.snap.core.dataio.AbstractProductWriter.writeProductNodes(AbstractProductWriter.java:109)
at org.esa.snap.core.dataio.ProductIO.writeProduct(ProductIO.java:393)
[catch] at org.esa.snap.rcp.actions.file.WriteProductOperation.writeProduct(WriteProductOperation.java:148)
at org.esa.snap.rcp.actions.file.WriteProductOperation.run(WriteProductOperation.java:115)
at org.netbeans.modules.progress.ui.RunOffEDTImpl$3.run(RunOffEDTImpl.java:275)
at org.openide.util.RequestProcessor$Task.run(RequestProcessor.java:1423)
at org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:2033)

Hi,

At least I can confirm that this happens. I don’t know why. Maybe there is something special about S1 data, maybe the size? I tried to convert MERIS data to HDF5 and it works, so it is not the HDF5 export in general that has a problem.
I’ve created a ticket in our issue tracker.

Can you work with a different format? Maybe you can use NetCDF4?

Yes, I found that BEAM-DIMAP works fine, as does BigTIFF, but we really wanted to use HDF5, as our user community prefers that format over the others. I will try a few other formats as well, just to see how they do.

Thanks,
CS

Hi,
I am having the same problem. I am running GPT and trying to write an OLCI product, after subsetting, to the HDF5 file format. However, I get this error:
org.esa.snap.core.gpf.OperatorException: Not able to write product file: ‘/Users/Mark/OneDrive/Code/OLCI-MPH-Processor/output/Hartbeespoort/refl/S3A_OL_1_EFR____20160509T074147_20160509T074447_20160509T095230_0179_004_049_3419_SVL_O_NR_001_refl.h5’
at org.esa.snap.core.gpf.graph.GraphProcessor$GPFImagingListener.errorOccurred(GraphProcessor.java:373)
at com.sun.media.jai.util.SunTileScheduler.sendExceptionToListener(SunTileScheduler.java:1654)
at com.sun.media.jai.util.SunTileScheduler.scheduleTile(SunTileScheduler.java:929)
at javax.media.jai.OpImage.getTile(OpImage.java:1139)
at com.sun.media.jai.util.RequestJob.compute(SunTileScheduler.java:255)
at com.sun.media.jai.util.WorkerThread.run(SunTileScheduler.java:476)
Caused by: org.esa.snap.core.gpf.OperatorException: Not able to write product file: ‘/Users/Mark/OneDrive/Code/OLCI-MPH-Processor/output/Hartbeespoort/refl/S3A_OL_1_EFR____20160509T074147_20160509T074447_20160509T095230_0179_004_049_3419_SVL_O_NR_001_refl.h5’
at org.esa.snap.core.gpf.common.WriteOp.doExecute(WriteOp.java:296)
at org.esa.snap.core.gpf.internal.OperatorContext.executeOperator(OperatorContext.java:1242)
at org.esa.snap.core.gpf.internal.OperatorImage.computeRect(OperatorImage.java:65)
at javax.media.jai.SourcelessOpImage.computeTile(SourcelessOpImage.java:145)
at com.sun.media.jai.util.SunTileScheduler.scheduleTile(SunTileScheduler.java:912)
… 3 more
Caused by: org.esa.snap.core.dataio.ProductIOException: HDF library error: null
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.createH5G(Hdf5ProductWriter.java:708)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeMetadataElement(Hdf5ProductWriter.java:410)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeFlagCodings(Hdf5ProductWriter.java:392)
at org.esa.snap.dataio.hdf5.Hdf5ProductWriter.writeProductNodesImpl(Hdf5ProductWriter.java:144)
at org.esa.snap.core.dataio.AbstractProductWriter.writeProductNodes(AbstractProductWriter.java:109)
at org.esa.snap.core.gpf.common.WriteOp.doExecute(WriteOp.java:294)
… 7 more

Error: Not able to write product file: ‘/Users/Mark/OneDrive/Code/OLCI-MPH-Processor/output/Hartbeespoort/refl/S3A_OL_1_EFR____20160509T074147_20160509T074447_20160509T095230_0179_004_049_3419_SVL_O_NR_001_refl.h5’

Any progress on this error?

Thanks!

Yes, I found something. It will probably be fixed in the next version.
The problem is that the metadata contains attributes and elements with the same name.
e.g.:

+ bandDescriptors
  + band
    + name
  + band
  + band
  + band
...

This doesn’t work in HDF5, because names must be unique among siblings within a group.
If there are multiple elements with the same name, we probably need to enumerate them.
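To illustrate the enumeration idea, here is a minimal sketch in Python. It is a hypothetical helper, not the actual SNAP fix: it makes a list of sibling names unique by appending a numeric suffix to repeats, which is one way the duplicate `band` elements above could become valid HDF5 group names.

```python
def enumerate_duplicates(names):
    """Return the names with repeats suffixed as name.1, name.2, ...
    so that every sibling name is unique (as HDF5 groups require)."""
    seen = {}
    unique = []
    for name in names:
        count = seen.get(name, 0)
        unique.append(name if count == 0 else f"{name}.{count}")
        seen[name] = count + 1
    return unique

print(enumerate_duplicates(["band", "band", "band", "name"]))
# -> ['band', 'band.1', 'band.2', 'name']
```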


Meanwhile, until the release, you can create a subset while exporting the product and exclude the metadata. I think it is sufficient to exclude the manifest.
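For GPT users, the workaround can be sketched as a graph that subsets without copying metadata. This is an assumption-laden sketch: it relies on the Subset operator’s `copyMetadata` parameter, and the `${source}` placeholder and file names are illustrative, not from the original thread.

```xml
<graph id="SubsetWithoutMetadata">
  <version>1.0</version>
  <node id="subset">
    <operator>Subset</operator>
    <sources>
      <sourceProduct>${source}</sourceProduct>
    </sources>
    <parameters>
      <!-- leave copyMetadata false so the problematic metadata
           elements are not written to the HDF5 output -->
      <copyMetadata>false</copyMetadata>
    </parameters>
  </node>
</graph>
```

Invoked, for example, as `gpt subset_graph.xml -Ssource=input_product -f HDF5 -t output.h5` (file names hypothetical).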


Hi Marko, thanks for the reply.

I did exclude the metadata after subsetting, but it still seems to not want to write an HDF5 file (using GPT from the command line). I haven’t tested it in the desktop app, though.

I would really like to use snappy in Python, but I can’t seem to overcome the installation problems on my MacBook. So I will try snappy on my Linux machine and see how far I get.

On macOS it can easily happen that no appropriate wheel exists. You will probably have more luck on Linux.

Hi Marko

I did have luck on Linux :slight_smile:

Can I ask a question:
Does the snappy module provide all the functionality of the gpt command-line tool, including the memory-efficient use of data?

I am thinking of converting all my GPT graph command line calls into direct python code using snappy.

Thanks.