@marpet With your cluster, do you know if you do a similar thing with reading as you do with writing? E.g. ProductIO.writeProduct() to a temporary file, then add this to HDFS? It doesn't look like I can write a product to a path in HDFS directly from the SNAP classes.
I get a java.io.IOException: failed to create data output directory: <my directory in hdfs>
It’s either a similar issue with reading, or possibly permissions? Although I’m running the code as the owner of the folders in HDFS
Yes, we do the same for writing. First write it locally and then copy to HDFS.
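In case a sketch helps, the pattern looks roughly like this (assuming SNAP and the Hadoop client libraries are on the classpath; the class name, paths, and target directory here are just illustrative, not our actual code):

```java
// Sketch: write the product locally first, then copy it into HDFS.
// The DIMAP format produces a .dim file plus a .data directory,
// so both have to be copied.
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.esa.snap.core.dataio.ProductIO;
import org.esa.snap.core.datamodel.Product;

public class HdfsProductWriter {

    public static void writeToHdfs(Product product, String hdfsTargetDir)
            throws IOException {
        // 1. Write the product into a local temporary directory.
        File tempDir = Files.createTempDirectory("snap-out").toFile();
        File dimFile = new File(tempDir, product.getName() + ".dim");
        ProductIO.writeProduct(product, dimFile, "BEAM-DIMAP", false);

        // 2. Copy the .dim file and the .data directory to HDFS.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(new Path(dimFile.getAbsolutePath()),
                             new Path(hdfsTargetDir));
        File dataDir = new File(tempDir, product.getName() + ".data");
        fs.copyFromLocalFile(new Path(dataDir.getAbsolutePath()),
                             new Path(hdfsTargetDir));
    }
}
```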
Just in case you are curious: this is the about page of our cluster. It's a bit outdated; the disk space is now around ~1.5 PB.
When trying to write the file locally and copy it to HDFS, I'm having an issue with saving as BEAM-DIMAP: writing to a temporary file with ProductIO.writeProduct() appears to produce only the .dim file, not the .data directory. How would you go about this, given that ProductIO.writeProduct() only takes one File object to write to?
The DIMAP writer tries to write the .data directory into the same directory that you specify for the .dim file.
Could it be that this directory is not writable? Maybe you should create a temporary directory first and then write the .dim into that directory.
If I'm pointing in the wrong direction, could you share the lines of code that involve the writing?
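For example, a freshly created temp directory is guaranteed writable, so the DIMAP writer can place the .data directory next to the .dim file there (the product name below is just a placeholder):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class TempDirExample {
    // Create a fresh, writable temp directory and return the .dim target
    // inside it. The DIMAP writer will create <productName>.data next to
    // this .dim file, so the whole directory must be writable.
    public static File dimTarget(String productName) throws IOException {
        File tempDir = Files.createTempDirectory("dimap-out").toFile();
        return new File(tempDir, productName + ".dim");
    }

    public static void main(String[] args) throws IOException {
        File dim = dimTarget("myProduct");
        System.out.println(dim.getName()); // prints "myProduct.dim"
    }
}
```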
Hello CiaranEvans,
I coded it like you described above to create a temp file, and HDFS can load this file. But when I use product.getBands(), I get an exception: java.io.IOException: Stream closed at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy. Can you help me to solve this?
@Xiaojian_Gan Could you possibly show me the code you’re using to do this?
I can't really help with what you've given me right now; the most I can offer is that it seems your file system was closed before you tried to access it:
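Since SNAP reads band data lazily, one way around a closed HDFS stream is to mirror the writing approach: copy the product from HDFS to local disk first and open the local copy, so later getBands() reads never touch an HDFS stream. This is only a hedged sketch under that assumption; the class name and path parameters are illustrative:

```java
// Sketch: copy the .dim/.data from HDFS to a local temp directory, then
// open the local copy with SNAP. The FileSystem can be closed afterwards
// because lazy band reads will hit local files, not HDFS streams.
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.esa.snap.core.dataio.ProductIO;
import org.esa.snap.core.datamodel.Product;

public class HdfsProductReader {

    public static Product readFromHdfs(String hdfsDimPath, String hdfsDataPath)
            throws IOException {
        File tempDir = Files.createTempDirectory("snap-in").toFile();

        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            fs.copyToLocalFile(new Path(hdfsDimPath),
                               new Path(tempDir.getAbsolutePath()));
            fs.copyToLocalFile(new Path(hdfsDataPath),
                               new Path(tempDir.getAbsolutePath()));
        }
        // Safe to read now; product.getBands() will use the local .data.
        File localDim = new File(tempDir, new Path(hdfsDimPath).getName());
        return ProductIO.readProduct(localDim);
    }
}
```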