Yes, but even at 16GB making very big graphs should be avoided.
Please help me in solving this error.
Solving the error for you is not possible without complete information about your situation. Did you first try to get it to work as a graph?
Please have a look at the original post. I’ve answered there.
Btw, please don’t ask the same question in two threads. One time is usually enough.
Hello, I have the same problem. How can I solve it?
Please go through the solutions proposed above, or ask a more specific question.
Hi. I’m facing the same problem. I’m working with very large COSMO-SkyMed Himage scenes, and I want to apply a multi-temporal speckle filter to my 6 images. Each image is about 1.8 GB. I’m running on a server with 32 GB of RAM, but SNAP doesn’t seem to use more than 20 GB. I tried stacking the images one by one but got the ‘Cannot construct DataBuffer’ error at the 4th.
I also tried to save the output to BigTIFF.
I’m using the SNAP GUI, where I already set the memory to 32 GB in the Performance options.
Do you have any idea what else could be the cause?
Thanks for any help.
I solved the Java heap and data buffer memory errors in S1TBX (S1 TOPS) on a machine with 32 GB of memory by setting the SNAP cache size to 16384 (the largest of the 3 benchmark test values) in Tools > Options > Performance, in the 64-bit version of SNAP. This seems to have the same effect as editing the -Xmx string in snap/etc/snap.conf.
Still, after memory-intensive operations, it helps to save the product and exit SNAP (presumably to clear the memory). Be sure to save the processed product, or it is lost on exit. More memory is better.
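For anyone who prefers setting the heap limit outside the GUI: the -Xmx value mentioned above lives in snap/etc/snap.conf. The excerpt below is an illustrative sketch only; the exact path, the other options on the line, and the default value vary by SNAP version and install location, so check your own file before editing.

```
# snap/etc/snap.conf (excerpt; contents vary by SNAP version)
# -J-Xmx sets the maximum Java heap available to SNAP.
# On a 32 GB machine, leave some headroom for the OS, e.g.:
default_options="--branding snap --locale en_GB -J-Xmx24G"
```

Restart SNAP after editing the file so the new heap limit takes effect.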