Currently I’m trying to integrate SNAP into a data pipeline I have. I’ve got it running on a system with 64 GB of RAM, and I’m running gpt with a CLI flag allocating 32 GB of heap, but I’m still getting the null pointer error and the “could not construct DataBuffer” error that typically indicates too little memory. I’m curious whether I’m doing something wrong or should change some setting, or whether 32 GB genuinely isn’t enough RAM to process SAR data.
Please provide more details: OS, SNAP version, your command line, and the graph (attach it unless it is very simple). Have you tried the
You can also specify the heap size in the gpt.vmoptions file. All supported OSes have tools to monitor memory usage, which can help confirm that gpt is actually using 32 GB of RAM, and that nothing else is hogging memory (it has happened more than once that a user had multiple gpt processes running in the background).
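For reference, a sketch of both approaches (the graph filename and values here are illustrative, not taken from your setup): heap can be set per invocation with the -J prefix, which forwards options to the JVM, or persistently by editing gpt.vmoptions in SNAP's bin directory:

```
# Per-invocation (command line); -c sets SNAP's tile cache size:
gpt my-graph.xml -J-Xmx32G -c 24G

# Persistent: add to <snap-install-dir>/bin/gpt.vmoptions, one JVM option per line:
-Xmx32G
```

Note that the tile cache (-c) lives inside the heap, so it should be set well below -Xmx; a common rule of thumb is around 70–80% of the heap. A cache set close to or above the heap size can itself trigger out-of-memory failures.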