I doubled the RAM on my virtual machine to solve the Java Heap Space error, and the memory setting in both the .snap/snap-python/build/lib/snappy/snappy.ini and the .snap/snap-python/snappy/snappy.ini configuration files now has this value:
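For illustration, the heap size in snappy.ini is typically set with the `java_max_mem` key; a file of this kind looks roughly like the following (the key name and the `64G` value here are illustrative assumptions, not my exact file):

```ini
[DEFAULT]
# snap_home and other keys left as installed
# maximum Java heap available to snappy's JVM (illustrative value)
java_max_mem: 64G
```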
The RAM is working as it should, but I still get a Java Heap Space error that prevents me from creating the backscatter + incidence angle file.
You didn’t mention the total RAM in the system or how you are using ESA SNAP snappy. Windows Task Manager can show you how memory is actually being used. Python and Java each do their own memory management, so you can encounter situations where large arrays are allocated by both Python (e.g., NumPy) and Java, leading to Java Heap Space or Python memory errors.
Committed shows 50.7/73.5 GB. The larger number is what was requested by your processes and can include memory that was allocated but never accessed (e.g., because Java errored out).
Look under the Processes tab to see memory usage per process. If you are creating large Python arrays, Python may grab so much memory that gpt can’t allocate its Java arrays. Sometimes you can get past a Java Heap Space error by running the ESA SNAP snappy script immediately after a reboot.
After some testing (and disabling paging so the full 64 GB of RAM is available), the Java Heap Space error still appears after some processing (my code creates backscatter for multiple datasets). Is there a way to clean the Java heap after each processing step?
Dispose of products you no longer need, and invoke the garbage collector with System.gc().
This cleans at least the Java side.
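A minimal sketch of that pattern in a snappy loop (this assumes an ESA SNAP snappy installation, so it is not runnable standalone; `input_paths` and the `Calibration` operator are placeholders for your own scenes and processing):

```python
# Sketch: release Java-side memory between products in a snappy script.
from snappy import ProductIO, GPF, jpy

System = jpy.get_type('java.lang.System')
HashMap = jpy.get_type('java.util.HashMap')

for path in input_paths:  # input_paths: your list of scenes (placeholder)
    product = ProductIO.readProduct(path)
    parameters = HashMap()
    result = GPF.createProduct('Calibration', parameters, product)  # example op
    ProductIO.writeProduct(result, path + '_cal.dim', 'BEAM-DIMAP')

    # Release the Java objects backing both products, then ask the JVM
    # to collect the freed heap before the next iteration.
    result.dispose()
    product.dispose()
    System.gc()
```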
But it also depends on how you are using the data on the Python side.
As a general comment - I don’t know your code or what you are doing in Python - it helps to reduce memory if you process the data in chunks rather than the whole scene at once.
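The chunked approach can be sketched like this; a plain NumPy array stands in for the raster here so the example is self-contained, but the slice marked in the comment is where you would instead read a sub-rectangle (e.g. with snappy's `Band.readPixels`) so that only one block of rows is in memory at a time:

```python
# Sketch: process a scene one block of rows at a time to bound peak memory.
import numpy as np

def process_in_chunks(raster, chunk_rows=256):
    """Compute a per-pixel result block-by-block instead of whole-scene."""
    height, width = raster.shape
    out = np.empty((height, width), dtype=np.float64)
    for y0 in range(0, height, chunk_rows):
        y1 = min(y0 + chunk_rows, height)
        block = raster[y0:y1, :]  # with snappy: read only this rectangle
        # Example per-pixel operation: linear power -> dB, clamped to avoid log(0)
        out[y0:y1, :] = 10.0 * np.log10(np.maximum(block, 1e-10))
    return out

scene = np.full((1000, 500), 100.0)  # stand-in for a full scene
db = process_in_chunks(scene)        # each element: 10*log10(100) = 20 dB
```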