I am experiencing the following problems when processing S1 TOPS Coregistration with ESD.
My S1 dataset is located over Iceland (acquisition dates 04.06.17 and 16.06.17, recently downloaded from SciHub; we use an external DEM for back-geocoding).
At first I thought the data were simply too new (a few days ago I started processing the scenes acquired on 22.06.17 and 10.06.17) and that SNAP could not yet find the precise orbit files, which, judging by the log file, was causing the error.
I then switched to the older dataset mentioned above and also increased the initial heap size to -Xms8g (see below); the maximum heap size has been set to -Xmx88g. I still get the error below. Could someone help me solve this problem?
I run SNAP on a remote Linux server (128 GB RAM) from my PC.
The OS is CentOS 7.0, and SNAP version 5.0 (the latest) with S1 Toolbox version 5.0.5 was used.
Thank you very much in advance.
[username@linux-server ~]$ snap -J-Xms8g
Entire image is outside of SRTM valid area.
Please use another DEM.
Entire image is outside of SRTM valid area.
Please use another DEM.
Entire image is outside of SRTM valid area.
Please use another DEM.
Entire image is outside of SRTM valid area.
Please use another DEM.
Entire image is outside of SRTM valid area.
Please use another DEM.
Entire image is outside of SRTM valid area.
Please use another DEM.
Entire image is outside of SRTM valid area.
Please use another DEM.
java.lang.OutOfMemoryError: Java heap space
Dumping heap to /home/jirathana/.snap/system/var/log/heapdump.hprof …
Java HotSpot™ 64-Bit Server VM warning: record is too large
Java HotSpot™ 64-Bit Server VM warning: record is too large
Java HotSpot™ 64-Bit Server VM warning: record is too large
Heap dump file created [53930262169 bytes in 198.193 secs]
Java heap space
Java heap space
java.lang.NullPointerException
java.lang.NullPointerException
java.lang.StackOverflowError
java.lang.StackOverflowError
java.lang.NullPointerException
java.lang.NullPointerException
Java heap space
java.lang.StackOverflowError
java.lang.StackOverflowError
java.lang.StackOverflowError
java.lang.StackOverflowError
java.lang.NullPointerException
java.lang.NullPointerException
java.lang.NullPointerException
I'm not quite sure how many GB of RAM SNAP is allowed to use on my Linux server. Also, I couldn't find the snap.conf file in the 'etc' folder at all. Is it some kind of 'hidden' file in that directory?
One more question: is it possible to coregister S1 (SLC) subset images with this operator, S1 TOPS Coregistration?
Do you have any experience with this?
I have the same heap error issue and have tried to apply the fix via Tools -> Options, but the VM parameters field cannot be edited in my version of SNAP. I don't have a snap.conf file either, or I would try that.
Even if I subset the image (reducing it to 50% of its size), I still get the Java heap space error.
By the way, I also had not seen the slider used to select the bursts, so thanks to @ABraun for a really good tip.
In Tools > Options I likewise cannot modify the VM parameters (they are greyed out), although my admin said that I do have full rights to read and write the file.
Thank you for all the advice so far and for any further advice! I would really like to get this solved…
Another way to approach the Java heap space error: flush memory. Save your products, save the session, close SNAP, reopen SNAP, and select File > Reopen Product, selecting only the product you need. Alternatively, select File > Session > Open Session and choose the saved session. Some operations, such as viewing images, seem to hold on to memory; restarting removes that load.
Also look into creating virtual memory, i.e. memory emulated on a drive. I have not tested how well SNAP makes use of virtual memory.
My understanding of "swapping" or "disk swapping" is that a multitasking operating system takes an image of the memory used by an idle program and writes it to disk, freeing memory for the programs that are actively executing. That is memory management in a multitasking environment: noticeably slow on old hardware, much faster on new machines with faster buses and faster drives.
Virtual memory is similar, in that a much slower disk is used to increase the apparent amount of memory. I have seen virtual memory showing as "enabled" by default on a new machine; still, I have not tested its use.
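For what it's worth, here is a minimal sketch of how a swap file (virtual memory) can be set up on a Linux server such as the CentOS 7 machine mentioned above. The size and path are placeholders, and this is generic Linux administration rather than anything SNAP-specific:

sudo fallocate -l 16G /swapfile    # allocate a 16 GB file to use as swap (size is just an example; use dd if fallocate is not supported on your filesystem)
sudo chmod 600 /swapfile           # restrict permissions, as required for swap files
sudo mkswap /swapfile              # format the file as swap space
sudo swapon /swapfile              # enable it immediately
swapon -s                          # verify the swap space is active
# to keep it after a reboot, add "/swapfile none swap sw 0 0" to /etc/fstab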
I did watch SNAP's memory usage and solved the Java heap space error by restarting SNAP.
Check the system disk usage. If the disk is full, clean up the disk, the trash, and the SNAP cache as much as possible.
Check your disk usage with df on the command line. For the SNAP cache, go to the var/cache directory and remove the tmp files. In your case this may not help much, though: for Sentinel-1 processing SNAP does not create temporary files in the cache folder, it only does so for Sentinel-2 processing.
- Increase the minimum and maximum Java heap size. To do this, modify snap.conf in /usr/local/snapx/etc and increase the Xms and Xmx parameters (the minimum and maximum memory size, respectively), without exceeding the RAM allocated to the machine (you can check your memory with the free command). Open the snap.conf file with emacs using sudo, otherwise it is read-only and you cannot modify it. You can also increase the Xms and Xmx parameters in the gpt.vmoptions file in /usr/local/snapx/bin; see the command sketch below.
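To make the above concrete, here is a rough command sketch, assuming the /usr/local/snapx install path mentioned above and the default ~/.snap user directory; the exact paths and the cache location are assumptions, so verify them on your own system before running anything destructive:

df -h                                            # free disk space per filesystem
free -h                                          # total and available RAM
rm -rf ~/.snap/var/cache/temp/*                  # clear temporary files from the SNAP cache (check this path first)
sudo emacs /usr/local/snapx/etc/snap.conf        # edit the launcher configuration as root
sudo emacs /usr/local/snapx/bin/gpt.vmoptions    # edit the heap options used by gpt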
I found the VM parameter -Xmx (maximum heap size) to be a critical setting. A value of about half the total memory has been working for me. The parameter can be changed in the Options > Performance tab, or by editing the configuration file with a text editor. -Xms (minimum heap size) may be small (256M). It is also useful to set the cache size; the medium value from the benchmark tests seems to work better than the smaller default value.
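As an illustration only (the exact keys and surrounding values in your files may differ), the heap-related entries on a 16 GB machine might look something like this; snap.conf passes JVM options with a -J prefix inside its default_options line, while gpt.vmoptions lists one plain JVM option per line, and only the -Xms/-Xmx parts should be changed:

# snap.conf (single default_options line; the rest of the line is left as-is)
default_options="... -J-Xms256m -J-Xmx8G ..."

# gpt.vmoptions (one option per line)
-Xms256m
-Xmx8G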
I ran into the Java heap space error when performing an all-swath split in SNAP, and I have gone through the contributions on this subject, but none have worked for me yet. Can someone simplify the process of resolving the heap space error? What do I need to change in the dialogue box (Tools -> Options, https://forum.step.esa.int/uploads/default/original/2X/0/024ae75804f4c48f72e97245c1f5bf3b40b2e0c1.png)? I have also increased the SNAP values, but no solution yet. I also tried to remove the tmp files, but permission was denied.
Increasing the available memory only has an effect if your PC has this amount of RAM. It only defines how much of your RAM can be used by Java (SNAP in this case).
For example, if you have 8 GB of RAM, increasing the maximum value to 16 GB has no effect. The m in this context stands for megabytes, and the values should ideally be a multiple of 512 (explanation).
You can also, more flexibly, use G instead: 2G is 2048m, 4G is 4096m, and so on…
So if you have 16 GB of RAM, you can set it to -Xmx12G
If you only have 8 GB of RAM, you can set it to -Xmx6G
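For example, on Linux you can quickly check the installed RAM and then pick roughly three quarters of it, in line with the numbers above (the output and values below are purely illustrative):

free -g        # the "total" column of the Mem row shows RAM in GB, e.g. 16
# 16 GB of RAM  ->  set -Xmx12G
#  8 GB of RAM  ->  set -Xmx6G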
My system is Windows 10 64-bit, Core i5, 32 GB RAM, with ~30 GB free space on the C drive. How should I optimize the SNAP performance configuration (e.g. VM parameters, cache size, tile size, number of threads)?
I have just upgraded SNAP from version 6.0 to 7.0 and found that it is processing more slowly than before.