I still have problems:
In the gpt.vmoptions file I included the same settings I chose in the GUI under Preferences --> Performance.
In the GUI the execution of a graph takes 30 min; the same graph processed with the command-line gpt takes more than 4 h. What can I do?
I have Mac OS X 10.10.5 with 32 GB RAM.
By the way, what settings would you recommend for Xmx, Xms, tile size and cache size?
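For what it's worth, a common rule of thumb (an assumption on my part, not an official SNAP recommendation) is to give the JVM roughly 75% of physical RAM and to keep the tile cache well below the heap. A quick sketch:

```python
# Rough rule of thumb for sizing JVM heap and tile cache from physical RAM.
# These ratios are assumptions for illustration, not official SNAP guidance.
def suggest_settings(ram_gb):
    xmx = int(ram_gb * 0.75)      # max heap, e.g. -Xmx24G on a 32 GB machine
    xms = max(1, xmx // 4)        # initial heap, a fraction of the maximum
    cache = int(xmx * 0.75)       # tile cache must stay below the heap
    return {"Xmx": f"{xmx}G", "Xms": f"{xms}G", "cache": f"{cache}G"}

print(suggest_settings(32))
# → {'Xmx': '24G', 'Xms': '6G', 'cache': '18G'}
```

On a 32 GB machine this would mean something like -Xmx24G and -Xms6G in gpt.vmoptions, with the tile cache kept below the heap.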
Thanks. I computed the settings by trying different combinations.
Then I saved the changes. When I closed SNAP and reopened it, I got some strange messages. It took me some time to find that in snap.conf a slash after jdkhome was now missing, which caused the messages:
The uncommented line was the new one, and it caused the error messages. I changed this and it worked.
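For anyone hitting the same messages, the broken line looked roughly like this (the path here is hypothetical; the point is only the missing leading slash):

```text
# in <SNAP install dir>/etc/snap.conf -- hypothetical path for illustration:
jdkhome="Applications/snap/jre"      <- missing leading slash, caused the errors
jdkhome="/Applications/snap/jre"     <- corrected line
```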
Thanks for all your patience.
Regarding the wrongly changed snap.conf I think that @NicolasDucoin will have a look.
Regarding the big performance difference, we need to investigate further why this happens. Do you experience this with every operator or only with a few specific ones? Is it possible that it depends on the source product? Or maybe it depends on the format of the target product.
Do you have any observation to share?
-c   Sets the tile cache size in bytes. Value can be suffixed
     with ‘K’, ‘M’ and ‘G’. Must be less than the maximum
     available heap space. If equal to or less than zero, tile
     caching will be completely disabled. The default tile
     cache size is ‘4,096M’.
-q   Sets the maximum parallelism used for the computation,
     i.e. the maximum number of parallel (native) threads.
     The default parallelism is ‘8’.
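To make these options concrete, a command line combining them might look like this (the graph and product names are placeholders):

```text
gpt myGraph.xml -c 2048M -q 4 -t target.dim source.dim
```

Here the tile cache is capped at 2 GB and the computation limited to 4 threads, which can help on memory-constrained machines.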
Indeed, after some tests I tried the -x option ("Clears the internal tile cache after writing a complete row of tiles to the target product file. This option may be useful if you run into memory problems") and decreased the max memory in the gpt options to 3G, and it works very well. However, the downside seems to be more hard-disk access (I am not really sure).
Do you know if there is an equivalent parameter in SNAP to -x? I think it could be an important way to solve memory problems, as I have shown in the forum.
I will do more tests on a standard Windows laptop with 4 GB RAM to validate these settings.
Then the 4 GB of RAM is the problem. This is not enough to handle the amount of data.
Probably your OS is also still 32-bit? Then it is really not sufficient. If it is already 64-bit, you can try to tweak the memory settings a bit.
In the ‘etc’ folder of the installation directory of SNAP, you’ll find a file named snap.conf. Open it in a text editor.
There is the line which starts with ‘default_options=’
In this line, you’ll find an option like -J-Xmx2G. Increase the value. You could use something like -J-Xmx3G.
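Putting it together, the edited line in snap.conf could look something like this (the other options shown are illustrative; only the -J-Xmx value is the change being described):

```text
# <SNAP install dir>/etc/snap.conf
default_options="--branding snap --locale en_GB -J-Xms256m -J-Xmx3G"
```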
I am using SNAP gpt on Ubuntu 18.04, but it does not finish processing when I try a Sentinel-1 SLC graph. My computer has 32 GB RAM. What can I do? Please help me solve this problem! Has anyone run an SLC graph on Ubuntu successfully?