I’m having problems reading a Sentinel-3 file (L2, C2RCC-processed, NetCDF4-BEAM, ~3 GB) with snappy.
I use the simple ‘snappy 1st Contact’ script from the ‘How to use the SNAP API from Python’ tutorial. I can read an L1 file and plot radiances with this script without problems, but trying to plot conc_chl from the L2 C2RCC file fails at the line:
chla.readPixels(0, 0, w, h, chla_data)
I already followed suggestions from other forum posts to increase the available memory. I have enough RAM and made these changes:
snappy.ini > java_max_mem: 16G
jpyconfig.py > jvm_maxmem = ‘16G’
I also tried @gusortiz’s suggestion to set max_mem directly in __init__.py.
The problem persists.
Through the SNAP GUI I can read and write everything just fine.
Any idea what I’m doing wrong here?
You can see the change in the value of max_memory when you edit the java_max_mem property. Unfortunately, the reported values are only correct up to ~2 GB; above that they are wrong. This is caused by an issue in jpy which is already fixed but not yet released. At least you can check whether the setting has an effect.
Maybe you can send me the file. I’ll send you an FTP location in a separate message.
Thanks Marco, with the snappy_c2rcc.py script I can read/write the scene line-wise, but not as a whole.
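For reference, the line-wise pattern looks roughly like this (a minimal sketch, not Marco’s actual script; the file path is a placeholder and a working SNAP/snappy installation is assumed):

```python
import numpy as np
from snappy import ProductIO

# Open the C2RCC L2 product (path is a placeholder)
product = ProductIO.readProduct('S3_C2RCC_L2.nc')
chla = product.getBand('conc_chl')
w = product.getSceneRasterWidth()
h = product.getSceneRasterHeight()

# Read one line at a time instead of the whole raster at once,
# so each readPixels call only transfers w floats across the bridge.
row = np.zeros(w, dtype=np.float32)
data = np.zeros((h, w), dtype=np.float32)
for y in range(h):
    chla.readPixels(0, y, w, 1, row)
    data[y, :] = row

product.dispose()
```

Reading the full raster in one call (`chla.readPixels(0, 0, w, h, chla_data)`) needs the whole band in memory at once, which is where the heap limit bites.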
I checked the memory settings and there’s a problem. I only get this:
max memory: 491 MB
total memory: 491 MB
free memory: 350 MB
no matter whether I set java_max_mem to 1G, 2G, … 32G, or even change the path to snap_home, it has no effect. It seems snappy.ini is never read.
Even when I increase the default max_mem in __init__.py, nothing changes.
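For anyone who wants to reproduce the check above: the effective heap settings can be queried from the running JVM through jpy once snappy has been imported (a sketch, assuming a working snappy install; importing snappy starts the JVM with whatever options were actually picked up):

```python
import snappy  # starts the JVM with the configured options
import jpy

# Ask the JVM Runtime directly what heap limits are in effect
Runtime = jpy.get_type('java.lang.Runtime')
rt = Runtime.getRuntime()
mb = 1024 * 1024
print('max memory:   %d MB' % (rt.maxMemory() // mb))
print('total memory: %d MB' % (rt.totalMemory() // mb))
print('free memory:  %d MB' % (rt.freeMemory() // mb))
```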
Let me know if you have an idea, otherwise I’ll try a clean installation of snap.
I don’t think this is entirely correct.
When using snappy (calls like GPF.createProduct(...)), it should be sufficient to change the memory settings in snappy.ini.
If you generate command-line calls (like gpt graph.xml ...) and create sub-processes, then gpt.vmoptions needs to be changed.
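For that second case, the heap size lives in gpt.vmoptions in the bin folder of the SNAP installation; a fragment might look like this (the 16G value is only an example, one JVM option per line):

```
# gpt.vmoptions — options passed to the gpt JVM
-Xmx16G
```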