Snappy MemoryError with Sentinel3-C2RCC file


I’m having problems reading a Sentinel3 file (L2, C2RCC processed, NetCDF4-BEAM, ~3GB) with snappy.
I use the simple 'snappy 1st Contact' script from the 'How to use the SNAP API from Python' tutorial. I can read an L1 file and plot radiances with this script without problems, but trying to plot conc_chl from the L2 C2RCC file fails at the line:
chla.readPixels(0, 0, w, h, chla_data)

I already followed suggestions from other forum posts to increase memory. I have enough RAM available and made these changes in:
snappy.ini: java_max_mem: 16G (and jvm_maxmem = '16G')

I also tried @gusortiz's suggestion to set max_mem directly, but the problem still persists.

Through the SNAP GUI I can read and write everything just fine.

Any idea what I’m doing wrong here?
many thanks

Actually, you are doing nothing wrong. But in general it would be better to process the data line-wise, more like in this example.
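The line-wise pattern is easy to sketch. The scene dimensions below are placeholders, and the actual readPixels call is commented out because it needs a SNAP installation; the point is the memory difference between a full-scene buffer and a single-row buffer:

```python
import numpy as np

# Placeholder dimensions for a Sentinel-3 full-resolution scene.
w, h = 4865, 4091

# Reading the whole band at once needs one float32 value per pixel:
full_buffer_mb = w * h * np.dtype(np.float32).itemsize / (1024 * 1024)

# Reading line-wise only ever holds a single row in memory:
line_buffer_kb = w * np.dtype(np.float32).itemsize / 1024

print('full-scene buffer: %.0f MB' % full_buffer_mb)
print('per-line buffer:   %.0f KB' % line_buffer_kb)

# The loop itself; in real snappy code the commented line does the reading:
row = np.zeros(w, dtype=np.float32)
for y in range(h):
    # chla.readPixels(0, y, w, 1, row)  # read one line of 'conc_chl'
    pass                                # ...process/write 'row' here...
```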

Setting the memory in the snappy.ini should be sufficient.
Actually I wanted to suggest the following little script to detect if the memory settings are effective:

import snappy  # starts the JVM so jpy can access Java classes
import jpy

Runtime = jpy.get_type('java.lang.Runtime')
max_memory = Runtime.getRuntime().maxMemory()
total_memory = Runtime.getRuntime().totalMemory()
free_memory = Runtime.getRuntime().freeMemory()
mb = 1024 * 1024
print('max memory:', max_memory / mb, 'MB')
print('total memory:', total_memory / mb, 'MB')
print('free memory:', free_memory / mb, 'MB')

You will notice the change in the value of max_memory when you edit the java_max_mem property. Unfortunately, the reported values are only correct up to ~2GB; above that they are wrong. This is caused by an issue in jpy which is already fixed but not yet released. At least you can see whether the setting has an effect.

Maybe you can send me the file. I'll send you an FTP location in a separate message.

I was able to read chl with the attached script (1.6 KB), both line-wise and for the whole scene.

My snappy.ini looks like this:
snap_home = C:\Program Files\snap
# java_class_path: ./target/classes
# java_library_path: ./lib
# java_options: -Djava.awt.headless=false
java_max_mem: 3G
# debug: False

Are you doing something different?

Thanks Marco, with the script I can read/write the scene line-wise, but not as a whole.
I checked the memory settings and there's a problem. I only get this:

max memory: 491 MB
total memory: 491 MB
free memory: 350 MB

no matter if I set java_max_mem to 1G, 2G, … 32G, or even change the path to snap_home, it has no effect. It seems snappy.ini is never read.
Even when I increase the default max_mem directly, nothing changes.
Let me know if you have an idea, otherwise I’ll try a clean installation of snap.

The only idea I have is that you might have two installations and you change it in the wrong place. Did you copy snappy into the site-packages folder, or are you using:

import sys
sys.path.append('snappy-dir')  # or sys.path.insert(1, 'snappy-dir')

Maybe it helps if you run snappy-conf again (Configure Python to use the SNAP-Python (snappy) interface)
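One hedged way to see which installation Python actually picks up is to ask the import machinery where a module lives; the snappy.ini that matters is the one next to that location. (The check is shown with a stdlib module here, since it works the same for any importable name.)

```python
import importlib.util

def module_location(name):
    """Return the file path Python would import for `name`, or None."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# For snappy this would be: module_location('snappy')
print(module_location('json'))  # stdlib example
```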

@dbhr I don't know if you solved your memory problem, but it could be because you are using GPF or GPT commands in your scripts, e.g. GPF.createProduct, etc.

If that is the case setting the memory in snappy.ini, or snap.conf will not help you. You need to set it in /home/snap/bin/gpt.vmoptions.

In this file you can add the following Java switches:

-Xmx12G          # max memory
-Xms1G           # min memory
-Xverify:none
-XX:+AggressiveOpts

I’m not entirely sure what the last two switches do but they are in the default setting in /home/web/snappy/snappy/

This ambiguity has caused me much trouble but after persevering I finally found how to set the memory.

Hope it helps.

Hi @MarkWilliamMatthews
I don’t think this is entirely correct.
When using snappy (calls like GPF.createProduct(...)) then it should be sufficient to change memory settings in snappy.ini.
If you generate command-line calls (like gpt graph.xml ...) and create sub-processes, then gpt.vmoptions needs to be changed.
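To make the distinction concrete, a minimal sketch (graph and product names are placeholders, and the actual run line is commented out because it needs SNAP's gpt on the PATH):

```python
import subprocess

# In-process: GPF.createProduct(...) runs inside the JVM that snappy starts,
# so snappy.ini's java_max_mem applies.

# External process: a gpt call like this runs in its own JVM, configured by
# gpt.vmoptions, not by snappy.ini:
cmd = ['gpt', 'graph.xml', '-Ssource=product.nc']  # placeholder file names
# subprocess.run(cmd, check=True)                  # needs gpt on the PATH
print(' '.join(cmd))
```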