It’s a bit of everything, really. Any time you run a loop with snappy, the objects you instantiate and the results of the operations you perform stay in memory and are not garbage collected.
For your example, I’d use my workaround and run read_cloud_band()
in a separate script. You’ll find that the memory is freed each time the script terminates.
This is taken from my workaround post:
The line of code I use to spawn my processing pipeline is:
pipeline_out = subprocess.check_output(['python', 'src/SarPipeline.py', location_wkt], stderr=subprocess.STDOUT)
Note:
pipeline_out
is the STDOUT from the script, so in my case, to find out which file has just been processed, I have print("filepath: " + path_to_file)
in src/SarPipeline.py
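In case it helps, here is a minimal runnable sketch of that pattern. The child code and the path are stand-ins for illustration only, not the real src/SarPipeline.py:

```python
import subprocess
import sys

# Stand-in for SarPipeline.py: a child that prints the path it processed.
# In the real setup this would be ['python', 'src/SarPipeline.py', location_wkt].
child_code = 'print("filepath: /tmp/processed_scene.dim")'  # hypothetical path

pipeline_out = subprocess.check_output(
    [sys.executable, '-c', child_code],
    stderr=subprocess.STDOUT,
)

# check_output returns bytes; decode and strip the prefix to recover the path.
line = pipeline_out.decode().strip()
path_to_file = line.split("filepath: ", 1)[1]
print(path_to_file)
```

When the child process exits, the operating system reclaims all of its memory, including whatever the JVM behind snappy was holding.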
So for your code I would try
import subprocess

for file_name in file_list:
    print(file_name)
    pipeline_out = subprocess.check_output(
        ['python', 'readCloudBand.py', file_name],
        stderr=subprocess.STDOUT)
    # Your old call: CheckClouds(file_name).read_cloud_band()
Where readCloudBand.py is a small script that calls the read_cloud_band() method of your CheckClouds class, and file_name is passed to it as a command-line argument.
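A sketch of what that readCloudBand.py wrapper could look like. The stub class here is just so the example runs; in your project you would import your real CheckClouds instead:

```python
# readCloudBand.py -- hypothetical wrapper; run once per file so the memory
# snappy holds is released when the process exits.
import sys

# In your project, replace this stub with your real import, e.g.:
# from check_clouds import CheckClouds   (module name is an assumption)
class CheckClouds:
    def __init__(self, file_name):
        self.file_name = file_name

    def read_cloud_band(self):
        # Your real method would open the product and read the band here;
        # printing the path lets the parent recover it from STDOUT.
        print("filepath: " + self.file_name)

if __name__ == '__main__' and len(sys.argv) > 1:
    file_name = sys.argv[1]  # the parameter passed in via check_output
    CheckClouds(file_name).read_cloud_band()
```

The parent loop then captures that print via pipeline_out, exactly as in the note above.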