Hi, I am running out of space for the cache storage. I have 3k S2 images to process, but after only 223 images I have filled my C drive, which had 372 GB of available space (I don't have the luxury of a dedicated cache drive, which would need to be around 4 TB). The cache actually fills up faster than the output location for the processed images.
I am just running S2 L1C through S2Resampler (20 m) and then C2RCC in a multiprocessing script calling gpt, e.g. there are 4 instances of:
("/home/btera/snap/bin/gpt " + xml_file + " -f NetCDF4-CF -q 10 -c 60G -x -Ssource=" + y + " -t " + z)
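For context, here is a minimal sketch of the kind of multiprocessing setup I mean; the paths, the jobs list and the run_gpt helper below are placeholders, not my actual script:

```python
# Sketch only: placeholder paths and job list, not the real processing script.
import subprocess
from multiprocessing import Pool

GPT = "/home/btera/snap/bin/gpt"
XML_FILE = "/path/to/resample_c2rcc_graph.xml"   # placeholder graph (S2Resampler 20m -> C2RCC)

def run_gpt(job):
    """Run one gpt instance for a single S2 L1C product."""
    source, target = job
    cmd = [
        GPT, XML_FILE,
        "-f", "NetCDF4-CF",
        "-q", "10",
        "-c", "60G",        # each instance asks gpt for a 60G tile cache
        "-x",
        "-Ssource=" + source,
        "-t", target,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # jobs would be built from the ~3k L1C products; shown here as a placeholder
    jobs = [("/data/S2_L1C_product.zip", "/out/S2_product.nc")]
    with Pool(processes=4) as pool:   # 4 parallel gpt instances
        pool.map(run_gpt, jobs)
```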
I have the S2 cache setting set to daily deletion, but it only takes about an hour to fill the 372 GB of space, so the deletion is not fast enough.
Options > Performance > Cache Size (MB) is set to 1024 (I am never sure whether this affects gpt or not).
My other question is: does calling gpt count as a “start up”, so that I could set “max time in cache” to delete on each start up?
I figure I probably have something wrong in my settings somewhere. Anyone's experience would be helpful here, thanks.