After collocating Sentinel-2 and Landsat 7 & 8, I wanted to export the result to a GeoTIFF file so that I can access it in Python with rasterio. But as soon as the export reaches 100%, it doesn't actually finish; instead it keeps consuming my computer's resources for no apparent reason.
The progress bar reaches 100% with about 6 GB of my 32 GB RAM in use; after a couple of minutes it seems to loop in some process and the RAM usage keeps climbing.
I just bought a new high-end PC specifically to finish my bachelor thesis (Intel Core i7-12700KF, 32 GB RAM), so it can't be a hardware issue — the same thing was happening on my older PC.
I think it is, as I am trying to do pixel classification with a TempCNN (1D CNN). For that I am sampling all pixel values at the coordinates of the training pixels (using the mask pixels extracted from SNAP).
I would only switch to another format if it does the same trick. That said, two stacks have already exported fine — even one with Sentinel-1 & 2 and Landsat 7 & 8, which is bigger than the stack without the Sentinel-1 data. Unfortunately that one can't be read by rasterio because it is too large (error: cannot allocate 31 GB).
The BEAM-DIMAP format handles masks and coordinates as well, and I have read it into Python (as a nested array) before. Converting to another format only creates redundant data of questionable size.
I just found out that the problem is not the GeoTIFF export as such, but the GeoTIFF/BigTIFF variant: there seems to be a memory leak when exporting to BigTIFF. This should probably be inspected further.
I need a .tif file so that I can read it into rasterio.