Thank you Marco! This worked perfectly.
However, I have now started to run into the memory issues I read about on the forum from a couple of years ago. Have there been any updates on these issues? Here are some of the posts I am referring to.
In my project I have to subdivide my image into thousands of smaller images (400x400 pixels) and write these using WriteOp. I am currently creating the subsets with
from snappy import GPF, jpy

HashMap = jpy.get_type('java.util.HashMap')

def create_subset(source, x, y, x_offset, y_offset, copy_boolean):
    parameters = HashMap()
    parameters.put('copyMetadata', copy_boolean)
    # region is 'x, y, width, height' in pixel coordinates
    parameters.put('region', '%s, %s, %s, %s' % (x, y, x_offset, y_offset))
    return GPF.createProduct('Subset', parameters, source)
and then writing them using
WriteOp = jpy.get_type('org.esa.snap.core.gpf.common.WriteOp')
File = jpy.get_type('java.io.File')
ProgressMonitor = jpy.get_type('com.bc.ceres.core.ProgressMonitor')

write_op = WriteOp(subset, File(output_name), 'GeoTIFF')
write_op.writeProduct(ProgressMonitor.NULL)
write_op.dispose()
I have seen the possible workaround of using subprocesses, but I am mainly curious whether there has been any general progress on this. Thank you!
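For reference, the subprocess workaround I have in mind is roughly the following sketch: the parent process only computes the tile regions and spawns a fresh Python process per tile, so each child's JVM heap is released when it exits. The `subset_worker.py` script name is a placeholder for a small script that would import snappy, create one subset, write it, and exit.

```python
import subprocess
import sys

def tile_grid(width, height, tile=400):
    """Yield (x, y, w, h) pixel regions covering the scene, clipping edge tiles."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield (x, y, min(tile, width - x), min(tile, height - y))

def write_tile_in_subprocess(source_path, region, output_name):
    # subset_worker.py is a hypothetical worker: it would do the
    # create_subset + WriteOp steps for a single region, then exit,
    # freeing all native/JVM memory held by that child process.
    subprocess.check_call([
        sys.executable, 'subset_worker.py',
        source_path,
        '%d,%d,%d,%d' % region,
        output_name,
    ])
```

This keeps the parent's memory flat at the cost of one JVM startup per tile (or per batch of tiles, if the worker takes several regions at once).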