I’m running the StatisticsOp over a product with a selected shapefile as ROI.
Is there a way to store the StatisticsOp output values (those inside the ASCII file) directly in a Python object?
My problem is that when the operator creates the ASCII file, most of the values in it are truncated and therefore not exact.
The main issue is with the max_error value, which should be around 10^-6; because values are truncated to 4 digits after the decimal point, it appears in the ASCII file as 0.0000, i.e. 0.
Is there a way to avoid this behaviour and preserve the exact statistics values?

I take this to mean that max_error is the data range divided by the number of bins. It is easy to check that max_error does indeed scale with the data: take a band with nicely scaled values and note the max_error value, then use band maths to scale the band by some factor (I tried band * 100) and check that max_error scales by the same factor. If max_error is the only value that loses precision, you could just compute it by hand.
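If that interpretation holds, max_error can be recomputed by hand from the band's minimum, maximum, and the histogram bin count. A minimal sketch in plain Python (the values and bin count below are hypothetical; in practice you would take the min/max from the band or from the statistics output itself):

```python
def max_error(band_min, band_max, bin_count):
    """Approximate max_error as the histogram bin width:
    the data range divided by the number of bins."""
    return (band_max - band_min) / bin_count

# Hypothetical band spanning ~1e-3 with 1024 histogram bins:
err = max_error(0.0, 1.024e-3, 1024)
print(err)  # roughly 1e-06, far below the 4-decimal precision of the ASCII file
```

Computed this way, the value keeps full floating-point precision regardless of how the ASCII file rounds its output.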

If the data is "badly" scaled such that the important statistics lose precision, then a workaround is to scale the data by a factor 10^n, where n is chosen to give a useful precision. You can either adjust the units of the data or rescale the output of StatisticsOp back to the original units.
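The effect of this workaround can be illustrated in plain Python by simulating the 4-decimal rounding of the ASCII file (the input values and the rounding helper are illustrative, not the operator's actual code):

```python
values = [1.2e-6, 3.4e-6, 2.1e-6]  # hypothetical badly scaled statistics

def ascii_round(x):
    # Simulates the 4-decimal truncation seen in the ASCII output.
    return round(x, 4)

n = 6  # chosen so the values land well above the 4th decimal place
scaled_back = [ascii_round(v * 10**n) / 10**n for v in values]

print([ascii_round(v) for v in values])  # all 0.0, precision lost
print(scaled_back)                       # the values survive the rounding
```

Scaling up before the statistics run and dividing the reported values by the same factor afterwards preserves the digits that would otherwise be rounded away.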

I understand, so it seems my only option is to scale the band values before running StatisticsOp and then convert the statistics back to the original units afterwards.
Thank you for the advice!