gpt write: File size too big

Hi guys,
I am trying to process a full S1 scene using a graph (see below) via the command line, but it tells me that the output file size is too big.

This is the first part of the error message; if needed, I can provide the rest of the ~100 rows:

org.esa.snap.core.gpf.OperatorException: File size too big. TIFF file size is limited to [4294967296] bytes!
at org.esa.snap.core.gpf.graph.GraphProcessor$GPFImagingListener.errorOccurred(

I am using the following command from command-line:
gpt GRD_Test.xml -PFilename="C:\Users\Felix\MA\" -t "C:\Users\Felix\MA\test1.tif"

I wondered if it has something to do with writing the output as .tif? But I am unsure; any help is appreciated.

Thank you,

GRD_Test.xml (5.9 KB)

GeoTIFFs have a maximum file size of 4 GB; your product probably exceeded this limit.
Is it an option to write a .dim file? It has no such limit and can be treated as a raster in a similar way. The actual raster ends with .img and lies in the .data folder. It is compatible with most libraries and programs.
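For anyone wondering why a single scene can hit this limit: a quick back-of-the-envelope check shows how easily a terrain-corrected GRD product exceeds 4 GiB. The dimensions and band count below are purely illustrative assumptions, not the actual size of this product:

```python
# Rough check of whether a processed S1 GRD raster fits in a (non-Big)TIFF.
# All numbers are illustrative: a ~250 km x 250 km footprint at 10 m pixel
# spacing, written with two float32 bands (e.g. Sigma0_VV and Sigma0_VH).
width, height = 25_000, 25_000   # pixels (assumed)
bands = 2                        # assumed band count
bytes_per_sample = 4             # float32 output

size_bytes = width * height * bands * bytes_per_sample
TIFF_LIMIT = 4_294_967_296       # 2**32 bytes, the limit from the error

print(f"{size_bytes:,} bytes; exceeds TIFF limit: {size_bytes > TIFF_LIMIT}")
```

With these assumed dimensions the raster comes to 5 GB of raw samples before any TIFF overhead, so the 2**32-byte cap is exceeded even without compression considerations.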

Oh nice, I did not know that GeoTIFFs are limited that way. I did save my results as .dim earlier, but I changed it for a reason I no longer remember. If GeoTIFFs are that limited, I will have to find a workaround once I hit the last problem, since I really need the preprocessing from my SNAP graph.

Thank you, Mr Braun!

Dear ABraun,
I changed the output to .dim, but now I get an error that the GC overhead limit was exceeded.
Can I tell the graph which GPU to use? I have an integrated one and an NVIDIA GeForce 1050, which should be more than enough for this calculation. Is there a way to change the graphics card that is used?

I don't know exactly, to be honest. It doesn't look like it is possible: Speed up processing through GPUs

But moving the data to an SSD (if your PC has one) can bring decisive improvements in I/O speed and overall processing time.
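A side note on the error itself: "GC overhead limit exceeded" is a Java heap problem on the CPU side, not something a GPU would help with, so the usual fix is to give gpt more memory. A sketch, assuming a default SNAP install (the heap and cache values are illustrative; set them to what your machine can spare):

```shell
# gpt reads its JVM settings from <snap-install>/bin/gpt.vmoptions.
# Raising the maximum heap (-Xmx) there is the common fix, e.g.:
#
#     -Xmx12G
#
# gpt also exposes options to tune the tile cache size (-c) and the
# number of parallel threads (-q) per run:
gpt GRD_Test.xml -c 8G -q 4 -t "C:\Users\Felix\MA\test1.dim"
```

Lowering -q can also help on machines where many threads each hold tiles in memory at once.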