GPT extremely slow when called from Python script

Hey there!
I integrated gpt into a Python script in order to run it on dozens of scenes.
Now I've realized that it takes ages for gpt to actually compute the results.
If I run gpt (“gpt myxml.xml”) from the Linux command line, everything works just fine…

Any ideas where this comes from?

Cheers

You should show us the Python script. If you are moving data between Python and Java (e.g., using snappy), that could be a bottleneck. If you are simply looping through a list of files and running gpt on each file, a big slowdown on Linux may mean the system is tight on memory. There are tools, starting with top, to monitor the resources used by a process, which may show you where the bottleneck occurs. You may need to tweak the settings in gpt.vmoptions. You may want to consult a distro-specific forum for help finding appropriate tools.


Thanks for the suggestions.

Yes, indeed I am using a Python script that calls Popen to start gpt for each file.

For some reason os.system was much slower than subprocess, so I replaced it, which sped up the processing considerably.
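For reference, the loop roughly looks like this (a minimal sketch using subprocess.run, which wraps Popen; the directory paths and file patterns are placeholders):

```python
import subprocess
from pathlib import Path

graph = "myxml.xml"                              # processing graph, as above
scenes = sorted(Path("/data/scenes").glob("*.zip"))  # placeholder input folder

for scene in scenes:
    # Run one gpt process per scene and wait for it to finish before
    # starting the next one, so the runs don't compete for memory.
    # Here the scene is passed as a positional source product; depending
    # on the graph it may instead be supplied via a -P parameter.
    cmd = ["gpt", graph, str(scene)]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        print(f"gpt failed for {scene.name}:\n{result.stderr}")
```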
Where can I find the gpt.vmoptions file?

Cheers

Glad you found a way to speed things up. The gpt.vmoptions file is in <snap_installation_dir>/bin.
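If gpt is running out of memory, the -Xmx line in that file controls the maximum Java heap size available to gpt. The value below is only an example; size it to your machine:

```
# Maximum Java heap for gpt (example value)
-Xmx8G
```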

Is there a more convenient way to call gpt without using subprocess?

You can use gpt with command line scripts.
Maybe this brief introduction helps you:
https://senbox.atlassian.net/wiki/spaces/SNAP/pages/70503475/Bulk+Processing+with+GPT

You could even build the gpt command lines from a Python script, but that is an extreme solution :slight_smile:
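As a rough sketch of that approach, the gpt command line can be assembled in Python with a -t target path and -P parameter substitutions; the output directory and the "inputFile" parameter name below are hypothetical and depend on how your graph XML is written:

```python
import subprocess
from pathlib import Path

graph = "myxml.xml"                  # processing graph, as in the original post
out_dir = Path("/data/output")       # placeholder output directory

for scene in sorted(Path("/data/scenes").glob("*.zip")):
    target = out_dir / f"{scene.stem}_processed.dim"
    cmd = [
        "gpt", graph,
        "-t", str(target),           # -t sets the target product path
        # -P passes a value to a ${placeholder} defined in the graph XML;
        # "inputFile" is a made-up parameter name for illustration.
        f"-PinputFile={scene}",
    ]
    subprocess.run(cmd, check=True)
```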