Bulk processing with GPT - command not found

Hi Everyone,

I am trying to run this example: https://senbox.atlassian.net/wiki/spaces/SNAP/pages/70503475/Bulk+Processing+with+GPT . I have downloaded all the files and everything is the same except the path to gpt and the folders.

gptPath="/home/lukas/snap/bin/gpt"

I get the same error every time.

bash processDataset.bash resample_s2.xml resample_20m.properties "/home/lukas/Desktop/automatic/inputs" "/home/lukas/Desktop/automatic/outputs" resampled20m

processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/AUX_DATA": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/DATASTRIP": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/GRANULE": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/HTML": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/INSPIRE.xml": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/L2A_Manifest.xml": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/manifest.safe": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/MTD_MSIL2A.xml": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/rep_info": No such file or directory
processDataset.bash: line 77: : command not found
processDataset.bash: line 76: "/home/lukas/Desktop/automatic/test/S2A_MSIL2A_20170518T095031_N0205_R079_T34UCV_20170518T095032_20170606T232414_report.xml": No such file or directory
processDataset.bash: line 77: : command not found

The command is not found, and I do not know why it is looking for the files (manifest, …) in the directory of the script, when the input folder with the data is defined in the script.

I am trying this example because I have the same problem with my own script. I have tried different versions of the code, but every time there is the error "command not found". When I run it directly in the shell with the command gpt_path xml_file properties_file, it works. When I put the command in a script, the command is not found. What could be wrong?
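For reference, this is roughly the call I expect to work from inside a script as well; a minimal sketch, where the output name and the input product name are only example placeholders:

#!/bin/bash
# Minimal sketch: the same gpt call that works interactively, wrapped in a script.
# "input_product.SAFE" and "resampled20m_test.dim" are example names, not real files.
gptPath="/home/lukas/snap/bin/gpt"
"${gptPath}" resample_s2.xml -e \
  -p resample_20m.properties \
  -t "/home/lukas/Desktop/automatic/outputs/resampled20m_test.dim" \
  "/home/lukas/Desktop/automatic/inputs/input_product.SAFE"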

Hi,

I think you forgot to use gpt.sh or to launch gpt like this: "./gpt".

Your gpt path is just "gpt", so the system does not know the command.

Hope it’ll solve your problem.

Cheers,

I have tried it with gpt.sh, but there was the same error. In my case only gpt without .sh works when I run it directly in the shell. When I type gpt.sh, it is not recognised as a command. I have to pass ./gpt or the whole path to the shell, but ./gpt.sh does not work. I do not know why.
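(Side note: a command is only found without ./ or a full path if its directory is listed in PATH. A minimal sketch, assuming SNAP is installed under $HOME/snap:

# Add the SNAP bin folder to PATH for the current shell session
export PATH="$PATH:$HOME/snap/bin"
gpt -h   # should now print the gpt usage text

Whether the launcher in that folder is called gpt or gpt.sh depends on the installation, so it is worth checking what actually exists there.)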

There is also the problem with the files. Why is it looking for files in the folder with the script and not in the folder defined as the input folder, where the data are? I think the script in the SNAP documentation must be correct, so I do not understand what is wrong.

Why is this test directory accessed? In your command line I see only "…/inputs" and "…/outputs".

Which lines are 76 and 77 in your script? They do not match the original files.
Actually, I've tested the scripts and they worked. But I'm not a Unix expert, so maybe I missed something. Also, I've only tried with S2 L1C products and not L2A.

You can try to put an echo in front of a line to see the evaluated variables in the terminal. This might help.
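For example, something like this just before the gpt call, using the variable names from the wiki script (a sketch, not the exact lines of the script):

# Print the evaluated variables before the command is executed
echo "sourceFile = ${sourceFile}"
echo "targetFile = ${targetFile}"
echo "command    = ${procCmd}"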

The script, the XML and the properties file are in the "/home/lukas/Desktop/automatic/test" folder. I have not written that path anywhere in the script. The only thing I have changed in the script is the GPT path, nothing else.

Line 76 is:
procCmd="\"${gptPath}\" \"${graphXmlPath}\" -e -p \"${parameterFilePath}\" -t \"${targetFile}\"" \"${sourceFile}\"

and line 77 is:
"${procCmd}"

I have used the downloaded script and only changed the GPT path. I have put an echo after each step, and the error is connected with this step:
procCmd="\"${gptPath}\" \"${graphXmlPath}\" -e -p \"${parameterFilePath}\" -t \"${targetFile}\"" \"${sourceFile}\"

Strangely, it creates this as the input path:
/home/lukas/Desktop/automatic/test/rep_info

But I really do not understand why it prefixes it with the path to the script and not with the input folder.
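(One bash behaviour that can produce exactly this pair of messages, if the closing quote on line 76 really is where it appears above: a line of the form VAR="..." word runs word as a command with VAR set only in that command's environment, so VAR stays empty in the script itself. A small, self-contained illustration with a made-up path:

#!/bin/bash
# The word after the assignment is executed as a command -> "No such file or directory"
procCmd="echo hello" \"/tmp/does_not_exist/manifest.safe\"
# procCmd was only set for that (failed) command, so here it expands to an empty
# command name and bash reports ": command not found"
"${procCmd}"

Whether that is really what happens in the downloaded script I cannot say for sure; it only matches the two error messages.)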

For me, lines 76 and 77 are the following:

# Create the target directory
mkdir -p "${targetDirectory}"

That’s why I was asking.

Unfortunately, I also have no solution to your problem.

Thanks for trying :slight_smile: It is really strange. I put echo statements in this part of the code:

for F in $(ls -1 "${sourceDirectory}"/S2*.SAFE); do
echo 1 "${sourceDirectory}"
sourceFile="$(getAbsolutePath "$F")"
echo 2 "${sourceFile}"

The result of the first echo is "1 /home/lukas/Desktop/automatic/INPUTS",
but the result of the second echo is "2 /home/lukas/Desktop/automatic/TEST/S2A_MSIL2A_20170518T095031_N0205_R079_T34UC".

The path changes from "/inputs" to "/test".
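(One thing that can cause exactly this: when the S2*.SAFE pattern matches a directory, plain ls lists the contents of that directory as bare names, and if getAbsolutePath then resolves a bare name like rep_info or manifest.safe against the current working directory, you end up with the script's folder as the prefix. A quick way to see the difference, as a sketch:

ls -1  "${sourceDirectory}"/S2*.SAFE   # lists the entries inside a matched .SAFE directory
ls -1d "${sourceDirectory}"/S2*.SAFE   # -d lists the matched .SAFE directory itself

I do not know whether that is the cause here, but it would explain the AUX_DATA, GRANULE and manifest.safe entries in the error messages.)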

Hi Marpet, I would like to ask one more question about the script. Is it supposed to load .zip archives from the input folder, or unzipped folders with the data?

It loads from the unzipped folders.
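(If the downloaded products are still zipped, they could be unzipped into the input folder first; a minimal sketch, assuming the zip archives lie in the source directory used by the script:

# Unzip every downloaded Sentinel-2 archive into the input folder
for z in "${sourceDirectory}"/S2*.zip; do
  unzip -o "$z" -d "${sourceDirectory}"
done
)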

Hello,

I am experimenting with GPT on the command line. It seems it's a good choice for bulk processing.
I have created an InSAR graph for deformation analysis which at the end produces the GoldsteinFiltering product. Then I use this product as an input for the phase unwrapping. So, I have two products:

  1. GoldsteinFiltering
  2. snaphu unwrapped phase

I want to use GPT on the command line to perform the SnaphuImport.

To run this command, I did the following:
snap/bin/gpt SnaphuImport -Ssource=/home/john/Desktop/InSAR/unwrapping/snap/Subset_S1A_IW_SLC__1SDV_August2016_Sept2016_batch/UnwPhase_ifg_VV_08Aug2016_13Sep2016.snaphu.hdr -Ssource=/home/john/Desktop/InSAR/output/GoldstenFiltering.dim

The error I get says that SnaphuImport requires two source products. But that is exactly what I did: I used -Ssource to provide the two source products (the wrapped phase and the unwrapped phase produced by snaphu).

@marpet Is there a specific way to provide the two source products?

Thank you in advance

I think your command line call should look like this:

snap/bin/gpt SnaphuImport /home/john/Desktop/InSAR/unwrapping/snap/Subset_S1A_IW_SLC__1SDV_August2016_Sept2016_batch/UnwPhase_ifg_VV_08Aug2016_13Sep2016.snaphu.hdr /home/john/Desktop/InSAR/output/GoldstenFiltering.dim

Simply removing -Ssource=.


Thank you for your answer, it worked :slight_smile:

Hello marpet,

I am trying to learn how to use gpt, which is very useful, but sometimes I do not understand the error messages. So, I have another question about the GPT tool.
I have five images which are the result of interferometric processing, and I want to create a stack using the CreateStack command. So, I looked at the command usage to see how it works. Below, the picture on the left shows the information about the CreateStack command and on the right is the graph I have created in SNAP.

When I execute the graph, it works smoothly and I get the images stacked as I expected. On the other hand, when I run CreateStack in gpt I get the following error:
Error: Value for 'Slave Bands' is invalid: '/home/john/Desktop/InSAR/output_product/S1A_IW_SLC_20161112_20170228.dim'

I do not understand what it means by 'Slave Bands'. Looking at the graph (on the right), there is nothing related to slave bands. For gpt, I used exactly the same files as in the graph.
Here is my attempt:

snap/bin/gpt CreateStack -Pextent='Master' -PinitialOffsetMethod='Orbit' -PresamplingType='NONE' -PsourceBands=/home/john/Desktop/InSAR/output_product/S1A_IW_SLC_20161112_20170228.dim,/home/john/Desktop/InSAR/output_product/S1A_IW_SLC_20160913_20161112.dim,/home/john/Desktop/InSAR/output_product/S1A_IW_SLC_20160808_20160913.dim -t /home/john/Desktop/InSAR/output_product/STACK.dim

Also, I do not understand what -PmasterBands is. It looks like it is exactly the same as -PsourceBands.

Thanks

Yes, you're right. The error message is not easy to understand.
'Slave Bands' refers to the sourceBands parameter.
I wonder why it works with the Graph Builder. Maybe the GUI is doing some additional magic.
I guess you need to specify some bands as master and some as slave/source.
But I'm not much involved in SAR things, so I can't help you further.

Maybe @cwong or @timmoorhouse can?


Hey there,

I have the same problem using CreateStack with gpt as @john_chamber has.

Does anyone know what the values for SlaveBands and MasterBands are?

These define the bands you want to include in your stack. 'masterBands' are those from the master product, i.e. the first one in the list of sourceProducts, and there must be two: one real and one imaginary band. The slave bands come from the other products; here, too, a real and an imaginary band must be specified.
I don't have much knowledge about the CreateStack operator, I only had a quick look at the code, so I will probably not be able to answer further questions.
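For anyone trying this on the command line: based on the error message, -PsourceBands (and -PmasterBands) expect band names, not .dim file paths, while the products themselves can be passed as plain arguments after the options. This is only a hedged sketch; the band names below are placeholders, and the real ones can be checked with gpt CreateStack -h or in SNAP:

snap/bin/gpt CreateStack \
  -Pextent='Master' -PinitialOffsetMethod='Orbit' -PresamplingType='NONE' \
  -PmasterBands=i_ifg_master,q_ifg_master \
  -PsourceBands=i_ifg_slave,q_ifg_slave \
  -t /home/john/Desktop/InSAR/output_product/STACK.dim \
  /home/john/Desktop/InSAR/output_product/S1A_IW_SLC_20161112_20170228.dim \
  /home/john/Desktop/InSAR/output_product/S1A_IW_SLC_20160913_20161112.dim \
  /home/john/Desktop/InSAR/output_product/S1A_IW_SLC_20160808_20160913.dim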

I created this bash file and an XML file including two operators, and I got an error, as shown below.

Any suggestions?

bash file: processDataset.docx (12.9 KB)

xml file:

the error:

The first line indicates that you are not allowed to create the directory,
and the second that the file does not exist.
Is the log coming from a different run than the bash script you have attached?
Because the paths are different:
in the log it starts with /shared/Training/…,
and in the script it is set to /home/rus/Desktop/shared/Training/…

@marpet Thanks a lot, I corrected the path, but now I have the following error:

So the loop is going wrong in your case.
I tried to do the same, but with S2 data.


So this seems to work for me. I get the same message if there is no data matching the expression (the S3 test).
Maybe there is something wrong in the path you use, or you don't have data there yet.
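(A quick way to check that, as a sketch using the variable from the script: run

ls -1d "${sourceDirectory}"/S2*.SAFE

in a terminal with sourceDirectory set to your input folder; if it prints nothing or an error, the pattern does not match any product there.)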