Cast issue with gpt for S3 Binning

Hi all, I am trying to run the L3 Binning tool using the gpt command-line interface. I am using the following graph to process the data, where the following variables need to change:
${start} - string of the start date in 'yyyy-MM-dd' format.
${duration} - string of the number of days for the composite.
${output_name} - string with the output name and location of the file.

I have written and attached a Python script to loop through a series of dates and produce the monthly composites. However, when run, the command prompt closes straight away after displaying the following error message.

Error: com.bc.ceres.binding.dom.XppDomElement cannot be cast to org.esa.snap.core.gpf.graph.Graph

Can anyone help in debugging this issue, please?
Best wishes, Harry
snap_gpt_binning.py (1.5 KB)
s3_composition_parameters.xml (1.2 KB)

The graph file is not complete. It contains only the parameter section for the binning.
You can type on the command line

gpt binning -h

to get an example for the xml.
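In short, the binning parameters need to be wrapped in a graph/node structure, roughly like this (a sketch only; take the exact layout from the -h output):

<graph id="s3Binning">
  <version>1.0</version>
  <node id="binningNode">
    <operator>Binning</operator>
    <sources/>
    <parameters>
      <!-- the parameter section from your file goes here -->
    </parameters>
  </node>
</graph>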
Attached is an updated version. I've also corrected the path to the sources; I guess there was one space too many in it.

s3_europe_composition.xml (1.7 KB)

Dear Marco,

Thank you for helping to fix the XML file.
I just have a few final questions regarding the gpt binning tool:
I see that you have inserted this section into the XML graph:
[screenshot: the <sourceProductPaths> element in the graph XML]

Is this different to the -PsourceProductPaths that is listed under the parameters section of the help?
And if so, how do I pass ${sourceProducts} into the tool?

Finally, I have created a variable ${start} for the -PstartDateTime value, since the start date should vary in order to loop through each month of the year and generate composites. Should I refer to this in the binning tool by writing
gpt Binning -PstartDateTime='2017-01-01'…
or using the variable name I assigned e.g.
gpt Binning -Pstart='2017-01-01'…

Thank you for your help in solving these issues. It is my first time attempting to loop through a list of dates using the gpt tools, hence the confusion.
Best wishes, Harry

It is not really different from the other parameters, but using sourceProductPaths is preferable.
sourceProducts is the default source parameter for operators. It allows you to list some source products at the end of the command-line call.
But all products listed this way are opened automatically, which consumes memory.
When using sourceProductPaths, the products are opened one after the other, which saves memory. So you can use sourceProducts for a few products, but not for hundreds of products.
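To make the difference concrete (hypothetical paths, shortened with …):

gpt Binning … G:\data\A.SEN3\xfdumanifest.xml G:\data\B.SEN3\xfdumanifest.xml

binds the two listed files to sourceProducts and opens both up front, while

gpt Binning -PsourceProductPaths=G:\data\S3*.SEN3\xfdumanifest.xml …

lets the operator open the matching products one after the other.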

gpt Binning -PstartDateTime='2017-01-01'…

This call is correct if you use the Binning operator directly.
If you use the graph XML and have specified the ${start} variable, you should use
gpt s3_europe_composition.xml -Pstart='2017-01-01'…
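
The monthly loop could then look like this in Python (a minimal sketch, assuming the ${start}, ${duration} and ${output_name} variables from the first post and a hypothetical output naming scheme):

import calendar
import subprocess
from datetime import date

graph = "s3_europe_composition.xml"
for month in range(1, 13):
    start = date(2017, month, 1).strftime("%Y-%m-%d")
    duration = calendar.monthrange(2017, month)[1]  # days in the month
    output = f"L3_composite_2017_{month:02d}.dim"  # hypothetical naming scheme
    # each -P option fills the matching ${...} variable in the graph XML
    subprocess.run(["gpt", graph, f"-Pstart={start}",
                    f"-Pduration={duration}", f"-Poutput_name={output}"], check=True)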

Dear Marco,

I adapted your s3_europe_composition.xml file to my needs and ran it in CMD; after an hour or so I got an error message. I wonder if there is something wrong with my XML file or if the input data is the issue. Please find attached my XML and the CMD log.
binning_log_01.txt (722.6 KB)
sen3_binning_snap_forum_test01.xml (1.7 KB)

Best wishes,
Julio

I would suggest that you investigate the last product mentioned in the log.
S3A_OL_2_LFR____20160503T064754_20160503T065054_20180207T190906_0179_003_348_2880_LR2_R_NT_002.SEN3
I think it is broken, because the error message says that its size is less than 2x2 pixels.

In addition, you can gain performance if you switch to a tie-point based GeoCoding. Currently you use a pixel-based GeoCoding, but I think for your binning the tie-points are sufficient.
Open SNAP, go to Tools / Options, select the S3TBX tab and disable this option:
[screenshot: the S3TBX option "Read Sentinel-3 OLCI products with per-pixel geo-coding instead of using tie-points"]

Dear Marco,

Thank you very much for your help. I think the issue was memory related. I investigated the last product in the log in SNAP and it looked fine, not broken, not too different from previous products. I then disabled the option “Read Sentinel-3 OLCI products with per-pixel geo-coding instead of using tie-points” and this time gpt Binning ran without issues. I made a visual inspection of the output product and it looks fine.

Another question: is it expected that gpt Binning loops through every single file in the folder, including those before and beyond the date range? CMD prints the warning below for all those products outside the time range:

"WARNING: org.esa.snap.binning.operator.BinningOp: Filtered out product 'G:\project\data\s3\S3A_OL_2_LFR____20160426T062612_20160426T062912_20180205T114238_0179_003_248_2700_LR2_R_NT_002.SEN3\xfdumanifest.xml'
WARNING: org.esa.snap.binning.operator.BinningOp: reason: Does not match the time range."

Good that the binning is now working.
And yes, the binning needs to loop over all data you have specified in order to check if it is within the time range.
You can change the pattern for sourceProductPaths to reduce the number of products.

<sourceProductPaths>G:\project\data\s3\S3A_OL_2_LFR____201605*.SEN3\*.xml</sourceProductPaths>

This will limit the data products to those from May 2016.
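If you build the pattern inside a loop, a small Python sketch (path taken from the example above) could be:

year, month = 2016, 5
pattern = rf"G:\project\data\s3\S3A_OL_2_LFR____{year}{month:02d}*.SEN3\*.xml"
# -> G:\project\data\s3\S3A_OL_2_LFR____201605*.SEN3\*.xml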
Hope this works. *crossing fingers*

Dear Marco,

Thanks again for your help.

Summary for the record:

  1. I inspected the apparently broken product and it looked fine; it was not broken.

  2. I disabled the option “Read Sentinel-3 OLCI products with per-pixel geo-coding instead of using tie-points” and then gpt did not complain about a faulty product; it ran just fine.

  3. Changing the pattern for the sourceProductPaths did reduce the number of products gpt loops through.

What I did at the end:

  1. It did not work for me to set binning tool parameters from CMD, e.g. gpt binning_template.xml -PstartDateTime='2017-01-01' with ${startDateTime} in the XML template. I was definitely doing something wrong there.

  2. I worked around that issue by generating individual XML templates for each of the Level-3 products I wanted to create. I produced the templates in a loop (in R) and then called them from CMD. Please find attached an example XML template that worked for me; I hope someone else finds it useful too: S3_L3_gpt_example_template.xml (1.4 KB). A Python sketch of the same idea follows below.
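
For anyone who prefers Python over R, the same template trick could look roughly like this (a sketch only; startDateTime and periodDuration follow the Binning parameter names, and the remaining parameters should be copied from a working template such as the one attached above):

import subprocess

TEMPLATE = """<graph id="binning">
  <version>1.0</version>
  <node id="binningNode">
    <operator>Binning</operator>
    <sources/>
    <parameters>
      <sourceProductPaths>{paths}</sourceProductPaths>
      <startDateTime>{start}</startDateTime>
      <periodDuration>{days}</periodDuration>
      <!-- aggregators, numRows, outputFile, etc. go here -->
    </parameters>
  </node>
</graph>
"""

jobs = [
    ("2016-04-01", 30, r"G:\project\data\s3\S3A_OL_2_LFR____201604*.SEN3\*.xml"),
    ("2016-05-01", 31, r"G:\project\data\s3\S3A_OL_2_LFR____201605*.SEN3\*.xml"),
]
for start, days, paths in jobs:
    name = f"binning_{start}.xml"
    with open(name, "w") as f:
        f.write(TEMPLATE.format(paths=paths, start=start, days=days))
    subprocess.run(["gpt", name], check=True)  # one gpt call per generated template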

One final question: should one expect gpt to run or process faster than the SNAP graphical interface?

Best wishes,
Julio

Have you tried to remove the single quotes around the date?

I think you know already this page: Bulk Processing with GPT
I just put it here for completeness of this thread.

Actually, both gpt and the desktop should run at equal pace. However, it has been observed that gpt is sometimes slower, probably due to a bug which has not yet been sorted out.

Dear Marco,

I have managed to produce some monthly composites using the help provided in this thread, thank you for that.
However, I am now attempting to produce 8-day composites and am running into a new issue.
For the monthly composites, to reduce the number of scenes the tool was looping through, I used the * wildcard character to set the source product paths to look at the specific month, as shown in the attached example below.
sen3_binning_europe_20170201_500m_v2.xml (1.6 KB)

This reduced the time it was taking to produce a composite from 6 hours to below 2.
However, I am unable to use this technique when producing 8-day composites that bridge two months. Is it possible to set multiple sourceProductPaths in the XML file to restrict the code to loop through the files for the two months of data?

Cheers, Harry

Good that it works for the monthly.

Yes, you can specify multiple paths, separated by a comma.
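
For example, for an 8-day composite spanning two months it could look like this, everything on one single line (paths follow the earlier example and are hypothetical):

<sourceProductPaths>G:\project\data\s3\S3A_OL_2_LFR____201611*.SEN3\*.xml,G:\project\data\s3\S3A_OL_2_LFR____201612*.SEN3\*.xml</sourceProductPaths>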

Dear Marco,

I have tried to specify multiple paths as you recommended above. However, when I run this XML file the command prompt appears with a message stating that it is expanding the sourceProductPaths wildcards, but then it stops without creating a product.
sen3_binning_europe_20161124_500m_8days_v2.xml (2.0 KB)

Is there a limit to the number of source paths that can be specified, or is the error occurring because the source paths filter to different dates?

Hi Harry,

In the sourceProductPaths field, the product paths should be separated only by commas. In your file you have a new line after each product path; if you remove them, it should work.

Best, Roman

Dear Marco,

I am having issues with the gpt Binning tool. I simply ran a previous .xml file that was working fine and I got this error message:

Error: [NodeId: s3binparams] latBand.getProduct().getSceneRasterWidth() < 2 || latBand.getProduct().getSceneRasterHeight() < 2

In SNAP I looked at the product mentioned in the log right before the error message (see attached figure) and it does not seem to be broken. On the left-hand side is the product without any of the quality masks used in the .xml; on the right, the same product with all masks shown in red.

This is the product:

S3A_OL_2_LFR____20160426T081012_20160426T081312_20180205T135515_0179_003_249_2880_LR2_R_NT_002.SEN3

Nothing has changed since the last time the tool worked: same S3 products, same .xml, same line of code. Any clue what the problem may be?

Best wishes,
Julio

[screenshot: SNAP version information]

S3_L3_10dd_20160420_20160430_1km_gpt_template.xml (4.6 KB)

A post was split to a new topic: Binning and Time information