I am trying to use SNAP 7.0 as the pre-processor for StaMPS.
When I issue “python coreg_ifg_topsar.py project.conf”, I receive this error message: Operator ‘SpectralDiversityOp’: Unknown element ‘useSuppliedShifts’.
I don’t know how to correct it.
So I would appreciate some help from you!
Thank you very much!
Please provide more information… such as your project.conf,
and also please provide the graph containing SpectralDiversityOp, because what you are describing sounds to me like the new SNAP version modified some tags with respect to version 6.
I have a problem exporting the results of coregistration and interferogram formation to StaMPS with stamps_export.py. The problem is:
[NodeId: StampsExport] The Product ‘20170301_20150628_IW1’ already contains a band with the name ‘i_20170301.rslc’.
The previous steps completed successfully. I uninstalled SNAP and reinstalled SNAP 6. The “project.conf” is:
Hi mdelgado, I have a problem when I run the snap2stamps script ‘python coreg_ifg_topsar.py project.conf’. After starting it, it seems to be stuck on the first slave and shows no progress; I have let it run for more than 12 hours. This is my project.conf, and I have three bursts.
######### CONFIGURATION FILE ######
###################################
I need more information to find the problem…
Please provide
My first question would be… have you modified any graph?
And second: as you said, you uninstalled SNAP and reinstalled v6… had you processed the previous steps with a higher SNAP version?
Thank you for your reply!
I have only three bursts. You say my issue may be the SNAP configuration; the snap2stamps user manual mentions editing $HOME/snap/bin/gpt.vmoptions and modifying the parameter
-Xmx 12G (according to your computer setup; e.g. from -Xmx 512M). Should I simply change the parameter from 512M to 12G, or is there any other solution?
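For reference, a minimal sketch of what $HOME/snap/bin/gpt.vmoptions could look like after that change; the value is an assumption for illustration, so pick it according to your available RAM rather than copying it verbatim:

# gpt.vmoptions: one JVM option per line
# Illustrative heap size only; roughly 70-80% of physical RAM is a common choice
-Xmx12G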
SNAP STDOUT:INFO: org.esa.snap.core.gpf.operators.tooladapter.ToolAdapterIO: Initializing external tool adapters
SEVERE: org.esa.s2tbx.dataio.gdal.activator.GDALDistributionInstaller: The environment variable LD_LIBRARY_PATH is not set. It must contain the current folder ‘.’.
Executing processing graph
INFO: org.hsqldb.persist.Logger: dataFileCache open start
Hello mdelgado, this is my processing procedure. It works without errors, but it takes a very long time. Does there seem to be any problem with this process? Thank you!
I am glad that it works.
Regarding the timing… you should probably configure SNAP to optimize the computing resources. There are some parameters you can modify, but in the end the performance will depend entirely on your computer and the number of bursts you want to process from the same product.
In that sense… if you work with 2 bursts it takes 2-4 minutes, depending on your computer. If you use 4-5 bursts it takes much longer, and the relationship is not linear.
You should find the proper combination. For tuning the parameters, please look in the forum for the right thread. I remember there are two files, snap.properties and snap.conf.
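For illustration, the kind of entries usually adjusted live in $SNAP_HOME/etc/snap.properties; the keys below are an assumption based on common SNAP performance tuning and should be checked against the file shipped with your own installation:

# snap.properties tuning sketch (example values, not recommendations)
# Number of threads gpt may use in parallel; match it to your CPU cores
snap.parallelism = 8
# JAI tile cache size in MB used while executing the graph
snap.jai.tileCacheSize = 8192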
Hi mdelgado, I have a new problem with the coregistration and interferogram step. Before running it, I prepared the corresponding DEM in ‘’.snap/auxdata/dem’’, but when I process this step it always downloads the corresponding DEM again automatically. I’ve never had this problem before. Is there any solution? Thank you!
Not sure what you are trying to do… if you want to include an external DEM, you may need to tune the corresponding graphs to use it directly, as there is no option for this in project.conf in the current release.
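If you do go that way, a hand-edited fragment of the coregistration graph could point the Back-Geocoding node to a local DEM. This is only a sketch: the path is a placeholder, and the sources and remaining parameters of the node are assumed to stay as in the original snap2stamps graph:

<node id="Back-Geocoding">
  <operator>Back-Geocoding</operator>
  <!-- sources and other parameters kept as in the original graph -->
  <parameters>
    <!-- Sketch: use a local file instead of the auto-downloaded DEM -->
    <demName>External DEM</demName>
    <externalDEMFile>/path/to/your_dem.tif</externalDEMFile>
    <externalDEMNoDataValue>0.0</externalDEMNoDataValue>
  </parameters>
</node>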
I have this error while processing. I recently formatted my computer and tried to install SNAP, StaMPS, and snap2stamps for processing the data, but I am not sure whether they are installed properly. Can someone please help me check the installations?
I can provide access to my computer to check the installations.
In case this is still an issue: I just found out that SNAP 7.0 changed the parameter “useSuppliedShifts” of the “SpectralDiversityOp” operator, replacing it with two new parameters: “useSuppliedAzimuthShift” and “useSuppliedRangeShift”.
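In graph terms, that means the spectral-diversity node of the coreg/ifg graph needs the two new elements instead of the old one. A sketch, assuming the usual graph alias Enhanced-Spectral-Diversity for SpectralDiversityOp, default values of false, and the rest of the node left untouched:

<node id="Enhanced-Spectral-Diversity">
  <operator>Enhanced-Spectral-Diversity</operator>
  <parameters>
    <!-- SNAP 6 element, no longer accepted by SNAP 7:
         <useSuppliedShifts>false</useSuppliedShifts> -->
    <!-- SNAP 7 replacements -->
    <useSuppliedAzimuthShift>false</useSuppliedAzimuthShift>
    <useSuppliedRangeShift>false</useSuppliedRangeShift>
  </parameters>
</node>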
I believe these answers already appear several times in the forum.
It is sometimes worth using the search engine to get problems solved easily and to avoid duplicated issues and questions.
Still, good that you managed to solve it.