When I checked which parameters can be set for the Calibration operator using gpt (via gpt -h Calibration), I found a parameter called outputImageScaleInDb.
So I dug into the code on this page to see how it takes effect.
If I understood correctly, when createBetaBand or createGammaBand is set to true together with outputImageScaleInDb, the operator should output the calibrated band in dB scale and append _dB to the end of the band name.
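(As far as I can tell from the code, the dB scaling itself is just the standard conversion value_dB = 10 * log10(value), applied to the calibrated band; please correct me if that reading is wrong.)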
Therefore, I created the following GPF graph for testing:
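This is the gist of it, a simple Read → Calibration → Write chain (file names below are placeholders, not my actual paths):

```xml
<graph id="CalibrationDbTest">
  <version>1.0</version>
  <!-- Read the input Sentinel-1 product -->
  <node id="Read">
    <operator>Read</operator>
    <sources/>
    <parameters>
      <file>S1_input_product.zip</file>
    </parameters>
  </node>
  <!-- Calibrate, requesting beta0/gamma0 bands in dB scale -->
  <node id="Calibration">
    <operator>Calibration</operator>
    <sources>
      <sourceProduct refid="Read"/>
    </sources>
    <parameters>
      <createBetaBand>true</createBetaBand>
      <createGammaBand>true</createGammaBand>
      <outputImageScaleInDb>true</outputImageScaleInDb>
    </parameters>
  </node>
  <!-- Write the result out as BEAM-DIMAP -->
  <node id="Write">
    <operator>Write</operator>
    <sources>
      <sourceProduct refid="Calibration"/>
    </sources>
    <parameters>
      <file>calibrated_output.dim</file>
      <formatName>BEAM-DIMAP</formatName>
    </parameters>
  </node>
</graph>
```

I ran it with gpt and the graph file as the first argument.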
Unfortunately, it didn't seem to work, no matter whether my input was an SLC IW or a GRD IW product.
Can anyone give me a hint about what I might be doing wrong?
Any help will be appreciated!
Thank you in advance~