Since SNAP v2 beta 06, the calibrated beta nought images no longer appear to be properly absolutely calibrated. Whereas gamma nought or beta nought images previously fell roughly in the range of -25 to +5 dB, they are now shifted about 30 dB toward the positive side. This happens whether one uses the tool's dB conversion directly (Linear to/from dB) or computes 10*log10(DN) in the band maths.
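For reference, the conversion referred to above is just 10*log10 of the calibrated linear backscatter. A minimal NumPy sketch (function and variable names are my own, not SNAP API) showing how a ~30 dB block shift would look on a single pixel:

```python
import numpy as np

def to_db(linear):
    """Convert linear backscatter (e.g. beta nought) to decibels."""
    linear = np.asarray(linear, dtype=float)
    # Non-positive values have no dB equivalent; mask them as NaN
    return 10.0 * np.log10(np.where(linear > 0, linear, np.nan))

# A properly calibrated pixel with beta0 = 0.01 (linear) sits at -20 dB;
# a ~30 dB positive block shift would push it to about +10 dB,
# which matches the skewed values reported above.
print(to_db(0.01))          # -20.0
print(to_db(0.01) + 30.0)   # 10.0
```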
Do others concur that the radiometric calibration has lost its proper absolute reference?
In beta 06 there was a block shift of over 30 dB that applied to the whole histogram. That has been fixed in beta 07. Due to speckle, with properly calibrated data you should still expect to see a wider distribution of per-pixel values (outside -25 dB to +5 dB) when you zoom in to full resolution in an SLC.
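The speckle point can be illustrated numerically: single-look SAR intensity is approximately exponentially distributed around the scene mean, so individual pixels spread far beyond the range that averaged data occupies. A quick simulation (the -10 dB mean is an arbitrary example value, not from the data above):

```python
import numpy as np

rng = np.random.default_rng(42)

mean_db = -10.0                        # assumed mean backscatter of a scene
mean_linear = 10 ** (mean_db / 10.0)

# Single-look speckle model: intensity ~ exponential with the scene mean
intensity = rng.exponential(scale=mean_linear, size=100_000)
pixels_db = 10.0 * np.log10(intensity)

lo, hi = np.percentile(pixels_db, [1, 99])
print(f"1st-99th percentile: {lo:.1f} dB to {hi:.1f} dB")
# The per-pixel spread spans well over 25 dB even though the mean is fixed,
# so full-resolution SLC pixels routinely fall outside -25..+5 dB
```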