The difference between linear and dB radiometric calibration for SAR images

I don't really understand the difference between linear and dB radiometric calibration, especially when using the calibrated image as input for a classification approach. Is there a right or wrong calibration type in this case?

Thank you!

There's not really a right or wrong way to proceed; you can try both. dB images are usually easier on the eye, so I personally would choose them.

Thank you! So, to confirm: for automatic classification using clustering algorithms there is no right or wrong calibration type?

I would suspect it depends on the algorithm; perhaps for some algorithms it makes no difference at all. IMO the dB scale should be easier to cluster, since the distribution of the data is less skewed than on the linear scale.

I'd say dB would be better for clustering, as the values show a (nearly) normal distribution after conversion to dB. This could probably be handled better by the clustering algorithms.
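To illustrate the point about skewness: a minimal sketch in Python, assuming the linear-scale backscatter is roughly exponentially distributed (a common first-order model for single-look SAR intensity under speckle). The dB conversion is just `10 * log10(x)`; the simulated data and the `skewness` helper are illustrative, not part of any particular SAR toolbox.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear-scale backscatter (sigma0): single-look SAR intensity
# is often modelled as exponentially distributed, i.e. heavily right-skewed.
sigma0_linear = rng.exponential(scale=0.05, size=100_000)

# dB conversion: sigma0_dB = 10 * log10(sigma0_linear)
sigma0_db = 10.0 * np.log10(sigma0_linear)

def skewness(x):
    """Sample skewness (third standardized moment, no bias correction)."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s = x.std()
    return ((x - m) ** 3).mean() / s ** 3

# The linear data is strongly right-skewed; after dB conversion the
# distribution is much more symmetric (mildly left-skewed).
print(f"skewness (linear): {skewness(sigma0_linear):.2f}")
print(f"skewness (dB):     {skewness(sigma0_db):.2f}")
```

Distance-based clustering methods such as k-means implicitly assume roughly compact, symmetric clusters, which is why the more symmetric dB-scale values tend to behave better as input features.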