Min(@,@) & max(@,@) not returning values as expected

BandMath Expression used:

(Sigma0_VH_db - min(Sigma0_VH_db, Sigma0_VH_db)) / (max(Sigma0_VH_db, Sigma0_VH_db) - min(Sigma0_VH_db, Sigma0_VH_db))

Expected: [Sigma0_VH_db - min(Sigma0_VH_db)] / [max(Sigma0_VH_db) - min(Sigma0_VH_db)]
Essentially, an output histogram in the range [0.0, 1.0], as shown in the right image below.

OBTAINED: NaN output
[image]

I just want the min and max of the band “Sigma0_VH_db”, but the function expects two input parameters, e.g. min(@,@) for the minimum. Can you please shed some light on this, as the help section isn’t clear? Thanks in advance!

min(@,@) and max(@,@) just compare two scalar quantities, evaluated pixel by pixel. That means min(Sigma0_VH_db, Sigma0_VH_db) and max(Sigma0_VH_db, Sigma0_VH_db) both return the pixel value itself, so your denominator is zero at every pixel and the expression evaluates to 0/0, which is NaN. Band maths normally operates pixel by pixel and produces a new image; it has no function that reduces a whole band to a single statistic. For summary statistics over a full image, use GPT StatisticsOp. You can also export histogram data to a flat file (e.g., CSV), which would let you use an external program to generate normalized histograms.
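As a sketch of the external-program route, assuming the band has already been exported to a 2-D NumPy array (e.g. from a GeoTIFF or CSV dump; the function name and sample values below are made up for illustration), the intended min-max normalization could look like:

```python
import numpy as np

def minmax_normalize(band):
    """Scale a 2-D band array into [0.0, 1.0], ignoring NaN pixels."""
    lo = np.nanmin(band)   # whole-image minimum, the min(Sigma0_VH_db) the question wanted
    hi = np.nanmax(band)   # whole-image maximum
    if hi == lo:
        # Constant band: avoid the 0/0 that produced NaN in the BandMath expression
        return np.zeros_like(band, dtype=float)
    return (band - lo) / (hi - lo)

# Tiny illustration with made-up dB values
sigma0_vh_db = np.array([[-20.0, -15.0],
                         [-10.0,  -5.0]])
normalized = minmax_normalize(sigma0_vh_db)
print(normalized.min(), normalized.max())  # 0.0 1.0
```

The resulting array has the [0.0, 1.0] range the expected histogram shows, because the minimum and maximum are computed once over the whole image rather than per pixel.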