# About reflectance of sentinel-2 image

This is about the reflectance displayed when opening a Sentinel-2 band image with SNAP.

The value is different from what I get when opening the jp2 image stored in the zip file with Python.

What kind of calculation does SNAP perform when opening the image?

Here you will probably find all the answers you need:

I am very happy to have the information.

I will refer to it.

This image is an L2A image displayed in Python.
For example, suppose the value stored in a pixel is 448.
In that case, should I apply the QUANTIFICATION_VALUE and conclude that the reflectance is 0.0448?

Yes, that’s correct.
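As a minimal sketch of that conversion (assuming the usual L2A BOA_QUANTIFICATION_VALUE of 10000; always confirm the actual value in your product's MTD_MSIL2A.xml):

```python
def dn_to_reflectance(dn, quantification_value=10000.0):
    """Convert a Sentinel-2 L2A digital number (DN) to BOA reflectance."""
    return dn / quantification_value

print(dn_to_reflectance(448))  # 0.0448
```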

Some pixels have a stored value of more than 10000, i.e. a reflectance above 1.
Why can the reflectance exceed 1 even though it is a dimensionless quantity?

This can happen.
It is explained here:

I will refer to it.

I heard that the value divided by 10000 is the reflectance.
However, there is a difference when comparing the reflectance values between the image in SNAP and the one displayed by my program.
The ratio is about 1:2. Is it correct to divide by 20000?

No, actually not. Do you see this ratio in multiple products?
Can you show how you compare the values?
Maybe you are checking different bands?

The comparison uses the B4 band.
The method compares the reflectance at pixels with the same coordinates in both images.
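A comparison like that can be sketched as follows (the DN array and the SNAP-side reading are synthetic placeholders here, not values from the actual product):

```python
def compare_pixel(dn_array, snap_reflectance, x, y, quant=10000.0):
    """Compare the program's reflectance (DN / quantification value)
    against SNAP's reading at one pixel; return both values and their ratio."""
    prog = dn_array[y][x] / quant
    return prog, snap_reflectance, prog / snap_reflectance

# Synthetic DN of 500 at (x=0, y=0) against a SNAP reading of 0.0250:
print(compare_pixel([[500]], 0.0250, 0, 0))  # (0.05, 0.025, 2.0)
```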

I will attach the actual image.
The image displayed by the program has the coordinates and reflectance written in the lower right corner.

This time, we are comparing with coordinates (x=1529, y=7473).

Reflectance of program image = 0.0500

Reflectance in SNAP = 0.0250

It’s just doubled.

That’s strange.
So far I haven’t heard that SNAP is showing wrong values for Sentinel-2.
Can you show your python program?
You could also check whether the quantification value is correct.
In the zip file you'll find the MTD_MSIL2A.xml file.
Open it in a text editor and search for "quantification"; there you will find the scale factor for the bands.
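The value can also be read programmatically. The snippet below parses a simplified, illustrative fragment; the real MTD_MSIL2A.xml nests this element more deeply and uses XML namespaces, so adapt the search path for the actual file:

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative fragment of what MTD_MSIL2A.xml contains.
xml_snippet = """
<Product_Image_Characteristics>
  <QUANTIFICATION_VALUES_LIST>
    <BOA_QUANTIFICATION_VALUE unit="none">10000.0</BOA_QUANTIFICATION_VALUE>
  </QUANTIFICATION_VALUES_LIST>
</Product_Image_Characteristics>
"""

quant = float(ET.fromstring(xml_snippet).find(".//BOA_QUANTIFICATION_VALUE").text)
print(quant)  # 10000.0
```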

Hello @bakeho

Could you provide the product information (tile and orbit number, date of acquisition) of your L2A so we can assess it, please?

Cheers

Jan

S2 MPC Operations Manager

This is the program I am using.
chap4.py (602 Bytes)

So the code is quite simple and looks good.

It could also be an issue of the PIL library or in the data.
The product you are using (S2A_MSIL2A_20190504T014701_N0211_R017_T53SLV_20190504T043621.SAFE) is offline at the Open Access Hub and not directly available for download. I have requested it, but this can take a while.

Thank you jan.

I am using the data of 2019/05/04.

S2A_MSIL2A_20190504T014701_N0211_R017_T53SLV_20190504T043621
https://scihub.copernicus.eu/dhus/odata/v1/Products('993ac16d-6ac5-4d6b-a419-2148e9f1d906')/$value

I’ve opened this product in SNAP and also the jp2 file separately.
This image shows them side by side.

The Pin is located at the location you have referred to.
You can see the scaled value of B04.
In this image you see the raw value of the jp2 file.

When looking at the values of the jp2 file in QGIS I see the same values.

When I tried it with a png file using the same program, there was no difference in the values.

Does that mean the quantification value is wrong?

No, I don’t think so. The quantification value is correct; it hasn’t changed for years. Reading a jp2 and reading a png are two different implementations, so it might be that there is an issue in the PIL library’s implementation of jp2 reading.
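One hypothetical explanation for an exact 1:2 ratio (an assumption on my part, not confirmed in this thread): Sentinel-2 jp2 files are encoded with 15-bit sample precision, and a decoder that rescales those samples to a full 16-bit range would left-shift every value by one bit, i.e. double it:

```python
def rescale_15bit_to_16bit(dn):
    """Hypothetical decoder behaviour: treating 15-bit samples as 16-bit
    by shifting left one bit doubles every digital number."""
    return dn << 1

# A true DN of 250 (reflectance 0.0250) would be read back as 500 (0.0500),
# which would match the factor of two observed earlier in the thread.
print(rescale_15bit_to_16bit(250))  # 500
```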

The values which QGIS shows are the same as in SNAP. I think this proves that SNAP is correct and also the data.

Thank you for answering many questions.

I will rewrite the program.