Computing L1C TCI from RGB Bands

Hello,

I am trying to reproduce the 8-bit TCI RGB image from the B04, B03 and B02 bands of a product, but I cannot match the exact result.

So far I have tried the following (starting from the 16-bit band data):

  • Clip and normalize each band by 4095, then stretch to [0, 255]*
  • Normalize each band by its per-band maximum, then stretch to [0, 255]**
  • Clip and normalize each band to its [1st percentile, 99th percentile] range, then stretch to [0, 255]***
  • Normalize by 10000, clip at 0.2, then stretch to [0, 255], as per "User Guides - Sentinel-2 MSI - Definitions" on Sentinel Online

In NumPy, where x is the RGB array in int16 built from [B04, B03, B02] and converted to float32:

*: x = np.clip(x / 4095., 0., 1.) * 255. (then cast to np.uint8)
**: x = x / x.max(axis=(0, 1)) * 255.
***: x = np.clip((x - np.percentile(x, 1)) / (np.percentile(x, 99) - np.percentile(x, 1)), 0., 1.) * 255.
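For the fourth attempt (the User Guide method), here is a minimal sketch of how I understand it; the function name is mine, and the offset handling is an assumption based on the processing-baseline 04.00 change, not something I've confirmed against the TCI generator:

```python
import numpy as np

def dn_to_tci(dn, offset=0):
    # Sketch of the conversion described in the Sentinel-2 User Guide:
    # DN -> reflectance (divide by the 10000 quantification value),
    # clip reflectance at 0.2, then stretch to [0, 255].
    # `offset` is an assumption: products from processing baseline
    # 04.00 onward add a radiometric offset of 1000 to the DNs, so
    # pass offset=-1000 for those products.
    reflectance = (dn.astype(np.float32) + offset) / 10000.0
    scaled = np.clip(reflectance / 0.2, 0.0, 1.0) * 255.0
    return scaled.astype(np.uint8)
```

With this, a DN of 2000 (reflectance 0.2) saturates to 255, which matches the "clip at 0.2" description, but it still doesn't tell me whether the official TCI rounds, truncates, or reserves 0 for no-data.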

Is there an official public algorithm to convert [B04, B03, B02] into the TCI JP2?

For reference, there’s also a “custom script” on Sentinel Hub to produce a true-color image (custom-scripts/sentinel-2/true_color/script.js at main · sentinel-hub/custom-scripts · GitHub), but I don’t know what input space it starts from…

Best regards,

Hi,

In Sentinel-2A assets there is a ‘visual’ property which returns an 8-bit RGB image, but I am unable to reproduce it from the 16-bit values of the (B2, B3, B4) bands through any kind of normalization.
I am currently searching all over the internet for such a public algorithm to convert 16-bit Sentinel-2A images to 8-bit images.

I am wondering if you found anything?

The EO Browser says it uses a gain of 2.5 for each of the three bands.
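If the input is reflectance in [0, 1], a gain of 2.5 would amount to something like the sketch below (the function name is mine; the 2.5 factor is the only value taken from the EO Browser script):

```python
import numpy as np

def gain_2_5_rgb(reflectance):
    # Sketch of a 2.5x-gain visualization: multiply reflectance
    # (0..1) by 2.5, clip to [0, 1], then scale to 8-bit.
    # A gain of 2.5 saturates at reflectance 0.4, so this gives a
    # darker image than the TCI's clip at reflectance 0.2.
    return (np.clip(2.5 * reflectance, 0.0, 1.0) * 255.0).astype(np.uint8)
```

Note this is a visualization default, so matching it still wouldn't prove it is the same algorithm used to generate the TCI JP2 itself.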