Sentinel-3 LST single product to web map (Leaflet.js or Mapbox) using Snappy

Hi,
I would like to make an animation of Sentinel-3 LST products placed on a map over time, following their acquisition timestamps.
I’m using Python: I have already downloaded the products with sentinelsat, and the next step is to use snappy to convert the LST band to a PNG reprojected to Web Mercator.
Since a PNG can’t store coordinate information, I was thinking of saving each product’s bounding box in a database or a static file (a sketch of what I mean follows the code below). This is the code to get the corner coordinates, which works fine:

import os
from snappy import ProductIO, PixelPos, GeoPos

directory = r'./data'
for entry in os.scandir(directory):

    if entry.is_dir():
        # Each Sentinel-3 product folder contains an xfdumanifest.xml
        in_file = os.path.join(entry.path, "xfdumanifest.xml")

        product = ProductIO.readProduct(in_file)
        w = product.getSceneRasterWidth()
        h = product.getSceneRasterHeight()

        geocode = product.getSceneGeoCoding()

        # Geographic coordinates of the four scene corners
        p1 = geocode.getGeoPos(PixelPos(0, 0), GeoPos())
        p2 = geocode.getGeoPos(PixelPos(w, 0), GeoPos())
        p3 = geocode.getGeoPos(PixelPos(0, h), GeoPos())
        p4 = geocode.getGeoPos(PixelPos(w, h), GeoPos())
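
For the static-file option, this is roughly what I have in mind (just a sketch: the per-product lines would sit inside the loop above, after computing p1..p4, and bboxes.json is only an example name):

import json

bboxes = {}

# inside the loop above, after computing p1..p4:
bboxes[entry.name] = {
    'start_time': str(product.getStartTime()),  # acquisition timestamp
    'corners': [[p.getLat(), p.getLon()] for p in (p1, p2, p3, p4)],
}

# after the loop: write everything to a static file for the web map
with open('bboxes.json', 'w') as f:
    json.dump(bboxes, f, indent=2)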

These coordinates should then be used to get each product’s bounding box, in pixels, relative to the map. I have two issues:

  • I’m having a hard time finding an example of reprojection with PNG output; does anyone have an example to show me? I tried it in SNAP, but choosing PNG as the output format and “Popular Visualisation Pseudo Mercator” as the projection, I get a blank image. Doing the same with GeoTIFF, I get a 1 GB file.

  • I should also resample each product to be in scale with the global map. Let’s say I want a 1920 px wide map: how should I resample a product accordingly? (A rough sketch of the pixel mapping I have in mind is below.)
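
Here is that rough sketch (it uses the corner GeoPos values from the loop above; the full Web Mercator world is square, so a 1920 px wide world map is also 1920 px tall):

import math

def latlon_to_map_pixels(lat, lon, map_width=1920):
    # Project a WGS84 lat/lon pair onto a full-world Web Mercator map of
    # the given width (latitudes beyond ~±85° fall outside the map)
    x = (lon + 180.0) / 360.0 * map_width
    lat_rad = math.radians(lat)
    y = (1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * map_width
    return x, y

# Pixel bounding box of one product from the four corner GeoPos objects above
corners = [p1, p2, p3, p4]
xs, ys = zip(*[latlon_to_map_pixels(c.getLat(), c.getLon()) for c in corners])
bbox_px = (min(xs), min(ys), max(xs), max(ys))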

Thanks,
Daniele

Perhaps @mdelgado can give you some hints… he has something on GitLab (Binning S3 LST), though it’s not exactly the same thing.

I tried the Level-3 binning function in SNAP, which does most of what I need:

  • reprojection
  • rescaling min and max according to global values
  • managing resolution

The difference is that I need a single image per product, in order to show them sequentially on the map based on the acquisition timestamp. And obviously, I would like to automate it with Python :slight_smile:
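
For reference, the snappy equivalent of the reprojection step that I have been sketching looks roughly like this (untested sketch: it reuses the product variable from the loop in my first post, the output name is just an example, and it writes a GeoTIFF rather than a PNG):

import snappy
from snappy import GPF, ProductIO

# Java HashMap for the operator parameters
HashMap = snappy.jpy.get_type('java.util.HashMap')

# Reproject to Web Mercator (EPSG:3857), i.e. the GUI's
# "Popular Visualisation Pseudo Mercator"
params = HashMap()
params.put('crs', 'EPSG:3857')
params.put('noDataValue', '-9999')
reprojected = GPF.createProduct('Reproject', params, product)

# Write the reprojected product; turning a single band into a PNG
# would still need an extra step
ProductIO.writeProduct(reprojected, 'reprojected_lst', 'GeoTIFF')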

Update, for whoever might be interested: I found this repo that uses gpt command-line calls to do most of what I need in a very simple way. More specifically, I’m using these two lines:

# Run subset to keep bands (later pass bands to arguments)
gpt subset -Ssource=$elem/xfdumanifest.xml -PcopyMetadata='false' -PsourceBands=$bands -t $out_folder/$name -f GeoTIFF-BigTIFF

# Run reproject to get to correct projection.
gpt reproject -Ssource=$out_folder/$name.tif -Pcrs=$epsg -PnoDataValue=-9999 -t $out_folder/$name -f GeoTIFF-BigTIFF

The variables are quite self-explanatory, apart from $epsg: in my case it’s 4326 (which is plain WGS84 lat/lon; Web Mercator itself would be EPSG:3857).
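
To run this over all the downloaded products from Python, I’m simply wrapping the two gpt calls with subprocess, roughly like this (a sketch: the band name and folders are placeholders, and I write the reprojected file under a different name so it doesn’t overwrite the subset):

import os
import subprocess

data_dir = './data'
out_folder = './out'
bands = 'LST'         # placeholder: band(s) to keep
epsg = 'EPSG:4326'    # or 'EPSG:3857' for Web Mercator proper

for entry in os.scandir(data_dir):
    if not entry.is_dir():
        continue
    name = entry.name
    manifest = os.path.join(entry.path, 'xfdumanifest.xml')

    # Subset: keep only the wanted band(s)
    subprocess.run([
        'gpt', 'subset',
        '-Ssource=' + manifest,
        '-PcopyMetadata=false',
        '-PsourceBands=' + bands,
        '-t', os.path.join(out_folder, name),
        '-f', 'GeoTIFF-BigTIFF',
    ], check=True)

    # Reproject the subset to the target CRS
    subprocess.run([
        'gpt', 'reproject',
        '-Ssource=' + os.path.join(out_folder, name + '.tif'),
        '-Pcrs=' + epsg,
        '-PnoDataValue=-9999',
        '-t', os.path.join(out_folder, name + '_reprojected'),
        '-f', 'GeoTIFF-BigTIFF',
    ], check=True)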

There is just one step missing: I need to rescale the values of each product according to the global minimum and maximum values. Does anyone know which gpt command I could use?

Thanks!

gdalinfo -mm $name.tif will give you the min/max values of the GeoTIFF.
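
If you then want to stretch every product to the same global range when exporting the PNGs, gdal_translate should do it, something like gdal_translate -of PNG -ot Byte -scale <global_min> <global_max> 0 255 input.tif output.png (here -scale maps your common source range to 0-255; the file names are placeholders).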

I guess this is what you need.
Glad you solved your issue.
