Sentinel-5P raster creation

The problem is probably that the data is provided with lat/lon values per pixel.
That’s why you get an RGB with 3 channels: two of them are lat and lon.

I have now also noticed (SNAP-945) that when doing a reprojection on the original data, the northern part is cut off.
The red shape represents the reprojected data and the white one is the original data.

There is still a SNAP-only way, but I’m not sure it is better than your solution; it is a bit strange. Obviously, something is not working correctly. It might be because of the S5P data, or maybe there is a bug in SNAP. On the other hand, the behaviour is also correct in a sense: SNAP tries to preserve the best geolocation that is available.

However, here is the SNAP-only solution:
The trick is to do the export two times.

First open the S5p product, then export only the ozone band to GeoTIFF.


Answer the question regarding the flag dataset with ‘No’.

Then open the exported file in SNAP and do the same export again. This time you also have to exclude the tie-point grids on the corresponding tab.


After this, you should get the result you want.

Thanks Marco. The two-export method does in fact produce a raster with one band. But if I open that raster in ArcGIS or QGIS, it’s not in the correct location.
This is the SNAP view of that final raster (left) and the view in ArcGIS (right).

Now if I do the method I explained before, with the Weather and Climate Toolkit, this is what I get.

Not sure what is wrong, but SNAP doesn’t seem to work for me.

Hi Luis,

I had similar issues with S-3 OLCI data. Reprojecting to Geographical Lat/Lon (WGS84) in SNAP before exporting helped. See here.

Best wishes,
Jana

Thanks Jana. I now understand what Marco meant about the northern part being cut off. It’s not just the northern part but also a little bit of the eastern part as well. Every part of the original data that exceeds the footprint of the reprojected data is cut off.

It seems an issue that was resolved almost a year ago is back. I have reopened SNAP-839.
We are working on a more general solution to better handle pixel-based geocoding. Unfortunately, this needs some time; it is not a simple task.

I am one of the main developers of the ESA Atmospheric Toolbox (BEAT).

The toolbox definitely supports S5P, but you will have to use the HARP component to do this (https://github.com/stcorp/harp).
The beatl2 interfaces are obsolete and will be phased out in the near future.

The Python interfaces of HARP are also included in the VISAN application (there is even an ingestion dialog for it in the VISAN menu), but to integrate things in a workflow it might be easier to install HARP itself and use its command line tools.

Note that with the latest release of HARP we included a way to rasterize the S5P data.
The new operation in HARP for L3 gridding is called ‘bin_spatial’.

As an example, for the product S5P_NRTI_L2__O3_____20180712T064913_20180712T065413_03858_01_010000_20180712T072955.nc (available from s5phub) you can do the following:

harpconvert -a 'bin_spatial(241,24,0.1,361,74,0.1)' S5P_NRTI_L2__O3_____20180712T064913_20180712T065413_03858_01_010000_20180712T072955.nc l3.nc

This will grid all HARP-supported quantities from the product to a 240x360 lat/lon grid with latitude edges [24, 24.1, 24.2, …, 47.9, 48] and longitude edges [74, 74.1, …, 109.9, 110].
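As a sanity check, the grid edges implied by those bin_spatial arguments can be reproduced in a few lines of plain Python (a sketch for illustration only; the actual binning is done by HARP itself):

```python
# bin_spatial(lat_edge_count, lat_offset, lat_step,
#             lon_edge_count, lon_offset, lon_step)
# N edge coordinates define N-1 grid cells.
def grid_edges(edge_count, offset, step):
    """Cell-edge coordinates for one grid axis."""
    return [offset + i * step for i in range(edge_count)]

lat_edges = grid_edges(241, 24, 0.1)   # 240 latitude cells
lon_edges = grid_edges(361, 74, 0.1)   # 360 longitude cells

print(len(lat_edges) - 1, len(lon_edges) - 1)   # 240 360
print(lat_edges[0], round(lat_edges[-1], 1))    # 24.0 48.0
print(lon_edges[0], round(lon_edges[-1], 1))    # 74.0 110.0
```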

The gridding algorithm uses an area weighted average and knows how to deal with satellite pixels that cross the dateline boundary or cover the poles.
You can adapt the resolution to whatever size you like. You can also combine the transformation with filtering and other operations. For example:

harpconvert -a 'O3_column_number_density_validity>50;keep(datetime_start,datetime_length,O3_column_number_density,latitude_bounds,longitude_bounds);derive(datetime_stop {time});exclude(datetime_length);bin_spatial(1201,24,0.02,1801,74,0.02);derive(latitude {latitude});derive(longitude {longitude});exclude(longitude_bounds,latitude_bounds)' S5P_NRTI_L2__O3_____20180712T064913_20180712T065413_03858_01_010000_20180712T072955.nc l3.nc

Furthermore, you can use harpmerge to combine data from multiple products into a single output. For example:

harpmerge -a 'latitude>9;latitude<21;longitude>19;longitude<31' -ap 'bin_spatial(101,10,0.1,101,20,0.1)' S5P_NRTI_L2__O3* l3.nc

More information about the HARP operations can be found at: HARP Operations

Note that we also plan to update the website of the Atmospheric Toolbox and add examples of how to use the toolbox to access S5P data there as well. Unfortunately, due to unforeseen circumstances, it took a while for us to be able to start on that activity.


If you want additional help on the Atmospheric Toolbox, I’d suggest sending an email to the BEAT email address that is mentioned on our toolbox page. BEAT is not part of SNAP or STEP and therefore not covered by this forum.


Thank you Sander. I’ll give it a go. Regards

Hello svniemeijer,

I am trying to use harpconvert but I always get an error saying "syntax error, unexpected $undefined, expecting $end", and I cannot figure out what the problem is. I wrote: harpconvert -a 'bin_spatial(-5,279,0.1,111,451,0.1)' E:\S5P_NRTI_L2__O3_____20181101T064128_20181101T064628_05447_01_010102_20181101T072721.nc l3.nc
(I tried different paths, with and without space between the values and so on)
I also tried to use your example file, but near real-time products are not in the S5phub. Do you have any suggestions?

Thank you very much


Dear Anna,

First, I assume you are trying this using a Windows Command Prompt? If so, you should not use single quotes around the operation. Using proper quoting in a Windows Command Prompt is rather tricky. In this case though you can omit the quotes, or use a double-quote character.

Second, you seem to have swapped the arguments to the bin_spatial function. The offset and length values need to be swapped.

Finally, your longitude offset seems too high. You probably want to start at e.g. 86 deg for that specific product.

So your final command (on Windows) should look like:

harpconvert -a bin_spatial(279,-5,0.1,451,86,0.1) E:\S5P_NRTI_L2__O3_____20181101T064128_20181101T064628_05447_01_010102_20181101T072721.nc l3.nc

Dear sniemeijer,

Thank you for your quick response. That was very helpful and the command is working now. Nevertheless, it seems I have problems with the coordinates. I read the help at https://cdn.rawgit.com/stcorp/harp/1.3/doc/html/operations.html and understood it like this: for the first value I take the number of rows/columns of the file plus 1 (my file has an extent of 278 by 450, so I use 279 and 451). The second value is the minimum coordinate in lat and lon; I used R to read the minimum coordinates from the file ("PRODUCT/latitude" and "PRODUCT/longitude"). And the third value, 0.1, is the step size per pixel. Is it a problem that my sample file is at the equator? Even after projecting it to WGS84, the file is way too big and not in the right location.

Kind regards

Dear Anna,

The pixels in the L2 product do not form a regular grid.
A ‘bin_spatial’ operation is what can turn the data into a regular lat/lon grid, but you should then determine yourself what grid resolution to use. This is usually application specific.

A resolution of 0.025x0.025 deg would be about the same magnitude as the S5P resolution (around the equator; near the poles you can use a much coarser grid).

Once you have chosen your resolution, you can determine the size of your grid based on the min/max lat/lon of the file you have, e.g. -5…18 deg at 0.025 -> 920 grid cells, and 86…115 deg at 0.025 -> 1160 grid cells (note that bin_spatial takes the number of cell edges, which is the number of cells + 1).

harpconvert -a bin_spatial(921,-5,0.025,1161,86,0.025) E:\S5P_NRTI_L2__O3_____20181101T064128_20181101T064628_05447_01_010102_20181101T072721.nc l3.nc
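Building on the arithmetic above, here is a small hypothetical Python helper (not part of HARP; the function name is made up for illustration) that derives the bin_spatial arguments from a bounding box and a resolution. It assumes the first and fourth arguments are edge counts, i.e. cells + 1, consistent with the 241/361 example earlier in the thread:

```python
def bin_spatial_op(lat_min, lat_max, lon_min, lon_max, res):
    """Build a bin_spatial() operation string covering the given
    lat/lon bounding box at `res` degrees resolution."""
    lat_cells = round((lat_max - lat_min) / res)
    lon_cells = round((lon_max - lon_min) / res)
    # bin_spatial takes the number of cell edges (cells + 1)
    return (f"bin_spatial({lat_cells + 1},{lat_min},{res},"
            f"{lon_cells + 1},{lon_min},{res})")

# the 0.025 deg grid discussed above
print(bin_spatial_op(-5, 18, 86, 115, 0.025))
# → bin_spatial(921,-5,0.025,1161,86,0.025)
```

The same helper reproduces the earlier ozone example: bin_spatial_op(24, 48, 74, 110, 0.1) gives bin_spatial(241,24,0.1,361,74,0.1).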

Hi there,

After looking at all the valuable inputs from all of you, I also started digging into the matter. My objective was to create a TIFF file from Sentinel-5P data, which is distributed in NetCDF format.
So I also came up with a solution.
I used the SNAP toolbox for this purpose.
I downloaded a Sentinel-5P Level 2 NO2 product using the link below.
https://s5phub.copernicus.eu/dhus/odata/v1/Products('2fbd5daf-5b03-4cc3-a7e0-d963f3b63c59')/$value

The product looked flipped when viewed in SNAP.

Then I subset the dataset using a band subset for my required band, "nitrogendioxide_tropospheric_column".

In the next step I used the Reprojection tool under Geometric Operations with my preferred reference system (EPSG:4326).

The output from this step was a georeferenced TIFF file, which was my objective.

When I open this layer in ArcGIS, it is a three-band layer with "nitrogendioxide_tropospheric_column" as band 1 and lat and lon as bands 2 and 3.

Hope this will be helpful for everybody, and I hope somebody can help me with the analysis part, as I am not good with the statistics from here on.

Enjoy…

Regards

Yasir Shabbir


Hi @svniemeijer, thanks for taking the time to help with questions. I am also trying to automate this process of raster creation using Python. I have a couple of questions I hoped you could help me with.
The example I chose was a methane product (Google Drive file link): "S5P_OFFL_L2__CH4_____010300_20190417T054418.nc"
The docs seem to say the pixels are 7 x 7 km, and the images plotted by @luisgpaixao look the right shape, but when I plot this methane product (using Panoply here) I get a long strip:


Given each ‘pixel’ has a lat/lon, I converted the grid to vector points (Python snippet below).
Printing the lat.min(), lat.max(), lon.min(), lon.max() extents, I get the whole world!
west:-179.9999237060547, south:-89.92754364013672, east:179.9989013671875, north:89.9327392578125

I’m assuming the extreme latitudes are because the strip genuinely seems to go from pole to pole; in which case the extreme longitudes are because the data covers one of the poles, where the longitudes are very close together. This is more obvious in a plot of the points from the CSV.
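The near-pole explanation can be illustrated with a small self-contained check (a haversine distance on a spherical Earth, which is an approximation): two pixels at 89.9 deg latitude that are 358 degrees of longitude apart are physically less than a kilometre from each other.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km on a spherical Earth (R = 6371 km)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

# two near-pole points spanning almost the full longitude range
print(round(haversine_km(89.9, -179.0, 89.9, 179.0), 2))  # ~0.39 km apart
```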

Anyway, is this data such a long strip because it is a different swath capture (akin to IW, EW, etc. for Sentinel-1), or is it captured like this for the methane product?

You mentioned that the HARP command-line tool can bin the data into a regular grid.
Could you show us how to call the same functionality directly within Python? I could always call the tool you mentioned via subprocess(), but it would be great to import the HARP API directly and apply the methods from within Python.
When interpolating/binning to a GeoTIFF grid, could one simply pass HARP the extents as printed above for this product, even though they are so extreme, or will defining the geodesic ‘wgs84’ simply handle this?

Thanks in advance!
Ryan

@dmichelakis interesting?

Python snippet used to generate points:

# imports needed for this snippet
from pathlib import Path
import netCDF4

data = Path(r'/home/vagrant/projects/data/nc_files')
assert data.exists()
methane_variable = 'methane_mixing_ratio_bias_corrected'
netcdf_file = list(data.glob('*S5P_OFFL_L2__CH4____20190411T035326_20190411T053456_07729_01_010300_20190417T054418.nc'))[0]
print(f'netcdf_file being used: {netcdf_file.stem}')
save_path = netcdf_file.with_name('test.tif')  # still haven't made this!

nc_file = netCDF4.Dataset(netcdf_file)['/PRODUCT']

lat = nc_file['latitude']
lon = nc_file['longitude']

west, south, east, north = lon.min(), lat.min(), lon.max(), lat.max()
print(f'west:{west}, south:{south}, east:{east}, north:{north}')

Hi @Ryan, S5P products consist of full orbits (but only the sun-lit part of the earth), and this indeed covers the poles. Sentinel-1 switches between modes many times within an orbit (i.e. datatakes), but this is not the case for Sentinel-5P. You could compare this to Sentinel-5P having a single datatake that covers a full orbit.

You can easily get the data inside Python using the HARP Python interfaces.
(HARP is available via conda: https://anaconda.org/stcorp/harp):

>>> import harp
>>> data = harp.import_product('S5P_OFFL_L2__CH4____20190411T035326_20190411T053456_07729_01_010300_20190417T054418.nc','bin_spatial(181,-90,1,361,-180,1);derive(latitude {latitude});derive(longitude {longitude});squash(time, CH4_column_volume_mixing_ratio_dry_air);keep(latitude,longitude,CH4_column_volume_mixing_ratio_dry_air)')
>>> print(data)
source product = 'S5P_OFFL_L2__CH4____20190411T021156_20190411T035326_07728_01_010300_20190417T033330.nc'
history = "2019-05-03T11:01:36Z [harp-1.6] harp.import_product('S5P_OFFL_L2__CH4____20190411T021156_20190411T035326_07728_01_010300_20190417T033330.nc',operations='bin_spatial(181,-90,1,361,-180,1);derive(latitude {latitude});derive(longitude {longitude});squash(time, CH4_column_volume_mixing_ratio_dry_air);keep(latitude,longitude,CH4_column_volume_mixing_ratio_dry_air)')"

double CH4_column_volume_mixing_ratio_dry_air {latitude=180, longitude=360} [ppbv]
double latitude {latitude=180} [degree_north]
double longitude {longitude=360} [degree_east]

Awesome, thanks @svniemeijer, a one-liner!
I am using PyPI for my Python ecosystem; do you have a PyPI package release (as opposed to Anaconda)?
Otherwise, we have been trying to install from source, but the ./configure step is not finding the HDF4 libraries.
This is what we've run so far (as sudo):

git clone https://github.com/stcorp/harp.git
cd harp/
git submodule init
git submodule update

apt install autoconf automake libtool flex bison
apt install hdf4-tools hdf5-tools

./bootstrap 

./configure --prefix=$HOME/harp HDF5_INCLUDE=/usr/lib/x86_64-linux-gnu/hdf5/serial/include HDF4_INCLUDE=/usr/include/hdf/ HDF4_LIB=/

Struggling to find HDF4_LIB!

We don’t have a pypi installation precisely because of the dependencies on non-python libraries such as hdf4/hdf5.

Using HDF4_LIB=/ will not work; you will have to use a path that contains libdf.so and libmfhdf.so.

OK, that makes sense about PyPI.
I ran

cd /
find . -name "libdf*"

got:

./usr/lib/libdf.so.0
./usr/lib/libdfalt.so.0
./usr/lib/libdfalt.so
./usr/lib/libdfalt.la
./usr/lib/libdfalt.a
./usr/lib/libdf.so.0.0.0
./usr/lib/libdfalt.so.0.0.0

The same directory was found for 'libmfhdf.so', so I've updated the configure command to:

./configure --prefix=$HOME/harp HDF4_INCLUDE=/usr/include/hdf/ HDF4_LIB=/usr/lib/

but i still get


ERROR: HDF4 libraries and/or header files are not found.
Try setting the HDF4_LIB and HDF4_INCLUDE environment variables to the
location of your HDF4 library and include files.


Is the problem that libdf.so is actually libdf.so.0, i.e. with a version number?

You should indeed have libdf.so files without the version number (they usually are provided as symlinks).
Make sure you install the ‘dev’ versions of hdf4/hdf5 if these are missing.

To avoid further polluting this forum thread with installation problems, please send an email to harp@stcorp.nl if you have further issues.

SNAP seems unable to open the S5P .nc file.

How did you succeed in opening the S5P file?