Is there any easy way to retrieve the number of times each tile has been ingested into the Sentinel Scientific Data Hub without having to query the API multiple times? Is there, for instance, a bulk metadata file available similar to what Landsat provides?
Is there perhaps also an orbit file available showing the positions of the different orbits?
And what if you want to do this globally, for all possible tiles? Is querying the API multiple times (to work around the 100-row maximum) the only solution, or is there a regularly updated global metadata file available somewhere?
If there is such a file, then I'm afraid it's not known to me. I presume - and don't quote me on this - the only people to have that sort of information to hand are those responsible for managing the online catalogues (so ESA, Google, Amazon AWS, etc.). It sounds like one of the core functions of a database.
You might use my little tool peps_download.py to download data from PEPS (French Sentinel mirror site).
On Linux, you would have to run two command lines, one for each product format, which in PEPS are stored in two different collections, namely "S2" and "S2ST" (for single tiles).
Using the -n option queries the catalog, but does not download the product.
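If you would rather hit the catalogue directly, PEPS exposes a resto-style search endpoint that peps_download.py queries under the hood. Below is a minimal sketch of building paginated search URLs to walk past the per-request row limit; the endpoint path and parameter names (`startDate`, `completionDate`, `maxRecords`, `page`, `tileid`) reflect the resto API as I understand it, so verify them against the PEPS documentation before relying on this.

```python
# Sketch: building paginated PEPS catalogue search URLs (no download).
# Assumption: PEPS exposes the standard resto search endpoint and parameters.
from urllib.parse import urlencode

PEPS_SEARCH = "https://peps.cnes.fr/resto/api/collections/{collection}/search.json"

def build_search_urls(collection, start_date, end_date, tile=None,
                      max_records=500, pages=3):
    """Build one catalogue search URL per page.

    `tile` (e.g. "31TCJ") only makes sense for the S2ST collection;
    `pages` is how many catalogue pages you are willing to walk.
    """
    urls = []
    for page in range(1, pages + 1):
        params = {
            "startDate": start_date,
            "completionDate": end_date,
            "maxRecords": max_records,
            "page": page,
        }
        if tile:
            params["tileid"] = tile  # assumption: tile id filter for S2ST
        urls.append(PEPS_SEARCH.format(collection=collection)
                    + "?" + urlencode(params))
    return urls

# Example: the first three pages of the single-tile collection for one tile.
for url in build_search_urls("S2ST", "2016-01-01", "2016-12-31", tile="31TCJ"):
    print(url)
```

Fetching each URL (with your PEPS credentials) and summing the returned features gives you the per-tile counts without downloading any products, which is exactly what the `-n` option does.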
Your tool works flawlessly; however, is there perhaps an easy way to specify the tile name for the old products (not the S2ST ones)? In the meantime, I've found that the Google mirror provides such a bulk metadata file, which seems to be updated daily based on the products they ingest: https://storage.googleapis.com/gcp-public-data-sentinel-2/index.csv.gz
See below for the resulting 2016 image. The colour scale is a bit misleading: tile 25XDJ (northern Greenland), for instance, yielded 594 different products due to the overlapping orbits near the poles. However, when specifying lon/lat coordinates in the PEPS tool for verification, multiple tiles are selected.
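For anyone wanting to reproduce such a per-tile count from the Google index file, here is a small sketch. The column names (`MGRS_TILE`, `SENSING_TIME`) are assumptions based on the file as published; check them against the header row of the copy you download.

```python
# Sketch: counting catalogue entries per MGRS tile from the Google mirror's
# index.csv.gz. Column names are assumptions; verify against the real header.
import csv
from collections import Counter

def count_products_per_tile(csv_lines, year=None):
    """Count rows per MGRS tile, optionally restricted to one sensing year."""
    counts = Counter()
    for row in csv.DictReader(csv_lines):
        if year and not row.get("SENSING_TIME", "").startswith(str(year)):
            continue
        counts[row["MGRS_TILE"]] += 1
    return counts

# Usage on the real file (it is large, so stream it):
#
#   import gzip, urllib.request
#   url = "https://storage.googleapis.com/gcp-public-data-sentinel-2/index.csv.gz"
#   with urllib.request.urlopen(url) as resp:
#       with gzip.open(resp, mode="rt") as lines:
#           counts = count_products_per_tile(lines, year=2016)
#   print(counts.most_common(10))
```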
I have made a Python script for harvesting the most recent acquisition plans for both Sentinel-1 and Sentinel-2 by parsing the content of the respective pages on Sentinel Online. The harvester stores the .kml file(s) wherever you want on your computer.
If someone is interested, the script is found here:
The script also allows filtering the swaths down to those that fall within your Area of Interest.