ESA Copernicus data access - Long Term Archive and its drawbacks

Dear SNAP users and developers,

Whether you are interested in time series or not, you have probably faced the problem of offline products.

As a reminder,

Following the recent upgrades to the Data Hub dissemination points, we are pleased to announce that the Long Term Archive (LTA) access interfaces are ready to be operated as part of the nominal regime, to ensure continued access to all Sentinel data at all times.

Access to the product URL for data that are no longer available online will automatically trigger the retrieval of the data from the LTA; the actual download may be initiated once the data are restored (within minutes to hours).

Trying to download an offline image triggers its retrieval, following these rules:

Access to the product URL for data that are no longer available online will automatically trigger the retrieval from the LTA. The actual download can be initiated by the user once the data are restored (within 24 hours).

A user quota on the maximum number of requests per hour per user is set.

Products restored from the long term archives are kept online for a period of at least 3 days. Quota and keeping time will be fine tuned according to the usage patterns to ensure efficient access to the most recent and frequently downloaded data.
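In script terms, the trigger-and-wait behaviour described above can be sketched as follows. This is a minimal stdlib sketch, not an official client: the credentials and product UUID are placeholders, and the status mapping follows the hub's documented behaviour (200 for an online product, 202 when an LTA retrieval has been triggered, 403 when the quota is exceeded):

```python
import base64
import urllib.request
from urllib.error import HTTPError

# Placeholders -- substitute your own credentials and a real product UUID.
HUB = "https://scihub.copernicus.eu/dhus/odata/v1"
AUTH = ("username", "password")

def interpret_status(code):
    """Map the hub's HTTP status codes to a download state:
    200 -> product online, the response body is the data;
    202 -> product offline, retrieval from the LTA was triggered;
    403 -> user quota exceeded, try again later."""
    return {200: "online", 202: "lta-triggered", 403: "quota-exceeded"}.get(code, "error")

def try_download(uuid):
    """Request the product payload once and report the resulting state."""
    request = urllib.request.Request(f"{HUB}/Products('{uuid}')/$value")
    token = base64.b64encode("{}:{}".format(*AUTH).encode()).decode()
    request.add_header("Authorization", "Basic " + token)
    try:
        with urllib.request.urlopen(request) as response:
            return interpret_status(response.status)
    except HTTPError as err:
        return interpret_status(err.code)
```

A script can then poll `try_download` until it reports "online" instead of "lta-triggered".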

But the very restrictive quota leads to a very common error message:

The request is not accepted because the number of submitted requests exceeded the allowed user quota

If you play the game and focus on a single past image, the procedure is annoying but still manageable. If you want to do something more serious, it becomes very painful to work with automated scripts.

I’m probably not the only one who has developed an automatic tool that downloads and processes SAR images using GPT. The Long Term Archive policy is really a pain to work with, and I was wondering how other people are dealing with it.
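For context, such a download-and-process tool typically boils down to wrapping SNAP's gpt executable. A minimal sketch, in which the graph file, parameter names and paths are all hypothetical:

```python
import subprocess

def gpt_command(graph_xml, source, target, params=None):
    """Assemble a SNAP gpt call: gpt <graph> -t <target> -Pkey=value... <source>."""
    command = ["gpt", graph_xml, "-t", target]
    for key, value in (params or {}).items():
        command.append(f"-P{key}={value}")
    command.append(source)
    return command

def run_graph(graph_xml, source, target, params=None):
    """Execute the graph, raising if gpt exits with an error."""
    subprocess.run(gpt_command(graph_xml, source, target, params), check=True)

# Hypothetical example: apply a calibration graph to a downloaded scene.
# run_graph("calibrate.xml", "S1_scene.zip", "calibrated.dim",
#           {"polarisation": "VV"})
```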

Is there a workaround? Note that I’m working on Antarctica, and a lot of mirrors are not accessible.

If many of us are in this situation, is there a way to let ESA know about this issue? Do they plan to do something in the near future?

All the best,

Quentin


Please email your questions to eosupport@copernicus.esa.int

This was rather addressed to the SNAP community:

But yes, I should write again to eosupport to let them know.

Is the ASF archive an option for you then? https://vertex.daac.asf.alaska.edu


Thanks for the link. Is there a way to script automatic downloads using this repository?

Put some products in your cart; you will then be given a Python script which assists you in downloading these data.
I suppose you can easily reuse parts of the script to automate the download process. I am not sure about searching, to be honest.

Thanks a lot. I will definitely try it and let you know the results.

It still requires a fair amount of manual steps and clicks, but that’s better than nothing :wink:

For bulk downloading, I mostly use ASF for the reasons you mentioned. But I sometimes have the feeling that they are missing some of the products.

There is an API for searching as well: https://www.asf.alaska.edu/get-data/api/search-download/
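Scripted searches against that API are straightforward; here is a small sketch that just assembles the query URL (parameter names follow the ASF SearchAPI documentation, but the values here are made up):

```python
from urllib.parse import urlencode

ASF_API = "https://api.daac.asf.alaska.edu/services/search/param"

def asf_search_url(**params):
    """Build an ASF SearchAPI query URL from keyword parameters."""
    return ASF_API + "?" + urlencode(params)

# Example: all Sentinel-1 SLC products since 2019, returned as CSV.
url = asf_search_url(platform="Sentinel-1", processingLevel="SLC",
                     start="2019-01-01T00:00:00UTC", output="csv")
```

The resulting URL can be fetched with any HTTP client, and the product list it returns fed to a download loop.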

You probably should (worry).

I noticed (on sci-hub) that I occasionally see GRD data but not their corresponding SLC images. So from time to time, I make a list of these missing images and send it to eosupport. Within a few days, I receive a confirmation that these SLC data are in fact missing, and they are then added to the database.

I don’t know how ASF handles its database, but I’m not sure these old SLC images are tracked by ASF.

I have a branch of sentinelsat with some additional code for retrieving products from the long term archive:
https://github.com/gbaier/sentinelsat/tree/lta_async. One word of warning: the code is still experimental. In essence, the download_all function puts LTA products on a queue and periodically tries to trigger their retrieval. This works around the user quota. Products that are already online are downloaded while waiting for the LTA products.
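Independently of that branch, the queue-and-retrigger idea can be sketched in a few lines. Here `is_online` and `trigger` are stand-ins for whatever client you use, and the 30-minute default matches the SciHub quota:

```python
import time
from collections import deque

def partition(products, is_online):
    """Split product IDs into an immediately downloadable list and a
    queue of offline products that need LTA retrieval first.
    is_online is a caller-supplied predicate."""
    online = [p for p in products if is_online(p)]
    offline = deque(p for p in products if not is_online(p))
    return online, offline

def drain(offline, trigger, interval_s=1800):
    """Re-trigger LTA retrieval for queued products, one per quota
    interval (30 minutes on SciHub). trigger(product) should return
    True once the product was restored and downloaded; otherwise the
    product is re-queued."""
    while offline:
        product = offline.popleft()
        if not trigger(product):
            offline.append(product)
        time.sleep(interval_s)
```

Online products can be downloaded up front from the first list while `drain` slowly works through the offline queue, which is essentially what the branch's download_all does.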


This means a retrieval rate of one image per hour, if I understand correctly?

Thanks a lot for your developments

My latest information is that users can request a product from the long term archive every 30 minutes. But this is a quota set by the Copernicus Open Access Hub.

For the sake of completeness, ESA also provides a download script that deals with products in the long term archive: https://scihub.copernicus.eu/userguide/BatchScripting#dhusget_script.


Thanks for the link.

Quentin

Did anyone find a stable solution? I have a list of product IDs now; how can I download them using a script?

I think the best you can do is to use alternative mirrors such as PEPS or ASF.

I contacted eosupport for the Sentinel platform, but their response doesn’t fit my needs:

[…] you try and bypass the quota restrictions by using multiple user accounts to access large volumes of data products from the LTA. Please note that the maximum number of products that a single user can request on SciHub is 1 every 30 minutes. An additional quota limit is applied to users of the APIHub of a maximum of 20 products every 12 hours. These restrictions are dictated by ESA, and unless they change their policy, users will have to follow these guidelines. […]

So I guess you will not find what you need using the “official” sci-hub.

Note: in their response, they pointed out the importance of this type of request for understanding user needs. So feel free to contact them and let them know about the issue.
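Until the policy changes, scripts can at least pace themselves against the two quotas quoted in their response (1 product per 30 minutes on SciHub, 20 per 12 hours on the APIHub). A small sliding-window sketch; all function names here are mine, not from any official client:

```python
def seconds_until_allowed(now, history, window_s, max_requests):
    """Delay (in seconds) before one more request fits a sliding-window
    quota of max_requests per window_s seconds. history holds the
    timestamps (in seconds) of past requests."""
    recent = sorted(t for t in history if t > now - window_s)
    if len(recent) < max_requests:
        return 0.0
    # Wait until the oldest request inside the window leaves it.
    return recent[len(recent) - max_requests] + window_s - now

def next_request_delay(now, history):
    """Combine the SciHub (1 per 30 min) and APIHub (20 per 12 h) quotas."""
    return max(seconds_until_allowed(now, history, 30 * 60, 1),
               seconds_until_allowed(now, history, 12 * 3600, 20))
```

A download loop would sleep for `next_request_delay(...)` before each LTA request and append the request timestamp to the history afterwards.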


Hi, your branch looks really useful for working around this issue. I use sentinelsat (not as a dev) and am wondering if you plan to merge this back in via a pull request if it’s working well for you? Cheers

Upstreaming would be great and is also my goal. I still have to debug some issues, though. Another problem is that it currently relies on some Python 3-only features.
