Fast way to download Sentinel-2

@unnic thanks a lot again for your help and time, really much appreciated!

One more thing: once I create the GeoJSON file, how do I connect it with the sentinelsat tool? Do I have to put the GeoJSON file in a certain path?

Thanks a lot!

Just pass the path to the GeoJSON file on the command line, e.g.:
sentinel search --sentinel2 -s 20160704 guest guest D:\Data\location.json -d
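To illustrate the connection (this is my addition, not from the original reply): sentinelsat simply reads the GeoJSON from whatever path you give it and converts the geometry to a WKT footprint for the query, so no special location is needed. A minimal stdlib sketch of that conversion, with a hypothetical helper name:

```python
# Sketch of what sentinelsat does with your GeoJSON internally:
# load the file and turn the polygon into a WKT footprint string.
# The function name is hypothetical; sentinelsat's own helpers are
# read_geojson() and geojson_to_wkt().
import json

def geojson_polygon_to_wkt(geojson_str):
    """Convert the first polygon in a GeoJSON string to a WKT POLYGON."""
    gj = json.loads(geojson_str)
    # a GeoJSON file may be a bare geometry or a FeatureCollection
    geom = gj["features"][0]["geometry"] if gj.get("type") == "FeatureCollection" else gj
    ring = geom["coordinates"][0]  # outer ring of the polygon
    pts = ", ".join("%g %g" % (lon, lat) for lon, lat in ring)
    return "POLYGON((%s))" % pts

sample = '{"type":"Polygon","coordinates":[[[-4.0,55.0],[-3.0,55.0],[-3.0,56.0],[-4.0,55.0]]]}'
# geojson_polygon_to_wkt(sample) -> 'POLYGON((-4 55, -3 55, -3 56, -4 55))'
```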


Thanks a lot for your help , much appreciated!

Can I run this with Python in Visual Studio .NET?

Hi @unnic
Sorry, but I just hit this error after I managed to run the search and download command; it stopped with the error below. Do you have any idea why?

C:\Users\Daniel>sentinel search -d -s 20151219 -c 30 --md5 Alba xxxxx27
Error: API returned unexpected response 503.
Traceback (most recent call last):
  File "c:\anaconda2\lib\", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "c:\anaconda2\lib\", line 72, in _run_code
    exec code in run_globals
  File "C:\Anaconda2\Scripts\sentinel.exe\__main__.py", line 9, in
  File "c:\anaconda2\lib\site-packages\click\", line 716, in __call__
    return self.main(*args, **kwargs)
  File "c:\anaconda2\lib\site-packages\click\", line 696, in main
    rv = self.invoke(ctx)
  File "c:\anaconda2\lib\site-packages\click\", line 1060, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "c:\anaconda2\lib\site-packages\click\", line 889, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "c:\anaconda2\lib\site-packages\click\", line 534, in invoke
    return callback(*args, **kwargs)
  File "c:\anaconda2\lib\site-packages\sentinelsat\scripts\", line 91, in
    corrupt_scenes = api.download_all(path, md5)
  File "c:\anaconda2\lib\site-packages\sentinelsat\", line 268, in do
    for product in self.get_products():
  File "c:\anaconda2\lib\site-packages\sentinelsat\", line 114, in ge
    raise ValueError('API response not valid. JSON decoding failed.')
ValueError: API response not valid. JSON decoding failed.

C:\Users\Daniel>
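For context (my addition, not from the original replies): HTTP 503 means the server was temporarily unavailable, which SciHub often is under load, so sentinelsat received an error page instead of JSON and the decode failed. A generic, hedged client-side workaround is to retry with increasing delays; `fetch` below is a placeholder for whatever call performs the request:

```python
# Hedged sketch: retry a flaky request with exponential backoff.
# `fetch` is any zero-argument callable that raises on failure
# (e.g. a wrapper around the sentinelsat query).
import time

def retry_on_503(fetch, attempts=5, delay=1.0):
    """Call fetch(); on an exception, sleep and retry, doubling the delay."""
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: re-raise the last error
            time.sleep(delay)
            delay *= 2
```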

The CNES Sentinel mirror site, named PEPS, is much faster than SciHub (probably because it has fewer users :wink: )

You may get an account here: , and then download any products you want; the repository is global and updated in real time.

If you want to download the files automatically from the command line, you can use this code:

To use this software, you must first accept the PEPS licence; to do that, download a first product manually.
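As a rough illustration of scripted PEPS access (my addition; the endpoint and parameter names below are assumptions about the resto API that PEPS exposes, so verify them against the PEPS documentation before relying on this):

```python
# Hedged sketch: building a PEPS catalogue search URL.
# PEPS_SEARCH and the parameter names are assumptions, not verified API facts.
from urllib.parse import urlencode

PEPS_SEARCH = "https://peps.cnes.fr/resto/api/collections/S2ST/search.json"  # assumed endpoint

def build_search_url(lat, lon, start_date, end_date, max_records=50):
    """Return a search URL for Sentinel-2 products covering a point."""
    params = {
        "lat": lat,
        "lon": lon,
        "startDate": start_date,        # assumed ISO date parameter
        "completionDate": end_date,     # assumed ISO date parameter
        "maxRecords": max_records,
    }
    return PEPS_SEARCH + "?" + urlencode(params)

url = build_search_url(43.6, 1.44, "2016-07-01", "2016-07-31")
```

The actual download step then needs your PEPS credentials (HTTP basic auth) against the product URLs returned by the search.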


Thank you OHagolle. This is the only place from which I had success downloading the images without data corruption.

Hi @unnic
What if I want to download S2 data, but only for a specific granule? Any idea what the command should be if I am using sentinelsat? Or can you advise other options?

Thanks a lot


You can give this a try to download S2 data from Amazon S3, either full products or specific granules.
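Granule-level download from the AWS mirror is possible because objects are keyed by MGRS tile. The key layout sketched below reflects my understanding of the `sentinel-s2-l1c` bucket (this is my addition; verify against the bucket's documentation before use):

```python
# Hedged sketch: build the S3 key prefix for one Sentinel-2 granule/tile
# in the (assumed) sentinel-s2-l1c bucket layout:
#   tiles/{utm_zone}/{latitude_band}/{grid_square}/{year}/{month}/{day}/{sequence}/
def tile_prefix(utm_zone, lat_band, grid_square, year, month, day, seq=0):
    """Return the assumed S3 key prefix for a single granule."""
    return "tiles/%d/%s/%s/%d/%d/%d/%d/" % (
        utm_zone, lat_band, grid_square, year, month, day, seq)

# tile_prefix(30, "U", "VG", 2016, 7, 4) -> 'tiles/30/U/VG/2016/7/4/0/'
```

With a prefix like that you can list and fetch only the bands you need (e.g. with the AWS CLI or boto3) instead of pulling the whole product.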


Using ArcMap, you can try
When choosing the "Image selection" option (within the Download tool), the download is restricted to those particular images (bands), and only the marked tiles are downloaded (relevant if you are dealing with older multi-tile packages).

I do not use the tool mentioned above anymore.
Here’s the method I prefer now:

In December 2015 I reported to EOSupport (CDS-3622) that the "corrupt" file issue seen when using software that resumes downloads was caused by a Windows server in their download path inserting an error message into the download stream when an error occurs. When the download is resumed at the next byte, which is after the Windows error message, the file ends up corrupt.

My solution was to set retries to 0 so the software I was using (wget/curl) would not automatically resume. On each failure I assumed that the bytes of a Windows error message had been appended to my file (which was not always the case), truncated a number of bytes from the end of the file, and then resumed the download. This runs in a loop, so every failure results in a truncation and a resume. I used an 8192-byte truncation, which is much larger than the appended error message. In Linux this is: truncate -s -8192 ${SAFE}.zip
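The loop described above can be sketched in Python (my addition), with the actual downloader abstracted away; `resume_download` is a hypothetical callable that appends received bytes to the file and raises on failure:

```python
# Hedged sketch of the truncate-before-resume workaround described above.
import os

TRUNCATE_BYTES = 8192  # comfortably larger than the appended error message

def download_with_truncate(path, resume_download, max_tries=20):
    """Keep resuming the download at `path`; after each failure, cut off
    the trailing bytes where a server error message may have been written."""
    for _ in range(max_tries):
        try:
            resume_download(path)
            return True
        except IOError:
            # assume an error message was appended; truncate before resuming
            size = os.path.getsize(path)
            os.truncate(path, max(0, size - TRUNCATE_BYTES))
    return False
```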

The error message appended to the file is always similar to:
<?xml version='1.0' encoding='UTF-8'?><error xmlns=""><code /><message xml:lang="en" /></error>

After implementing the "truncate before resume" workaround, the file corruption rate dropped from 75% to less than 5%.

I now download much faster and ~100% reliably from Amazon:

Google also has a mirror: see

Hi @unnic,

the script you mentioned (from author Max König) has an issue: it reads the whole download into memory before finally writing it to a file on disk. This can be a concern, as you can see from the comments section, where user Sunny reports:
MemoryError: out of memory
This issue becomes especially relevant when dealing with those huge Sentinel-2 multi-tile packages (about 8 GB per package).


Yes, I noticed there was a RAM issue but didn't have the time to look into it (and had plenty of RAM to blast). Thanks for pointing out the source of the problem.
Do you know a way to solve it? I'm not very knowledgeable about the methods used.

Sorry for my late answer here (I missed your reply).

To avoid the memory issue, the HTTP response has to be read block-wise. When using vanilla Python 2 (without any 3rd-party libs), one option to accomplish this is shutil.copyfileobj(), as explained here: .

As an alternative, especially if you want to do some extra work while reading the response (like computing an MD5 sum or reporting progress), you can simply iterate over read(BLOCK_SIZE) manually, as implemented here:

for block in iter(lambda: response.read(BLOCK_SIZE), ''):  # urllib.urlretrieve() uses 8 KiB blocks by default, shutil.copyfileobj() 16 KiB

(When using Python 3 there may be other options; I have not yet investigated this with regard to Python 3.)
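Combining the two ideas above, here is a minimal Python 3 sketch (my own, not from the thread) that copies block-wise and computes an MD5 on the fly; `src` can be any file-like object, such as the response from urllib.request.urlopen():

```python
# Block-wise copy with an MD5 checksum computed on the fly,
# so the whole download never sits in memory at once.
import hashlib

BLOCK_SIZE = 16 * 1024  # shutil.copyfileobj()'s default block size is 16 KiB

def copy_with_md5(src, dst):
    """Copy src to dst in blocks; return the hex MD5 digest of the data."""
    md5 = hashlib.md5()
    for block in iter(lambda: src.read(BLOCK_SIZE), b""):
        dst.write(block)
        md5.update(block)
    return md5.hexdigest()
```

The returned digest can then be compared against the checksum published for the product to detect corruption before unzipping.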

If you are not too keen on using GAFA's clouds, downloading from PEPS is now faster! PEPS is the French collaborative ground segment, and it provides all Sentinel data globally. To allow that at a moderate cost and with low electricity consumption, most of the data is stored on tapes, except for a couple of petabytes on disks, which are used as a cache. A new version of the tool speeds up downloads from PEPS by staging the reading of tapes while downloading the products that are already on disk.

Hi all, would you know if sentinelsat package has a TCI only download option for Sentinel 2?

Hi there

Just for information.

Fortunately, it was acknowledged at many levels that SciHub did not provide enough performance and functionality, and the EU Commission has now procured 5 DIAS platforms that give faster access and more functionality for Sentinel data.

Some of them are very fast. These are the links I could find:
EUMETSAT, ECMWF and Mercator Océan
ATOS Integration, consortium includes T-SYSTEM International, DLR, eGEOS, EOX, GAF, Sinergise Ltd, Spacemetric, and Thales Alenia Space.
Airbus Defence and Space, consortium includes Orange SA, Airbus Defence and Space, Geo SA, Capgemini Technology Services SAS, CLS and VITO
Serco Europe, OVH, Gael Systems and Sinergise Ltd.

Creotech Instruments, Cloud Ferro, Sinergise Ltd, Geomatis SAS, Outsourcing Partner Sp. z o.o., Wroclaw Institute of Spatial Information and Artificial Intelligence Sp. z o.o.


A post was split to a new topic: Blank result after supervised classification. Why?