So it seemed it did the atmospheric correction at the end. Thanks for all help!
Only it did not store the output in the allocated output folder but within the renamed input folder (in my case “S2_1C_Barrax”). It created the folder “S2A_USERarrax.SAFE” and, within that folder, one with the full name ‘S2A_USER_MTD_SAFL2A_PDMC_20150818T202138_R094_V20150818T111059_20150818T111059’.
However, it currently has no detailed description of the Sen2Cor integration into the Toolbox.
Within this manual, there is also a chapter dealing with the long path name.
Please note (!): this is a Windows problem, not a Sen2Cor-related issue. All applications will have this problem.
In your specific case there is another issue: Sen2Cor normally expects a correct product name in order to process successfully, as it uses this for creating the L2A metadata. In your case the product name is S2A_1C_Barrax.safe. It must contain at least “S2A_OPER_LIC” to work correctly. This will also fail on my side.
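A quick pre-check along these lines could catch the problem before processing starts. This is only a minimal sketch, assuming the fragment quoted above (“S2A_OPER_LIC”) is literally what Sen2Cor looks for; the real requirement may be stricter (the full official naming convention), and the example folder names are hypothetical.

```python
# Hedged sketch of a product-name pre-check. REQUIRED_FRAGMENT is taken
# verbatim from the post above; the actual Sen2Cor requirement may differ.
REQUIRED_FRAGMENT = "S2A_OPER_LIC"

def product_name_ok(folder_name):
    """True if the (possibly renamed) product folder still carries the fragment."""
    return REQUIRED_FRAGMENT in folder_name

print(product_name_ok("S2A_1C_Barrax.safe"))        # False: renamed folder, will fail
print(product_name_ok("S2A_OPER_LIC_Barrax.SAFE"))  # True (hypothetical valid name)
```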
Thanks Uwe - L2A processing works, but when the progress meter reaches 100% it just stays there and only gives an option to ‘Cancel’. I waited 5 minutes and then cancelled. The files are complete, so perhaps this is a minor issue to be addressed (or maybe it is just how my machine handles Python calls to the graphics drivers …)
Looking at L2A data now (Jochem - it is good to know I am not alone!)
Thanks for the clarification. I will read the manual, and it is good that you pointed out the naming issue. Also, thanks for all the effort you put into developing the processor; I am sure the atmospheric correction algorithm does a great job.
However, in my opinion the so-called ‘Windows problem’ should be resolved properly. You can imagine that the majority of users install SNAP in a Windows environment. In my earlier experience with BEAM on Windows, the atmospheric correction plugins worked seamlessly for the older MERIS or CHRIS images (no additional software installation or renaming required). I was expecting the same experience for S2 in SNAP, but it appeared to be far more complicated. In both Windows 7 and 10, processing the images caused error messages.
Manually shortening the image name may resolve the issue, but the user cannot shorten it to just any name: as you mentioned above, it needs “S2A_OPER_LIC”. The user should be made aware of that (without having to read this forum first).
I will now try the work-around proposed in the manual (pointing to a network drive), but I believe a lot of trouble could be avoided if informative error messages were provided along with suggestions for solutions. Even better would be if the software first checked the name and, if it is too long, e.g. internally used a temporarily shortened name for further processing.
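The suggested pre-flight check could be as simple as the sketch below. The 260-character limit is the classic Windows MAX_PATH value, and the 100-character allowance for Sen2Cor’s nested granule/band file names is a rough assumption, not a figure from the manual.

```python
# Sketch of the suggested pre-flight check: warn before processing when the
# product path plus Sen2Cor's deeply nested output file names may exceed the
# classic Windows path limit. MAX_PATH is the documented Windows limit; the
# `extra` allowance for nested file names is an assumption.
MAX_PATH = 260

def check_path_length(product_path, extra=100):
    """Return a warning string if the projected path length is too long, else None."""
    projected = len(product_path) + extra
    if projected > MAX_PATH:
        return ("Path may exceed the Windows limit of %d characters; "
                "consider a shorter (temporary) product name or the "
                "network-drive work-around from the manual." % MAX_PATH)
    return None

print(check_path_length("C:/data/" + "x" * 300) is not None)  # True: too long
print(check_path_length("C:/data/short") is None)             # True: fine
```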
Thanks again for all efforts. I will now finally start using the images.
In the beta-7 version of the L2A reader the resolution is not asked for, and only the bands of the highest resolution are displayed, meaning that if the product was processed at 20 m, only the 20 m bands are displayed. In the next update there will be a choice of resolution. Until then, processed images can be opened by opening the jp2 files.
I followed the naming instructions as you suggested, and indeed it was able to write out the image correctly. I also changed DN_Scale to 2000. Thanks for the suggestions.
The BOA reflectance map seems to make sense, as you can see in the screenshot below. Only, I am not sure how to interpret the values. As you know, reflectance is expressed as a factor, typically scaled between 0 and 1. It is also shown on that scale in your manual, see Figure 2-10. However, the obtained L2A map provides values a factor of 1000 bigger, sometimes ranging far above 1000 in the case of clouds. So this makes me wonder: by which factor do I have to divide the image so that the reflectance values make physical sense? This is important in order to correctly interpret the image and apply further processing.
This means that for the currently used satellite images the digital numbers of the bands between 0 and 1000 represent reflectance values between 0 and 1, as you expect. Values > 1000 are also present; as you remarked, you will find them primarily for clouds and highly reflecting structures like roofs and greenhouses. For the L2A correction these saturated values will not be taken into account.
For the moment, the L2A output you receive will still have 2x the output value of the L1C data. This has historical reasons. I am currently preparing a fixed version which will then have the same scaling for the L2A output, i.e. 0–1000 then also represents a reflectance value of 0–1 for the L2A data. For the moment you need to divide the L2A output data by 2000. The fix will be available soon.
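The conversion described above can be sketched in a few lines. This assumes the current 2000 scale factor from the post (it becomes 1000 once the announced fix lands), and it masks out-of-range values such as the saturated cloud/roof pixels mentioned earlier; the DN array is synthetic example data, not real imagery.

```python
import numpy as np

# Convert L2A digital numbers to BOA reflectance per the post above:
# divide by 2000 (the current, pre-fix L2A scale) and mask values that
# fall outside the physical 0..1 range (saturated pixels).
L2A_SCALE = 2000.0  # will become 1000.0 after the announced fix

def dn_to_reflectance(dn, scale=L2A_SCALE):
    """Scale DNs to reflectance; saturated pixels (> 1.0) become NaN."""
    refl = dn.astype(np.float32) / scale
    return np.where(refl > 1.0, np.nan, refl)

dn = np.array([0, 500, 1000, 2000, 4000])  # synthetic example DNs
print(dn_to_reflectance(dn))  # last value is masked as saturated
```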
Thanks for the clarification. It seems we are getting there in processing the images into something interesting, e.g. vegetation products.
Although this is probably beyond the responsibility of the developers, I am a bit worried that dividing by 2000 will lead to extremely low vegetation spectra. In reality, vegetation reflectance in the NIR (B8A; 865 nm) is typically around 0.3–0.6. If I divide the image by 2000, reflectance will at most reach up to 0.25. That is considerably lower than what is typically observed by spectrometers or models. Dividing by 1000 seems to lead to more realistic values when it comes to vegetation. Anyway, I will divide by 2000 and see what comes out when processing against a canopy radiative transfer model.
PS: It would be great if SNAP automatically recognized the RGB bands for L2A in the same way as for L1C. At L2A, SNAP does not know which RGB bands to choose.
Hi, the 10 m resolution Band 8 is not used in the 60 m calculation; instead, B8A is used. The same is true for the 20 m resolution. At 10 m resolution (which of course takes time) all four 10 m bands (2, 3, 4, 8) are generated. The 10 m cirrus band is in general not included, as it is only an input band used for processing; there are no L2A-related outputs for this band. The exact processing parameters of all the bands for the three different resolutions are specified in the ATBD document.
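The band selection described above can be written down as a small lookup. This only encodes the points made in this post (B8A replaces B08 at 20 m and 60 m, all four 10 m bands appear at 10 m, the cirrus band B10 is input-only); the full per-resolution parameters are in the ATBD, not reproduced here.

```python
# Which NIR band the L2A output carries at each resolution, per the post:
# B08 only at 10 m, B8A at 20 m and 60 m.
def nir_band_for(resolution_m):
    return "B08" if resolution_m == 10 else "B8A"

TEN_M_OUTPUT_BANDS = ["B02", "B03", "B04", "B08"]  # the four 10 m bands
CIRRUS_IN_L2A_OUTPUT = False  # B10 is an input-only processing band

print(nir_band_for(10), nir_band_for(20), nir_band_for(60))  # B08 B8A B8A
```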
-Hiding/showing the window does not seem to change anything
-Same result every time I try it
-Tools/Manage External Tools works, but the “New” button leads to the same results.
I found out that if I click the locations where the input fields should be, they start appearing, but there are no captions, buttons, etc., so I can’t save any settings or anything. Additionally, my PC seems to become really, really slow.
Another request/idea for the Sen2Cor Python script: currently, the original *.jp2 tiles are georeferenced, but the corrected tiles aren’t. It is easy to add the georeferencing manually, but in my opinion it would make sense if this were incorporated into Sen2Cor. Currently I’m using gdalcopyproj.py, which actually works great.
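For reference, what gdalcopyproj.py does boils down to copying two pieces of metadata. The sketch below expresses that on plain dicts so it runs without GDAL installed; with the GDAL Python bindings you would read `GetProjection()` / `GetGeoTransform()` from the source dataset and apply them via `SetProjection()` / `SetGeoTransform()` on the corrected tile opened in update mode (`gdal.Open(path, gdal.GA_Update)`). The projection string and geotransform values are hypothetical example data.

```python
# Minimal, GDAL-free sketch of copying georeferencing metadata from the
# original L1C tile to the corrected L2A tile, as gdalcopyproj.py does.
def copy_georeference(src, dst):
    """Copy projection and geotransform metadata from src to dst (plain dicts)."""
    dst["projection"] = src["projection"]
    dst["geotransform"] = src["geotransform"]
    return dst

# Hypothetical metadata of an original, georeferenced jp2 tile:
original = {"projection": "EPSG:32630",
            "geotransform": (300000.0, 10.0, 0.0, 4400000.0, 0.0, -10.0)}
corrected = {}  # the Sen2Cor output tile, lacking georeferencing
copy_georeference(original, corrected)
print(corrected["projection"])  # EPSG:32630
```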