Python code for GLCM texture feature extraction


#1

Does anyone know where I can find Python code for the Gray-Level Co-occurrence Matrix (GLCM) for SAR texture feature extraction? I would like to run texture analysis on SAR Terrain Correction data to produce “entropy”, but through Python.

Your help is much appreciated.


Run processes of SNAP toolbox using python script
#2

Good morning SAR2016,

I once wrote a Python script for it, but I could not really test it because of “out-of-memory” errors on my computer. It should work, though:

snappy_glcm.py (8.7 KB)

Best regards,
Andreas
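
If snappy or SNAP's GLCM operator is not usable (e.g. because of the memory limits discussed below), the matrix and its entropy are simple to compute with NumPy alone. This is a minimal sketch, not the attached snappy_glcm.py: the function names, the single fixed pixel offset, and the restriction to non-negative offsets are my own simplifications.

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Symmetric, normalized Gray-Level Co-occurrence Matrix.

    Counts how often quantized gray levels i and j co-occur at the pixel
    offset (dx, dy); non-negative offsets only, for brevity.
    """
    img = np.asarray(image, dtype=np.intp)
    rows, cols = img.shape
    ref = img[:rows - dy, :cols - dx]   # reference pixels
    nbr = img[dy:, dx:]                 # their neighbours at the offset
    m = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(m, (ref.ravel(), nbr.ravel()), 1)  # accumulate pair counts
    m += m.T                            # count each pair in both directions
    return m / m.sum()                  # normalize to joint probabilities

def glcm_entropy(p):
    """Entropy of a normalized GLCM: -sum p * log2(p) over nonzero entries."""
    nz = p[p > 0]
    return float(-(nz * np.log2(nz)).sum())
```

For example, a perfectly flat image gives entropy 0, while a two-level stripe pattern whose horizontal pairs split evenly between (0, 1) and (1, 0) gives entropy 1 bit. For SAR data you would first quantize the backscatter values to `levels` bins.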


Snappy GPT operators
#3

Thank you, Andreas, for this example! I generally understood the Python API but often struggled with the right syntax.


#4

Good morning, thanks a lot for sharing it.
My SAR data is 1.4 GB, which is why we use a machine with 32 GB of RAM.

I am going to test it on my data (SAR Terrain Correction geotiff) and will keep you posted about it.

PS- This is a good tutorial for those who might be interested in learning more about how GLCM works
http://www.fp.ucalgary.ca/mhallbey/descriptive_stats_group.htm
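
The descriptive statistics that tutorial walks through are all simple weighted sums over the normalized matrix. A hedged sketch (the function name and the particular set of statistics are my own choice, not taken from the tutorial):

```python
import numpy as np

def glcm_stats(p):
    """Common descriptive statistics of a normalized GLCM `p`."""
    i, j = np.indices(p.shape)
    d2 = (i - j) ** 2                  # squared gray-level difference
    return {
        "contrast": float((p * d2).sum()),             # weights pairs by level difference
        "homogeneity": float((p / (1.0 + d2)).sum()),  # large when mass is near the diagonal
        "energy": float((p ** 2).sum()),               # uniformity (angular second moment)
    }
```

For instance, a matrix with all its probability mass split between the (0, 1) and (1, 0) cells has contrast 1.0, homogeneity 0.5, and energy 0.5.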


#5

By the way, this probably helps with the memory issue.


#6

Yes, I already changed it to the maximum available. But the problem is that I work with Sentinel-2 (~6 GB), so with my 8 GB desktop I have no chance, and not even with my private 16 GB laptop. :slight_smile:


#7

Can you elaborate on how to find the “java max mem” setting and how to increase it? I am using only the SNAP Desktop app and getting a Java heap space error, and I am not familiar with snappy and the command line, so kindly help me increase the Java heap space.

My laptop has 12 GB of RAM (64-bit), and I am working with Sentinel-1 data. If I increase the Java heap space, will it help?

Thanks in advance


#8

Look at the following links to learn how to increase SNAP's memory.

Increasing snappy memory (Python): for when Python fails with
java.lang.OutOfMemoryError: Java heap space (or a Data Buffer error)

A good link on increasing memory when using GPT: (comment 6)
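
To summarize the memory fixes discussed above in one place: the heap limit is set in plain-text config files shipped with SNAP. The exact paths and the 10G value below are assumptions from a typical install; adjust them to your system and available RAM.

```
# snappy (Python): edit snappy.ini in <user home>/.snap/snappy/
java_max_mem: 10G

# GPT command line: edit gpt.vmoptions in the SNAP installation's bin folder
-Xmx10G
```

After changing these, restart SNAP (or your Python session) for the new limit to take effect.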