I think the SNAPHU variable should only contain the path to the script itself, so you can remove the second part (usr/share/…snaphu.1.gz). But this should not be the cause of the problem you mentioned in the first post.
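For illustration, this is roughly what I mean; the variable name and paths are assumptions and may look different in your file:

```bash
# keep only the path to the executable in the variable
export SNAPHU="/usr/bin/snaphu"

# not the executable plus the man page, e.g. from a pasted whereis output:
# export SNAPHU="/usr/bin/snaphu /usr/share/man/man1/snaphu.1.gz"
```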
The file looks alright otherwise; you don’t need to modify the other lines, because they are not used when the data is preprocessed with SNAP.
Whenever you open a new terminal, you have to source this file before the commands can be found. What happens when you type snaphu or triangle?
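In case it helps, a minimal sketch of that check (the configuration file name is an assumption, adjust it to your setup):

```bash
# source the configuration file in every new terminal session
source ~/StaMPS_CONFIG.bash

# verify that the binaries are found on the PATH
command -v snaphu     # should print the full path of the snaphu executable
command -v triangle   # should print the full path of the triangle executable
```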
I have started the processing in SNAP again, but I still can’t find the reason for the problem, which is why I would like to know.
Is it possible that I have done something wrong in the SNAP processing, even though my first three steps run smoothly without any error every time?
Sorry, I was browsing the forum on my phone and confused your issue with another one. Obviously, all scripts run fine on your side.
As your error occurs during step 4, there could be something wrong with the unwrapping.
Maybe this solution is an option: https://yunjunzhang.wordpress.com/2015/02/10/doris-9-segmentation-fault-error-in-snaphu/comment-page-1/ It requires compiling snaphu yourself (instead of installing it via apt-get), so you first have to remove the existing instance. I cannot guarantee that this will solve your problem (although the error message is quite similar), but if you are familiar with compiling software, it could be worth a try.
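In rough outline, the procedure would look like the sketch below; the snaphu version, archive name and directory layout are assumptions, so check them against the linked post and the sources you actually download:

```bash
# remove the instance installed via apt-get first
sudo apt-get remove snaphu

# unpack the downloaded source archive and build it
tar -xzf snaphu-v1.4.2.tar.gz
cd snaphu-v1.4.2/src
make                     # the compiled binary ends up in ../bin

# make it available system-wide (or add ../bin to your PATH instead)
sudo cp ../bin/snaphu /usr/local/bin/
```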
Installing third-party packages with “sudo” may result in user configuration data being owned by “root” and/or being stored in root’s home directory. Complex packages often lag behind changes in the OS, or have only been tested on one OS, and may overwrite critical OS programs.
The whereis command lists man pages as well as binaries, which explains why you had a man page in one of your environment variables. Also, your X_BIN variables sometimes refer to the bin directory where a program can be found and sometimes give the full path of the program. The documentation always shows the path to the bin directory where the program has been installed.
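To illustrate (the output below is only an example of the typical whereis format, not taken from your machine):

```bash
whereis snaphu
# snaphu: /usr/bin/snaphu /usr/share/man/man1/snaphu.1.gz
#         ^ binary         ^ man page -- copy-pasting this whole line
#           into a variable is how the man page slips in

# if the documentation asks for the bin directory, use the directory,
# not the full path of the program (example value):
export SNAPHU_BIN="/usr/bin"
```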
Thanks a lot, suribabu! But unfortunately that didn’t work for me…
Apparently, not many people have had the same issue, so it is probably something related to the dataset.