Cocoa or Bash: Test if a file is an executable binary

In Cocoa, how would I test whether a file is an executable binary? Unfortunately, [NSFileManager isExecutableFileAtPath:] also returns true for scripts, directories, and pretty much anything else with the executable bit set, which is not what I want.
While doing it in straight-up Cocoa is my preferred approach, a Bash solution that I can easily wrap in an NSTask would be sufficient.

Directories are easy to filter out in code, but deciding what is a binary and what is not is a little harder, because effectively the only way to know is to open the file and read it, which is something you have to do yourself.
The main problem, however, is what should be considered a binary.
I have seen executable files that had a dozen text lines at the beginning (so, effectively, they were scripts) but where the rest was binary. How would you classify those?
If you are happy to classify them according to how they are loaded, you can try the file command, which will tell you as precisely as possible what a file is.

I don't know Cocoa, but this is a bash solution:
find ../ -type f -perm +111 | \
xargs -n 1 -I {} file "{}" | grep -v text | cut -d: -f1
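If you want to avoid depending on file's text heuristics, another bash sketch (just as easy to wrap in an NSTask) is to check the first four bytes against the Mach-O magic numbers yourself. The magic values below come from the Mach-O headers, and note that cafebabe is shared by fat Mach-O binaries and Java class files, so this is a heuristic, not a guarantee; xxd ships with vim on OS X:
#!/bin/bash
# Sketch: report whether the file given as $1 starts with a Mach-O
# magic number (32/64-bit, either byte order, or a fat binary).
magic=$(xxd -p -l 4 "$1" 2>/dev/null)
case "$magic" in
    feedface|cefaedfe|feedfacf|cffaedfe|cafebabe)
        echo "Mach-O binary" ;;
    *)
        echo "not a Mach-O binary" ;;
esac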

Related

How to locate a string among all files under a directory?

I have a directory containing a bunch of header files from a library. I would like to see how "Uint32" is defined.
So, I need a way to scan over all those header files and print out lines with "Uint32".
I guess grep could help, but I'm new to shell scripts.
What should I do?
There are a couple of ways.
grep -r --include="*.h" Uint32
is one way.
Another is:
find . -name "*.h" | xargs grep Uint32
If you have spaces in the file names, the latter can be problematic.
find . -name "*.h" -print0 | xargs -0 grep Uint32
will typically solve that.
Just simple grep will be fine:
grep "Uint32" *.h*
This will search both *.h and *.hpp header files.
Whilst using grep is fine, for navigating code you may also want to investigate ack (a source-code-aware grep variant) and/or ctags (which integrates with vi or emacs and allows navigation through code in your editor).
ack in particular is very nice, since it'll navigate through directory hierarchies and only work on specific types of files (so for C it'll interrogate .c and .h files, but ignore SCM revision directories, backup files, etc.).
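For instance (assuming a reasonably recent ack is installed; its --cc type switch covers .c and .h files), searching the whole tree would just be:
ack --cc Uint32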
Of course, you really need some form of IDE to give you complete navigation over the codebase.

How do I list all the ACLs of a directory tree?

I have a Debian file server with ACLs.
With getfacl -R I get all the files, but I only need the directories.
I tried to only list the directories and then input that to getfacl:
ls -R | grep ":$" | cut -d: -f 1 > file.txt
and then getfacl $(cat file.txt). But a lot of the directories have spaces in their names, and we can't change them.
Thanks for your help.
Your problem is not really hard, only painful. Sorry about that. To do this really reliably, you may have to write and compile your own command-line tool against the library interface summarized in man 5 acl.
If you happen to use Python (I don't use it myself, being of the Perl persuasion), there appears to be a Debian package, python-pylibacl, which looks as though it might do what you want.
Good luck.
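That said, if shell tools are acceptable after all, a sketch along these lines sidesteps the spaces problem entirely by letting find hand the directory names to getfacl NUL-separated (GNU findutils assumed; /path/to/tree is a stand-in for your share):
# List ACLs for directories only; -print0/-0 keeps names with
# spaces intact.
find /path/to/tree -type d -print0 | xargs -0 getfacl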

Can I search all the text files in a tree (but not the binaries) for a certain string

My best shot so far is (for looking for strings in a directory containing a large C program)
find ~/example_directory -type f \( -name "*.mk" -or -name "*.[sch]" \) -print0 | xargs -0 -e grep "example_string"
Which works pretty well, but it relies on all the interesting things being in .mk makefiles, .c or .h source files, and .s assembler files.
I was thinking of adding in things like 'all files called Makefile' or 'all *.py python scripts', but it occurs that it would be way easier if there were some way to tell find only to find the text files.
If you just run grep on all files, it takes ages, and you get lots of uninteresting hits on object files.
GNU grep supports the -I option, which makes it treat binary files (as determined by looking at the first few bytes) as if they don't match, so they are essentially skipped.
grep -rI <pattern> <path>
The '-r' switch makes grep recurse, and '-I' makes it ignore binary files.
There are additional switches to exclude certain files and directories (I frequently do this to exclude svn metadata, for example)
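For example (GNU grep's --exclude-dir switch is assumed here; .svn is the Subversion metadata directory mentioned above):
# Recursive, binary-skipping search that ignores svn metadata.
grep -rI --exclude-dir=.svn "example_string" ~/example_directory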
Have you looked at ack?
From the top 10 reasons to use ack:
ack ignores most of the crap you do not want to search
...
binary files, core dumps, etc
You can use grep -I to ignore binary files. Using GNU Parallel instead of xargs lets you break the work up across multiple processes, exploiting some parallelism for a speedup.
There is an example of how to perform a parallel grep available in the documentation:
http://www.gnu.org/s/parallel/man.html#example__parallel_grep
find -type f | parallel -k -j150% -n 1000 -m grep -I "example_string"

Bash script to traverse a directory

I have a directory with XML files and other directories. All other directories have XML files and subdirectories, etc.
I need to write a script (bash, probably) that runs java XMLBeautifier directory for each directory. Since my bash scripting skills are a bit rubbish, I would really appreciate a bit of help.
If you have to get the directories, you can use:
$ find . -type d
just pipe this to your program like this:
$ find . -type d | xargs java XMLBeautifier
Another approach would be to get all the files with find and pipe that to your program like this:
$ find . -name "*.xml" | xargs java XMLBeautifier
This takes all .xml files from the current directory and all subdirectories recursively, then hands them to java XMLBeautifier via xargs. Note that xargs passes the names in batches; add -n 1 if the program should be invoked once per file.
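One caveat: plain xargs splits on whitespace, so paths containing spaces will break. A sketch that runs the beautifier once per directory and copes with such names (XMLBeautifier as named in the question):
# NUL-delimited read keeps directory names with spaces intact.
find . -type d -print0 | while IFS= read -r -d '' dir; do
    java XMLBeautifier "$dir"
done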
Find is an awesome tool ... however, if you are not sure of the file names but have a vague idea of what those xml files contain, then you can use grep.
For instance, if you know for sure that all your xml files contain the phrase "correct xml file" (you can change this phrase to whatever you feel is appropriate), then run the following at your command line ...
grep -IRw "correct xml file" /path/to/directory/*
-I skips binary files
-R searches your directory recursively
-w ensures that the pattern matches only as a whole word, not as part of a longer one
Hope this helps!

BASH file attribute gymnastics: How do I easily generate a file with full paths and privileges?

Dear Masters of The Command Line,
I have a directory tree for which I want to generate a file that contains two entries per line: the full path for each file and the corresponding privileges of said file.
For example, one line might contain:
/v1.6.0.24/lib/mylib.jar -r-xr-xr-x
The best way to generate the left-hand column there appears to be find. However, because ls doesn't seem to be able to either read a list of filenames from a file or take them on stdin, it looks like I have to resort to a script that does this for me. ...Cumbersome.
I was sure I'd seen people somehow get find to run a command against each file found, but I must be daft this morning, as I can't seem to figure it out!
Anyone?
In terms of reading said file: there might be spaces in the filenames, so it sure would be nice if there were a way to get some of the existing command-line tools to count fields right to left. For example, we have cut; however, cut is left-hand-first and won't take a negative number to mean starting the numbering on the right (which seems the most obvious syntax to me). ...Without having to write a program to do it, are there any easy ways?
Thanks in advance, and especial thanks for explaining any examples you may provide!
Thanks,
RT
GNU findutils 4.2.5+:
find -printf "$PWD"'/%p %M\n'
It can also be done with ls and awk:
ls -l -d $PWD/* | awk '{print $9 " " $1}' > my_files.txt
stat -c %A file
Will print file permissions for file.
Something like:
find . -exec echo -ne '{}\t\t' ';' -exec stat -c %A {} ';'
Will give you a badly formatted version of what you're after.
It is made much trickier because you want everything aligned in tables. You might want to look into the 'column' command. TBH I would just relax my output requirements a little bit. Formatting output in SH is a pain in the ass.
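For the record, a single find will also do it, handles spaces in names for free, and batches files into as few stat invocations as possible via the + terminator (GNU stat's -c, as in the examples above, is assumed; /path/to/tree is a stand-in, and starting find from an absolute path makes %n print full paths):
# One "path permissions" pair per line.
find /path/to/tree -type f -exec stat -c '%n %A' {} +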
bash 4
shopt -s globstar
for file in /path/**
do
    stat -c "%n %A" "$file"
done
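On the second half of the question, counting fields from the right: awk effectively does this already, since NF holds the number of fields on the current line, so $NF is the rightmost field, $(NF-1) the one before it, and so on. For example:
# Print the last (rightmost) whitespace-separated field.
echo '/v1.6.0.24/lib/mylib.jar -r-xr-xr-x' | awk '{print $NF}'
# prints: -r-xr-xr-x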
