I'm wondering how to make this script work:
for f in *.php
do
for c in seq 1 $#
do
eval `eval echo \$$c`
done
done
The main idea of the script is to execute every command passed to the script on every file. For example:
bash script.sh "grep text \$f" "echo \"Done!\""
Should be the same as:
for f in *.php
do
grep text $f
echo "Done!"
done
I think it's pretty simple, but I've been stuck here for a long time.
Any help?
You can make it work like this:
for f in *.php
do
for arg
do
eval "$arg"
done
done
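Saved as script.sh (the name from the question), the whole answer is just this minimal sketch; the sample commands in the usage line are the ones from the question:

```shell
#!/bin/sh
# script.sh -- run every command given on the command line against each *.php file
for f in *.php
do
    for arg            # with no "in" list, for iterates over "$@"
    do
        eval "$arg"    # the command string may refer to $f
    done
done
```

Invoked as `bash script.sh 'grep text "$f"' 'echo "Done!"'`, each argument is evaluated once per file, with $f set to the current file.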
A more common (and safer) approach would be
find *.php -print0 | xargs -0 -I{} "$@"
which could be run as ./yourscript grep text {}, but it doesn't allow arbitrary shell constructs.
The following simple recipe in your bash script should work.
for arg in "$@"
do
find . -name '*.php' -print0 | xargs -n1 -0 -i $arg {}
done
Related
I'm in zsh.
I'd like to do something like:
find . -iname *.md | xargs cat && echo "---" > all_slides_with_separators_in_between.md
Of course this cats all the slides, then appends a single "---" at the end instead of after each slide.
Is there an xargs way of doing this? Can I replace cat && echo "---" with some inline function or do block?
Very strangely, when I create a file cat---.sh with the contents
cat $1
echo ---
and run
find . -iname *.md | xargs ./cat---.sh
it only executes for the first result of find.
Replace cat---.sh with cat and it runs on both files.
There's no need to use xargs at all here. Following is a properly paranoid approach (robust against files with spaces, files with newlines, files with literal backslashes in their names, etc):
while IFS= read -r -d '' filename; do
printf '---\n'
cat -- "$filename"
done < <(find . -iname '*.md' -print0) >all_slides_with_separators.md
However -- you don't even need that either: find can do all the work itself, both printing the separator and calling cat!
find . -iname '*.md' -printf '---\n' -exec cat -- '{}' ';' >all_slides_with_separators.md
A common usage pattern is xargs sh -c 'command; another' _ where the entire shell script in the quotes will have access to the command-line arguments. The underscore is because the first argument to sh -c will be assigned to $0 (where you'd often see e.g. -sh in a ps listing).
find . -iname '*.md' |
xargs sh -c 'for x; do
cat "$x" && echo "---"
done' _ > all_slides_with_separators_in_between.md
As noted in the comments, you should probably investigate find -print0 and the corresponding xargs -0 option in GNU find (and maybe install it if you don't have it).
You can do something like this, but it can be insecure in some cases (see comments):
find . -iname '*.md' | xargs -I % sh -c '{ cat %; echo "----"; }' > output.txt
You'll rarely need find in zsh; its globbing facilities cover nearly every use case of find.
for f in (#i)**/*.md; do
cat $f
print -- "---"
done > all_slides.md
This looks in the current directory hierarchy for every file that matches *.md in a case-insensitive manner.
For even more efficiency, replace cat $f with < $f; zsh itself will read the file and write its contents to standard output.
Using GNU Parallel it looks like this:
parallel cat {}\; print -- --- ::: **/*.md
I want to make a shell script for searching for a pattern in PDF files (to make them a kind of corpus for myself!)
I stole the following snippet from here
How to search contents of multiple pdf files?
find /path/to/folder -name '*.pdf' | xargs -P 6 -I % pdftotext % - | grep -C1 --color "pattern"
and the output looks like this
--
--
small deviation of γ from the average value 0.33 triggers
a qualitative difference in the evolution pattern, even if the
Can I make this command print the filename?
It doesn't have to be a "one-liner".
Thank you.
Not much. Just split the command into a loop.
find /path/to/folder -name '*.pdf' | while read -r file
do
pdftotext "$file" - | grep -C1 --color "pattern" && echo "$file"
done
EDIT: I just noticed the example included a parallel xargs command. This is not possible with a plain loop, but you can write the pdftotext & grep command into a function and then use xargs.
EDIT2: only print the file when there is a match
It might look something like this:
#!/bin/bash
files=$(find /path/to/folder -name '*.pdf')
function PDFtoText
{
file="$1"
if [ "$#" -ne "1" ]
then
echo "Invalid number of input arguments"
exit 1
fi
pdftotext "$file" - | grep -C1 --color "pattern" && echo "$file"
}
export -f PDFtoText
printf "%s\n" ${files[@]} | xargs -n1 -P 6 -I '{}' bash -c 'PDFtoText "$@" || exit 255' arg0 {}
if [[ $? -ne 0 ]]
then
exit 1
fi
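The `|| exit 255` is what stops the whole run on the first failure: with GNU xargs, a command that exits with status 255 makes xargs abort immediately, and xargs itself then exits with status 124, which the final `if` turns into the script's own exit 1. A minimal demonstration of that behavior (toy input, assuming GNU xargs):

```shell
# Three items; the handler fails with 255 on "b", so "c" is never processed.
printf 'a\nb\nc\n' | xargs -n1 sh -c 'test "$1" != b || exit 255; echo "$1"' sh
echo "xargs exit status: $?"   # 124 with GNU xargs
```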
Why not use something like
find /path/to/folder/ -type f -name '*.pdf' -print0 | \
xargs -0 -I{} \
sh -c 'echo "===== file: {}"; pdftotext "{}" - | grep -C1 --color "pattern"'
It always prints the filename. Do you think it's an acceptable compromise? Otherwise the echo part can be moved after the grep with a && as suggested before.
I prefer to use -print0 in combination with -0 just to deal with filenames with spaces.
I'd remove the -P6 option because the output of the 6 processes in parallel could be mixed.
I want to make some custom commands for my terminal (I'm using Ubuntu).
I've already learned that I need to, for example, edit the '.bash_aliases' file (in /home/your_user_name/), type 'source ~/.bash_aliases', and it should then work.
Well, some things really work. For example, if I write (in '.bash_aliases') something like:
my_comm(){
if [ "$1" = aaa ]; then
echo hi a
fi
if [ "$1" = bbb ]; then
echo hello b
fi
#echo this is a comment :]
echo ending echo
}
then if I save the file, type 'source ~/.bash_aliases', and run:
my_comm
it will print:
ending echo
and writing
my_comm bbb
will give:
hello b
ending echo
That's nice, but I want to know a few more things, and I can't find them on Google :(
------------------------------------------QUESTIONS----------------------------------------
(1)
how can I set a variable and then get the variable?
like:
var myVar = "some_dir"
cd /home/user/'myVar'/some_sub_dir/
?
(2)
I want to make a function to shortcut a find/grep command that I use often:
find . -name "var_1" -print0 | xargs -0 grep -l "var_2"
I did something like:
ff(){
find . -name '"$1"' -print0 | xargs -0 grep "$3" '"$2"'
}
so, now executing:
ff views.py url -l
should give me:
find . -name 'views.py' -print0 | xargs -0 grep -l 'url'
but instead I receive:
grep: find . -name "$1" -print0
: There is no such file or directory
help pls :)
(1) how can I set a variable and then get the variable?
Like this:
myVar="/long/name/may have/a space/"
....
cd /home/user/"$myVar"/someSubDir
Double quotes don't prevent variable substitution (unlike single quotes).
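A quick way to see the difference between the two quoting styles:

```shell
myVar="some_dir"
echo "/home/user/$myVar/sub"   # double quotes: prints /home/user/some_dir/sub
echo '/home/user/$myVar/sub'   # single quotes: prints /home/user/$myVar/sub
```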
(2) I want to make a function to shortcut a find/grep command that I use
often:
find . -name '"$1"' -print0 | xargs -0 grep "$3" '"$2"'
You achieve nothing useful with multiple kinds of quotes here; in fact, you prevent $1 and $2 from being substituted, and that breaks your function. Try this:
find . -name "$1" -print0 | xargs -0 grep "$3" "$2"
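Wrapped back into the function from the question, the fixed version might look like this (a sketch; the argument order is the one the question uses, so the grep option comes third):

```shell
ff(){
    # $1 = filename pattern, $2 = text to search for, $3 = grep option (e.g. -l)
    find . -name "$1" -print0 | xargs -0 grep "$3" "$2"
}
```

Now `ff views.py url -l` runs `find . -name 'views.py' -print0 | xargs -0 grep -l 'url'` as intended.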
You can even use the alias keyword for single instructions, or use the function keyword to combine a couple of instructions into one. You can have a look at this.
find . -name "filename including space" -print0 | xargs -0 ls -aldF > log.txt
find . -name "filename including space" -print0 | xargs -0 rm -rdf
Is it possible to combine these two commands into one so that only 1 find will be done instead of 2?
I know there may be ways to do it with xargs -I, which may lead to errors when processing filenames that include spaces. Any guidance is much appreciated.
find . -name "filename including space" -print0 |
xargs -0 -I '{}' sh -c 'ls -aldF {} >> log.txt; rm -rdf {}'
Ran across this just now, and we can invoke the shell less often:
find . -name "filename including space" -print0 |
xargs -0 sh -c '
for file; do
ls -aldF "$file" >> log.txt
rm -rdf "$file"
done
' sh
The trailing "sh" becomes $0 in the shell. xargs provides the files (returned from find) as command-line parameters to the shell; we iterate over them with the for loop.
If you're just wanting to avoid doing the find multiple times, you could do a tee right after the find, saving the find output to a file, then executing the lines as:
find . -name "filename including space" -print0 | tee my_teed_file | xargs -0 ls -aldF > log.txt
cat my_teed_file | xargs -0 rm -rdf
Another way to accomplish this same thing (if indeed it's what you're wanting to accomplish), is to store the output of the find in a variable (supposing it's not TB of data):
founddata=`find . -name "filename including space" -print0`
echo "$founddata" | xargs -0 ls -aldF > log.txt
echo "$founddata" | xargs -0 rm -rdf
I believe all the answers by now have given the right ways to solve this problem. I tried Jonathan's two solutions and Glenn's, all of which worked great on my Mac OS X. mouviciel's method did not work on my OS, maybe due to some configuration reasons. I think it's similar to Jonathan's second method (I may be wrong).
As mentioned in the comments to Glenn's method, a little tweak is needed. So here is the command I tried which worked perfectly FYI:
find . -name "filename including space" -print0 |
xargs -0 -I '{}' sh -c 'ls -aldF {} | tee -a log.txt ; rm -rdf {}'
Or better as suggested by Glenn:
find . -name "filename including space" -print0 |
xargs -0 -I '{}' sh -c 'ls -aldF {} >> log.txt ; rm -rdf {}'
As long as you do not have newlines in your filenames, you do not need -print0 for GNU Parallel:
find . -name "My brother's 12\" records" | parallel ls {}\; rm -rdf {} >log.txt
Watch the intro video to learn more: http://www.youtube.com/watch?v=OpaiGYxkSuQ
Just a variation of the xargs approach without that horrible -print0 and xargs -0, this is how I would do it:
ls -1 *.txt | xargs --delimiter "\n" --max-args 1 --replace={} sh -c 'cat {}; echo "\n"'
Footnotes:
Yes, I know newlines can appear in filenames, but who in their right mind would do that?
There are short options for xargs but for the reader's understanding I've used the long ones.
I would use ls -1 when I want non-recursive behavior rather than find -maxdepth 1 -iname "*.txt" which is a bit more verbose.
You can execute multiple commands after find using for instead of xargs:
IFS=$'\n'
for F in `find . -name "filename including space"`
do
ls -aldF "$F" >> log.txt
rm -rdf "$F"
done
The IFS defines the Internal Field Separator, which defaults to <space><tab><newline>. If your filenames may contain spaces, it is better to redefine it as above.
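The effect is easy to see with a name that contains a space (a small illustration; the filename is made up):

```shell
name="my file.txt"

set -- $name    # default IFS: the name splits into two words
echo $#         # prints 2

IFS=$'\n'       # split on newlines only (bash/zsh syntax)
set -- $name    # the name now survives in one piece
echo $#         # prints 1

unset IFS       # restore the default
```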
I'm late to the party, but there is one more solution that wasn't covered here: user-defined functions. Putting multiple instructions on one line is unwieldy, and can be hard to read/maintain. The for loop above avoids that, but there is the possibility of exceeding the command line length.
Here's another way (untested).
function processFiles {
ls -aldF "$@"
rm -rdf "$@"
}
export -f processFiles
find . -name "filename including space" -print0 \
| xargs -0 bash -c 'processFiles "$@"' dummyArg > log.txt
This is pretty straightforward except for the "dummyArg" which gave me plenty of grief. When running bash in this way, the arguments are read into
"$0" "$1" "$2" ....
instead of the expected
"$1" "$2" "$3" ....
Since processFiles expects its first argument in "$1", we have to insert a dummy value into "$0".
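The numbering is easy to check directly:

```shell
# With bash -c, the first argument after the command string lands in $0:
bash -c 'echo "0=$0 1=$1 2=$2"' apple banana cherry
# prints: 0=apple 1=banana 2=cherry
```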
Footnotes:
I am using some elements of bash syntax (e.g. "export -f"), but I believe this will adapt to other shells.
The first time I tried this, I didn't add a dummy argument. Instead I added "$0" to the argument list inside my function (e.g. ls -aldF "$0" "$@"). Bad idea.
Aside from stylistic issues, it breaks when the "find" command returns nothing: in that case, $0 is set to "bash". Using the dummy argument instead avoids all of this.
Another solution:
find . -name "filename including space" -print0 \
| xargs -0 -I FOUND echo "$(ls -aldF FOUND > log.txt ; rm -rdf FOUND)"
I'm trying to rename all files in the current directory so that upper-case names are converted to lower case. I'm trying to do it like this:
ls -1|gawk '{print "`mv "$0" "tolower($0)"`"}'|xargs -i -t eval {}
I have two files in the directory, Y and YY
-t added for debugging, and output is:
eval `mv Y y`
xargs: eval: No such file or directory
if I execute the eval on its own, it works and moves Y to y.
I know there are other ways to achieve this, but I'd like to get this working if I can!
Cheers
eval is a shell builtin command, not a standalone executable. Thus, xargs cannot run it directly. You probably want:
ls -1 | gawk '{print "`mv "$0" "tolower($0)"`"}' | xargs -i -t sh -c "{}"
Although you're looking at an xargs solution, another method to perform the same thing can be done with tr (assuming sh/bash/ksh syntax):
for i in *; do mv $i `echo $i | tr '[A-Z]' '[a-z]'`; done
If your files are created by creative users, you will see files like:
My brother's 12" records
The solutions so far do not work on files like that. If you have GNU Parallel installed, this will work (even on files with creative names):
ls | parallel 'mv {} "$(echo {} | tr "[:upper:]" "[:lower:]")"'
Watch the intro video to learn more: http://www.youtube.com/watch?v=OpaiGYxkSuQ
You can use eval with xargs like the one below.
Note: I only tested this in bash shell
ls -1| gawk '{print "mv "$0" /tmp/"toupper($0)""}'| xargs -I {} sh -c "eval {}"
or
ls -1| gawk '{print "mv "$0" /tmp/"toupper($0)""}'| xargs -I random_var_name sh -c "eval random_var_name"
I generally use this approach when I want to avoid a one-liner for loop.
e.g.
for file in $(find /some/path | grep "pattern");do somecmd $file; done
The same can be written like below
find /some/path | grep "pattern"| xargs -I {} sh -c "somecmd {}"