How can I make xargs execute two commands? - shell

I tried this:
s3cmd ls s3://somebucket/data/ | awk '{print $4}' | \
  xargs -I %s s3cmd -v -c s3.cfg cp %s 's3://anotherbucket/data/' && \
  s3cmd -c s3.cfg rm %s -v
It does not work, of course, because the second command (s3cmd rm) is not treated as part of the xargs argument...
How can I do it?
The background is that s3cmd's move operation in my case appears not to delete the source file, so I wanted to replace it with a copy followed by a delete, which does appear to work.

I ended up just invoking sh (took that hint from the xargs manpage) and working from there, like this:
s3cmd ls s3://somebucket/data/ | awk '{print $4}' | xargs -n 1 sh -c \
  's3cmd -v -c s3.cfg cp "$0" s3://anotherbucket/data/; s3cmd -c s3.cfg rm "$0" -v'
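The pattern above can be seen in isolation with echo standing in for the two s3cmd calls (hypothetical inputs; the extra trailing "sh" argument fills $0 so the item lands in "$1", which is the safer convention):

```shell
# Each input line becomes "$1" inside the quoted script; both commands
# run once per item, and the item stays a single word even with spaces.
printf '%s\n' one two | xargs -n 1 sh -c 'echo "copy $1"; echo "remove $1"' sh
```

This prints a copy line and a remove line for each input, confirming that both commands run per item rather than once at the end.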

Related

Reporting with cut and grep

I'm trying to create a script that takes an extension and reports, in two columns, each user and the number of files with that extension that the user owns.
The results must be printed to report.txt
Here is my code:
#!/bin/bash
# Uncomment to create /tmp/test/ with 50 txt files
#mkdir /tmp/test/
#touch /tmp/test/arch{01..50}.txt
clear
usage(){
    echo "The script needs an extension to search"
    echo "$0 <extension>"
}
if [ $# -eq 0 ]; then
    usage
    exit 1
fi
folder="/tmp/test/"
touch report.txt
count=0
pushd $folder
for file in $(ls -l); do
    grep "*.$1" | cut -d " " -f3 >> report.txt
done
popd
The program just runs endlessly, and I'm not even counting the files for each user.
How can I solve this using only grep and cut?
With GNU stat:
stat -c '%U' *."$1" | sort | uniq -c | awk '{print $2,"\t",$1}' > report.txt
As pointed out by mklement0, under BSD/OSX you must use the -f option with stat:
stat -f '%Su' *."$1" | sort | uniq -c | awk '{print $2,"\t",$1}' > report.txt
Edit:
To process many files and avoid the argument-count limit, you'd better pipe a printf to xargs and stat (thanks again mklement0):
printf '%s\0' *."$1" | xargs -0 stat -c '%U' | sort | uniq -c | awk '{print $2,"\t",$1}'
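The counting core of both variants is the sort | uniq -c stage. Here is a minimal sketch with hypothetical owner names standing in for real stat output:

```shell
# sort groups identical names, uniq -c prefixes each with its count,
# and awk swaps the columns into "user<TAB>count" order.
printf '%s\n' alice bob alice alice bob |
  sort | uniq -c | awk '{print $2 "\t" $1}'
```

With the sample input this yields one line per user, e.g. alice with 3 and bob with 2.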
You don't need a loop for this (unless you later need to loop over several folders), and changing the working directory in a script is rarely necessary. Also, parsing ls output is generally discouraged.
Here's a version that replaces the loop, and uses du:
ext="$1"
printf "Folder '%s':\t" "$folder" >>report.txt
du -hc "$folder"/*."$ext" | sed -n '$p' >>report.txt
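The sed -n '$p' trick in the line above keeps only du's last line, which with -c is the grand total. A small self-contained sketch (scratch files in a temporary directory, names hypothetical):

```shell
# Create two throwaway files, then extract the label of du's final line.
tmp=$(mktemp -d)
printf 'x' > "$tmp/a.txt"
printf 'y' > "$tmp/b.txt"
du -hc "$tmp"/*.txt | sed -n '$p' | awk '{print $2}'
rm -rf "$tmp"
```

With GNU du the last line is the cumulative total, so the printed label is "total".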

Bash: moving a group of files of a certain size with grep, awk and xargs

At work, I need to upload images to a website. They cannot be larger than 300 KB. In order to group the images that are ready to be uploaded, I devised the following line in Bash:
du -h * | grep "[0-2]..K" | awk '{print $2}' | xargs mv Ready/
This did not work, however, because the shell returned the following:
usage: mv [-f | -i | -n] [-v] source target
mv [-f | -i | -n] [-v] source ... directory
Finally, I resorted to a for-loop to accomplish the same:
for file in $(du -h * | grep "[0-2]..K" | awk '{print $2}')
do
mv -v ${file} Ready/
done
Can somebody explain why the first line doesn't work? It is probably something very simple I'm missing, but I can't seem to find it.
I'm on Mac OS X 10.7, Bash version 4.3.
I would use the find command to get all files smaller than a certain size; it makes the code a lot cleaner and easier to read:
find . -size -300k -name '*.png' -exec mv {} Ready/ \;
The reason your first command fails is that mv's source arguments do not come at the end of the statement, so you have to reference the piped-in values explicitly with a placeholder. This should work:
du -h * | grep "[0-2]..K" | awk '{print $2}' | xargs -I {} mv {} Ready/
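The -I behavior can be checked as a dry run, with echo printing the mv commands instead of executing them (file names are hypothetical):

```shell
# -I {} substitutes each input line where {} appears, which lets the
# destination directory remain the final argument of the command.
printf '%s\n' a.jpg b.jpg | xargs -I {} echo mv {} Ready/
```

Each input line produces one mv invocation with the file before Ready/, which is exactly the ordering plain xargs cannot produce.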

Don't call xargs if the previous command doesn't return anything [duplicate]

This question already has answers here:
How to ignore xargs commands if stdin input is empty?
(7 answers)
Closed 6 years ago.
I have the following command that checks whether any new files were added and automatically calls svn add on all of them:
svn status | grep -v "^.[ \t]*\..*" | grep "^?" | awk '{print $2}' | xargs svn add
But when there are no files, svn add results in a warning.
How do I stop xargs from being called when the previous command doesn't produce any output? The solution needs to work with both the GNU and BSD (Mac OS X) versions of xargs.
If you're running the GNU version, use xargs -r:
--no-run-if-empty
-r
If the standard input does not contain any nonblanks, do not run the command.
Normally, the command is run once even if there is no input. This option
is a GNU extension.
http://linux.die.net/man/1/xargs
If you're using bash, another way is to store the output in an array and run svn only if the array is non-empty:
readarray -t OUTPUT < <(exec svn status | grep -v "^.[ \t]*\..*" | grep "^?" | awk '{print $2}')
[[ ${#OUTPUT[@]} -gt 0 ]] && svn add "${OUTPUT[@]}"
I ended up using this. Not very elegant but works.
svn status | grep -v "^.[ \t]*\..*" | grep "^?" && svn status | grep -v "^.[ \t]*\..*" | grep "^?" | awk '{print $2}' | xargs svn add
ls /empty_dir/ | xargs -n10 chown root # chown executed every 10 args
ls /empty_dir/ | xargs -L10 chown root # chown executed every 10 lines
ls /empty_dir/ | xargs -i cp {} {}.bak # every {} is replaced with the args from one input line
ls /empty_dir/ | xargs -I ARG cp ARG ARG.bak # like -i, with a user-specified placeholder
https://stackoverflow.com/a/19038748/1655942
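Assuming GNU xargs, the effect of -r can be demonstrated directly with empty input and echo standing in for svn add:

```shell
# Without -r, GNU xargs runs the command once even with no input;
# with -r it runs nothing, so this prints no output at all.
printf '' | xargs -r echo "svn add"
```

On BSD/OSX the no-run-on-empty behavior is the default, which is why -r only matters for the GNU version.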

Find a specific file then pipe to stdout/awk

I'm looking for a way to traverse directories recursively to find a specific file, then stop the search and pipe the filename path to an awk function or something similar. I asked a similar question earlier, but after testing on machines other than mine it turns out the locate command isn't going to work, since not everyone has it on their system.
Code that I used with locate:
dir="/path/to/destination/";
mkdir "$dir";
locate -l 1 target_file.txt | \
awk -v dir="$dir" '{printf "cp \"%s\" \"%s\"\n", $1, dir}' | \
sh
The find(1) command will do it. To only get one line, use head(1).
dir="/path/to/destination/";
mkdir "$dir";
find /path/location -name target_file.txt |
head -n 1 |
awk -v dir="$dir" '{printf "cp \"%s\" \"%s\"\n", $1, dir}' |
sh
If you know only one file exists, you can use
find ./ -name "target_file.txt" -exec cp -r {} "$dir" \;
And if you are not sure, limit the results to one with head and use xargs:
find ./ -name "target_file.txt" | head -1 | xargs -I {} cp -r {} "$dir"/
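The find | head | xargs chain can be exercised end to end in a scratch directory (all paths hypothetical):

```shell
# Create a nested source tree plus a destination, then copy only the
# first match that find reports.
tmp=$(mktemp -d)
mkdir -p "$tmp/src/sub" "$tmp/dest"
touch "$tmp/src/sub/target_file.txt"
find "$tmp/src" -name "target_file.txt" | head -n 1 |
  xargs -I {} cp {} "$tmp/dest/"
ls "$tmp/dest"
rm -rf "$tmp"
```

head -n 1 closes the pipe after the first line, so find stops early instead of scanning the whole tree.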

xargs to execute a string - what am I doing wrong?

I'm trying to rename all files in current directory such that upper case name is converted to lower. I'm trying to do it like this:
ls -1|gawk '{print "`mv "$0" "tolower($0)"`"}'|xargs -i -t eval {}
I have two files in the directory, Y and YY
-t added for debugging, and output is:
eval `mv Y y`
xargs: eval: No such file or directory
if I execute the eval on its own, it works and moves Y to y.
I know there are other ways to achieve this, but I'd like to get this working if I can!
Cheers
eval is a shell builtin command, not a standalone executable, so xargs cannot run it directly. You probably want to hand the generated command to sh instead (the backticks become unnecessary once sh executes the string itself):
ls -1 | gawk '{print "mv "$0" "tolower($0)}' | xargs -i -t sh -c "{}"
Although you're looking at an xargs solution, another method to perform the same thing can be done with tr (assuming sh/bash/ksh syntax):
for i in *; do mv $i `echo $i | tr '[A-Z]' '[a-z]'`; done
If your files are created by creative users, you will see files like:
My brother's 12" records
The solutions so far do not work on that kind of files. If you have GNU Parallel installed this will work (even on the files with creative names):
ls | parallel 'mv {} "$(echo {} | tr "[:upper:]" "[:lower:]")"'
Watch the intro video to learn more: http://www.youtube.com/watch?v=OpaiGYxkSuQ
You can use eval with xargs like the one below.
Note: I only tested this in bash shell
ls -1| gawk '{print "mv "$0" /tmp/"toupper($0)""}'| xargs -I {} sh -c "eval {}"
or
ls -1| gawk '{print "mv "$0" /tmp/"toupper($0)""}'| xargs -I random_var_name sh -c "eval random_var_name"
I generally use this approach when I want to avoid a one-liner for loop.
e.g.
for file in $(find /some/path | grep "pattern"); do somecmd "$file"; done
The same can be written as below:
find /some/path | grep "pattern" | xargs -I {} sh -c "somecmd {}"
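The lowercase-rename idea behind all of these answers can be verified as a dry run, printing the mv commands rather than executing them (sample names are hypothetical):

```shell
# Generate each target name with tr and echo the resulting command.
for f in README.TXT Photo.JPG; do
  echo mv "$f" "$(echo "$f" | tr '[:upper:]' '[:lower:]')"
done
```

Using the [:upper:]/[:lower:] classes rather than literal A-Z ranges keeps the translation correct in non-C locales.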