using xargs in an alias to remove svn unversioned files interactively? - bash

I'm trying to create a bash alias to remove, interactively, all svn unversioned files. I've gotten as far as
svn st | grep '^\?'| sed 's/^? //'
to get a list of such files. Piping into xargs -p rm just gives me a single prompt for all the files, e.g.
rm fileA fileB fileC fileD ?...
whereas I want to confirm each file individually. On the command line, I can do
rm -i $(svn st | grep '^\?'| sed 's/^? //')
to get the desired behavior, but it doesn't work when I stick it in an alias or function.
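One way to get a per-file prompt (a sketch, untested against a real repository) is to use a shell function rather than an alias, combining xargs -p, which prompts before each invocation, with -n 1, which limits each invocation to a single file:

```shell
# Sketch: a function (an alias won't reliably wrap a pipeline),
# reusing the question's extraction pipeline. -p makes xargs ask
# before each command it runs; -n 1 gives each command one file,
# so you confirm files one at a time.
svnrm_unversioned() {
    svn st | grep '^\?' | sed 's/^? //' | xargs -p -n 1 rm --
}
```

The `--` guards against file names that begin with a dash; the function name is arbitrary.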

Related

How to delete all files not in a set

I have a plain text file with a list of file names. For example,
A.doc
E.doc
F.pdf
I would like to delete all files in the current directory except for those.
Can this be done in bash?
Let's say the list of files not to delete is goodfiles.txt. Then:
ls | grep -vx -f goodfiles.txt
gives you the list of "other" files, the ones not in goodfiles.txt. Once you confirm those are the files you want to delete, then:
ls | grep -vx -f goodfiles.txt | xargs -d '\n' rm
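One caveat worth a dry run first: goodfiles.txt itself lives in the directory and is not listed in itself, so it shows up in the "delete" list too. A quick check with hypothetical file names:

```shell
# Dry run in a scratch directory: see what would be deleted
# before piping anything into rm.
d=$(mktemp -d) && cd "$d"
printf '%s\n' A.doc E.doc > goodfiles.txt
touch A.doc E.doc junk.tmp

ls | grep -vx -f goodfiles.txt
# prints goodfiles.txt and junk.tmp: the keep-list itself is
# "not in the set", so add goodfiles.txt to the list to spare it.
```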

I want to delete files from same directory which are not in the result of grep command

I want to delete files from the same directory that are not in the result of a grep command.
I found
grep -L <WORD> <filename> | xargs rm
which lists the files that do not contain the provided word. But I want to delete the files which are not currently being used.
I am able to find the files currently in use with the lsof command.
grep -L already gives you the inverted selection at the file level: it lists the files that contain no match at all. So, to delete the files that do not contain the word:
grep -L <WORD> <filename> | xargs rm
Note that adding -v here would change the meaning: -v inverts matching per line, not per file, so grep -Lv lists files in which every line matches. To delete the files that do contain the word, use the lowercase -l instead.
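A quick way to convince yourself of the -l vs -L behaviour, as a throwaway demo with made-up file names:

```shell
# grep -l lists files WITH a match; grep -L lists files WITHOUT one.
d=$(mktemp -d) && cd "$d"
printf 'keep me\n' > a.txt
printf 'something else\n' > b.txt

grep -l keep -- *.txt    # prints a.txt
grep -L keep -- *.txt    # prints b.txt

# Deleting the files that do NOT contain the word
# (-Z makes grep NUL-terminate names, matching xargs -0):
grep -LZ keep -- *.txt | xargs -0 rm --
ls                       # only a.txt remains
```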

Sed with Xargs cannot open passed file (Cygwin)

Trying to use the beauty of Sed so I don't have to manually update a few hundred files. I'll note my employer only allows use of Win8 (joy), so I use Cygwin all day until I can use my Linux boxes at home.
The following works on a Linux (bash) command line, but not Cygwin
> grep -lrZ "/somefile.js" . | xargs -0 -l sed -i -e 's|/somefile.js|/newLib.js|g'
sed: can't read ./testTarget.jsp: No such file or directory
# works
> sed -i 's|/somefile.js|/newLib.js|g' ./testTarget.jsp
So the command by itself works, but not passed through Xargs. And, before you say to use Perl instead of Sed, the Perl equivalent throws the very same error
> grep -lrZ "/somefile.js" . | xargs -0 perl -i -pe 's|/somefile.js|/newLib.js|g'
Can't open ./testTarget.jsp
: No such file or directory.
Use the xargs -n option to split up the arguments and force separate calls to sed.
On Windows using GnuWin tools (not Cygwin) I found that I need to split up the input to sed. By default xargs will pass ALL of the files from grep to one call to sed.
Let's say you have 4 files that match your grep call, the sed command will run through xargs like this:
sed -i -e 's|/somefile.js|/newLib.js|g' ./file1 ./file2 ./subdir/file3 ./subdir/file4
If the number of files is too large, sed will give you this error.
Use the -n option to have xargs call sed repeatedly until it exhausts all of the arguments.
grep -lrZ "/somefile.js" . | xargs -0 -l -n 2 sed -i -e 's|/somefile.js|/newLib.js|g'
In my small example using -n 2 will internally do this:
sed -i -e 's|/somefile.js|/newLib.js|g' ./file1 ./file2
sed -i -e 's|/somefile.js|/newLib.js|g' ./subdir/file3 ./subdir/file4
I had a large set of files and directories (around 3000 files), and using xargs -n 5 worked great.
When I tried -n 10 I got errors. Using xargs --verbose I could see some of the command-line calls were getting cut off at around 500 characters, so you may need to make -n smaller depending on the path length of the files you are working with.
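The splitting behaviour of -n is easy to see with echo standing in for sed:

```shell
# xargs -n 2 runs the command once per two arguments, so four file
# names become two separate invocations.
printf '%s\n' ./file1 ./file2 ./subdir/file3 ./subdir/file4 | xargs -n 2 echo
# echo is invoked twice:
#   ./file1 ./file2
#   ./subdir/file3 ./subdir/file4
```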

Escaping piped output between commands

I sometimes use the command below to delete changed files when using hg.
hg status -n | xargs rm
I have come across a problem: if the output of
hg status -n
contains any file paths with spaces, the file will not be found. Usually I would quote or escape spaces in file names, but I'm not sure how to do this with piped output. Any help would be great, thanks :)
Tell both commands to use NUL as the delimiter:
hg status -n0 | xargs -0 rm
Also be careful: plain hg status -n will print even files Mercurial doesn't know about (the ? entries).
Maybe you want this instead?
hg status -mn0 | xargs -0 rm
Also, don't forget about hg revert or hg purge. Maybe they do what you want, e.g.
hg revert --all --no-backup
or
enable the purge extension in your .hgrc:
[extensions]
hgext.purge=
and then run in the shell:
hg purge
I don't have hg installed. So I will do it with ls:
$ touch 'file A' 'file B'
$ ls -1
file A
file B
$ ls | xargs rm
rm: cannot remove `file': No such file or directory
rm: cannot remove `A': No such file or directory
rm: cannot remove `file': No such file or directory
rm: cannot remove `B': No such file or directory
$ ls | tr '\n' '\0' | xargs -0 rm
$ ls
Let xargs handle that with the -I option:
hg status -n | xargs -I FileName rm FileName
-I increases safety but reduces efficiency, as only one filename at a time is passed to rm.
An example:
$ printf "%s\n" one "2 two" "three 3 3" | xargs printf "%s\n"
one
2
two
three
3
3
$ printf "%s\n" one "2 two" "three 3 3" | xargs -I X printf "%s\n" X
one
2 two
three 3 3
Besides -0, newer xargs has the -d option, which can help with such things:
<command returning \n-separated paths> | xargs -d \\n rm -v
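To see that -d '\n' keeps embedded spaces intact, compare the two splitting modes with printf standing in for rm:

```shell
# Default splitting breaks "file A" into two arguments...
printf 'file A\nfile B\n' | xargs printf '[%s]\n'
# [file]
# [A]
# [file]
# [B]

# ...while -d '\n' splits on newlines only, so spaces survive:
printf 'file A\nfile B\n' | xargs -d '\n' printf '[%s]\n'
# [file A]
# [file B]
```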

Best way to do a find/replace in several files?

What's the best way to do this? I'm no command-line warrior, but I was thinking there's possibly a way of using grep and cat.
I just want to replace a string that occurs in a folder and its sub-folders. I'm running Ubuntu, if that matters.
I'll throw in another example for folks using ag, The Silver Searcher, to do find/replace operations on multiple files.
Complete example:
ag -l "search string" | xargs sed -i '' -e 's/from/to/g'
If we break this down, what we get is:
# returns a list of files containing matching string
ag -l "search string"
Next, we have:
# consume the list of piped files and prepare to run the following
# command for each file, delimited by newline
xargs
Finally, the string replacement command:
# -i '' edits files in place; the '' means do not create a backup
# (this is BSD/macOS sed syntax; GNU sed takes -i with no argument)
# -e 's/from/to/g' specifies the command to run: a global search and replace
sed -i '' -e 's/from/to/g'
find . -type f -print0 | xargs -0 -n 1 sed -i -e 's/from/to/g'
The first part of that is a find command to find the files you want to change. You may need to modify that appropriately. The xargs command takes every file the find found and applies the sed command to it. The sed command takes every instance of from and replaces it with to. That's a standard regular expression, so modify it as you need.
If you are using svn, beware: your .svn directories will be searched and replaced as well. You have to exclude those, e.g. like this:
find . ! -regex ".*[/]\.svn[/]?.*" -type f -print0 | xargs -0 -n 1 sed -i -e 's/from/to/g'
or
find . -name .svn -prune -o -type f -print0 | xargs -0 -n 1 sed -i -e 's/from/to/g'
As Paul said, you want to first find the files you want to edit and then edit them. An alternative to using find is to use GNU grep (the default on Ubuntu), e.g.:
grep -rlZ from . | xargs -0 -n 1 sed -i -e 's/from/to/g'
Note the -Z, which makes grep NUL-terminate the file names; without it, the newline-separated list would not match what xargs -0 expects.
You can also use ack-grep (sudo apt-get install ack-grep or visit http://petdance.com/ack/) if you know you only want a certain type of file and want to ignore version-control directories. E.g., if you only want text files,
ack -l --print0 --text from | xargs -0 -n 1 sed -i -e 's/from/to/g'
# `from` here is an arbitrary commonly occurring keyword
An alternative to using sed is to use perl which can process multiple files per command, e.g.,
grep -r -l from . | xargs perl -pi.bak -e 's/from/to/g'
Here, perl is told to edit in place, making a .bak file first.
You can combine any of the left-hand sides of the pipe with the right-hand sides, depending on your preference.
An alternative to sed is using rpl (e.g. available from http://rpl.sourceforge.net/ or your GNU/Linux distribution), like rpl --recursive --verbose --whole-words 'F' 'A' grades/
For convenience, I took Ulysse's answer (after correcting the undesirable error printing) and turned it into a .zshrc / .bashrc function:
function find-and-replace() {
ag -l "$1" | xargs sed -i -e s/"$1"/"$2"/g
}
Usage: find-and-replace Foo Bar
The typical (find|grep|ack|ag|rg)-xargs-sed combination has a few problems:
Difficult to remember and get correct. E.g., forgetting the xargs -r option will run the command even when no files are found, potentially causing problems.
Retrieving the file list and performing the actual replacement use different CLI tools, which can have different search behaviour.
These problems were big enough, for an operation as invasive and dangerous as recursive search-and-replace, to start the development of a dedicated tool: mo.
Early tests seem to indicate that its performance sits between ag and rg, and that it solves the following problems I encountered with them:
A single invocation can filter on filename and content. The following command searches for the word bug in all source files that have a v1 indication:
mo -f 'src/.*v1.*' -p bug -w
Once the search results are OK, actual replacement for bug with fix can be added:
mo -f 'src/.*v1.*' -p bug -w -r fix
# no-op helper bodies (an empty {} is a syntax error in bash, so use :)
comment() { :; }
doc() { :; }
function agr {
doc 'usage: from=sth to=another agr [ag-args]'
comment -l --files-with-matches
ag -0 -l "$from" "$@" | pre-files "$from" "$to"
}
pre-files() {
doc 'stdin should be null-separated list of files that need replacement; $1 the string to replace, $2 the replacement.'
comment '-i backs up original input files with the supplied extension (leave empty for no backup; needed for in-place replacement.)(do not put whitespace between -i and its arg.)'
comment '-r, --no-run-if-empty
If the standard input does not contain any nonblanks,
do not run the command. Normally, the command is run
once even if there is no input. This option is a GNU
extension.'
AGR_FROM="$1" AGR_TO="$2" xargs -r0 perl -pi.pbak -e 's/$ENV{AGR_FROM}/$ENV{AGR_TO}/g'
}
You can use it like this:
from=str1 to=sth agr path1 path2 ...
Supply no paths to make it use the current directory.
Note that ag, xargs, and perl need to be installed and on PATH.
