I was trying to rename all files using find, but after I ran this...
find . -name '*tablet*' -exec sh -c "new=$(echo {} | sed 's/tablet/mobile/') && mv {} $new" \;
I found that my files were gone. I changed it to echo the value of $new and found that it always kept the name of the first file, so it basically renamed all the files to the same name:
$ find . -name '*tablet*' -exec sh -c "new=$(echo {} | sed 's/tablet/mobile/') && echo $new" \;
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
I also tried changing it to export new=..., with the same result.
Why doesn't the value of new change?
The problem, I believe, is that the command substitution is expanded by bash once, and then find reuses that result in every invocation. I could be wrong about the reason.
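You can see this by echoing the double-quoted string yourself (a quick demo, assuming $new is unset in the interactive shell):
$ echo "new=$(echo {} | sed 's/tablet/mobile/') && mv {} $new"
new={} && mv {}
The $(...) and $new are already gone before find ever sees the argument; find only substitutes the file name for the literal {} that is left, while $new keeps whatever value it had in your calling shell.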
When I've run into similar situations before, I've written out a small shell script, e.g.
#! /bin/bash
old="$1"
new="${1/tablet/mobile}"
if [[ "${old}" != "${new}" ]]; then
    mv "${old}" "${new}"
fi
That takes care of renaming the file; then I can call the script from the find command:
find . -name "*tablet*" -exec /path/to/script '{}' \;
That makes things much simpler to sort out.
EDIT:
HAHA, after some messing around with the quoting, you can sort this out by changing the double quotes around the command to single quotes. As written, the $() is expanded by the calling shell; if done as below, the command substitution is performed by the shell invoked by -exec.
find . -name "*tablet*" -exec sh -c 'new=$( echo {} | sed "s/tablet/mobile/" ) && mv {} $new' \;
So the issue is about when the command substitution is expanded; by putting it in single quotes we force the expansion to happen in each invocation of sh.
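To double-check before touching anything, you can swap the mv for an echo (a hypothetical dry run on one of the files from above, still relying on find substituting {} inside the string):
$ find . -name '*tablet*' -exec sh -c 'new=$( echo {} | sed "s/tablet/mobile/" ) && echo "{} -> $new"' \;
./_prev_page.tablet.erb -> ./_prev_page.mobile.erb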
Related
I have some wav files. For each of those files I would like to create a new text file with the same name (obviously with the wav extension being replaced with txt).
I first tried this:
find . -name *.wav -exec 'touch $(echo '{}" | sed -r 's/[^.]+\$/txt/')" \;
which printed:
< touch $(echo {} | sed -r 's/[^.]+$/txt/') ... ./The_stranglers-Golden_brown.wav > ?
Then, after I hit the y key, find complained with:
find: ‘touch $(echo ./music.wav | sed -r 's/[^.]+$/txt/')’: No such file or directory
I figured out I was using a pipe and actually needed a shell. I then ran:
find . -name *.wav -exec sh -c 'touch $(echo "'{}"\" | sed -r 's/[^.]+\$/txt/')" \;
Which did the job.
Actually, I do not really get what is being done internally, but I guess a shell is spawned for every file, right? I fear this is costly in terms of memory.
Then, what if I need to run this command on a large bunch of files and directories!?
Now, is there a more efficient way to do this?
Basically I need to transform the current file's name and feed it to the touch command.
Thank you.
This find with shell parameter expansion will do the trick for you; you don't need sed at all.
find . -type f -name "*.wav" -exec sh -c 'x=$1; file="${x##*/}"; woe="${file%.*}"; touch "${woe}.txt"; ' sh {} \;
The idea:
x=$1 holds each entry returned by find
file="${x##*/}" strips the path, leaving only the last file name component (filename.ext)
woe="${file%.*}" stores the name without its extension, and the new file is created from that name with a .txt extension.
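For instance, with a hypothetical path ./music/song.wav the two expansions give:
$ x='./music/song.wav'
$ file="${x##*/}"; echo "$file"
song.wav
$ echo "${file%.*}"
song
Note that because the path is stripped, touch creates the .txt file in the directory you run the command from.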
EDIT
Parameter expansion frees us from the command substitution $() subprocess and from sed.
After looking at the sh man page, I figured out that the command above could be simplified.
Synopsis -c [-aCefnuvxIimqVEbp] [+aCefnuvxIimqVEbp] [-o option_name] [+o option_name] command_string [command_name [argument ...]]
...
-c Read commands from the command_string operand instead of from the standard input. Special parameter 0 will be set from the command_name operand and the positional parameters ($1, $2, etc.) set from the remaining argument operands.
We can directly pass the file path and skip the shell's name (which is useless inside this script anyway). So {} is passed as the command_name, and $0 can be expanded right away.
We end up with a cleaner command.
find . -name "*.wav" -exec sh -c 'touch "${0%.*}".txt' {} \;
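A quick way to check what the expansion produces for a hypothetical path (the directory part is preserved, so the .txt ends up next to its .wav):
$ sh -c 'echo "${0%.*}.txt"' ./music/song.wav
./music/song.txt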
So I have the following command which looks for a series of files and appends three lines to the end of everything found. Works as expected.
find /directory/ -name "file.php" -type f -exec sh -c "echo -e 'string1\nstring2\nstring3\n' >> {}" \;
What I need to do is also look for any instance of string1, string2, or string3 in the find output of file.php prior to echoing/appending the lines, so I don't append to a file unnecessarily. (This is being run from a crontab.)
Using | grep -v "string" after the find breaks the -exec command.
How would I go about accomplishing my goal?
Thanks in advance!
That -exec command isn't safe for file names with spaces.
You want something like this instead (assuming finding any of the strings is reason not to add any of the strings).
find /directory/ -name "file.php" -type f -exec sh -c "grep -qE 'string1|string2|string3' \"\$1\" || echo -e 'string1\nstring2\nstring3\n' >> \"\$1\"" - {} \;
To explain the safety issue:
find places {} in the command it runs as a single argument, but when you splat it into a double-quoted string you lose that benefit.
So instead of doing that you pass the file as an argument to the shell and then use the positional arguments in the shell command with quotes.
The command above simply chains the echo to a failure from grep to accomplish the goal.
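A quick demonstration of the difference, with a hypothetical file name containing a space (the exact ls error text will vary):
$ touch './my file.php'
$ find . -name 'my file.php' -exec sh -c "ls {}" \;
ls: cannot access './my': No such file or directory
ls: cannot access 'file.php': No such file or directory
$ find . -name 'my file.php' -exec sh -c 'ls "$1"' - {} \;
./my file.php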
Using this will perform a grep for each file found:
find . -name "$FILE" 2>null | xargs grep "search_string" >> $grep_out
But what if I want to execute custom code for each file found, rather than executing a grep? I would like to parse each file my own way, which is motivation for doing this. Could I write the code in the pipe? Should I execute a separate script using the pipe? Can I expand the pipe's scope to execute the next lines in the code before finding the next file?
Several ways to go about it, each with pros and cons. In addition to anubhava's inline method, you could use the -exec flag and a custom script. Example:
find . -name "$FILE" -exec /path/to/script.sh {} +
Then write /path/to/script.sh so that it accepts an arbitrary number of file arguments. Example:
#!/bin/bash
for file in "$@"; do
    echo "$file"
done
This approach affords reuse over the inline method, but is less efficient.
The {} + business on find passes multiple files to a single invocation of the script, rather than firing up the script multiple times -- saves a bit on process overhead. If you want the script to execute fresh for each single file, use {} \; instead (and just use "$1" in your script, no looping needed).
The "$#" bit keeps the file names quoted, important for the cases where your file names have white space in them.
find . -name "$FILE" 2>null -execdir /path/to/script.sh {} \;
This way, there's no need for a for loop in the script.
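For reference, -execdir starts the command from each matched file's own directory and passes just ./filename, so a hypothetical script.sh to see exactly what it receives could be:
#!/bin/bash
# print where we were started and the single argument we got (sketch)
echo "pwd=$PWD  arg=$1"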
You can use a while loop like this in bash:
while read -r f; do
    # process files here
    echo "$f"
done < <(find . -name "$FILE")
For using it with sh (which doesn't support process substitution):
find . -name "$FILE" | while read f; do
# process files here
echo "$f"
done
Read more about process substitution
You could use the -exec option (instead of xargs):
find . -name "$FILE" -exec ./test.sh {} \;
with a script test.sh that contains whatever you want. For example:
$ cat test.sh
#!/bin/bash
echo "name=$1"
grep "string" "$1"
$ cat test
string
string2
test
$ sudo find . -name "test" -exec ./test.sh {} \;
name=./test
string
string2
I have found several similar questions that have solutions, except they don't involve variables.
I have a particular pattern in a tree of files and directories - the pattern is the word TEMPLATE. I want a script file to rename all of the files and directories by replacing the word TEMPLATE with some other name that is contained in the variable ${newName}
If I knew that the value of ${newName} was say "Fred lives here", then the command
find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/Fred lives here}"' {} \;
will do the job
However, if my script is:
newName="Fred lives here"
find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/${newName}}"' {} \;
then the word TEMPLATE is replaced by null rather than "Fred lives here"
I need the "" around $0 because there are spaces in the path name, so I can't do something like:
find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/"${newName}"}"' {} \;
Can anyone help me get this script to work so that all files and directories that contain the word TEMPLATE have TEMPLATE replaced by whatever the value of ${newName} is?
e.g., if newName="A different name" and I had a directory of
/foo/bar/some TEMPLATE directory/with files then the directory would be renamed to
/foo/bar/some A different name directory/with files
and a file called some TEMPLATE file would be renamed to
some A different name file
You have two options.
1) The easiest solution is export newName. If you don't export the variable, then it's not available in subshells, and bash -c is a subshell. That's why you're getting TEMPLATE replaced by nothing.
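In terms of the question's own command, option 1 would look something like this (a sketch):
newName="Fred lives here"
export newName
find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/${newName}}"' {} \;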
2) Alternatively, you can try to construct a correctly quoted command line containing the replacement of $newName. If you knew that $newName were reasonably well-behaved (no double quotes or dollar signs, for example), then it's easy:
find . -name '*TEMPLATE*' \
-exec bash -c 'mv "$0" "${0/TEMPLATE/'"${newName}"'}"' {} \;
(Note: bash quoting is full of subtleties. The following has been edited several times, but I think it is now correct.)
But since you can't count on that, probably, you need to construct the command line by substituting both the filename and the substitution as command line parameters. But before we do that, let's fix the $0. You shouldn't be using $0 as a parameter. The correct syntax is:
bash -c '...$1...$1...' bash "argument"
Note the extra bash (many people prefer to use _); it's there to provide a sensible name for the subprocess.
So with that in mind:
find . -name '*TEMPLATE*' \
-exec bash -c 'mv "$1" "${1/TEMPLATE/$2}"' bash {} "$newName" \;
You can get around having to use quotes with IFS=$'\n', and since bash -c is a subshell, an export of the variable is required. This works:
#!/bin/bash
IFS=$'\n'
export newName="Fred lives here"
find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/${newName}}"' {} \;
If you do not mind two more lines and would like a script that is easier to read (no export required):
#!/bin/bash
IFS=$'\n'
newName="Fred lives here"
for file in $(find . -name '*TEMPLATE*'); do
    mv ${file} ${file/TEMPLATE/${newName}}
done
I want to do something on the lines of:
find -name *.mk | xargs "for i in $# do mv i i.aside end"
I realize that there might be more than one error in this, but I'd like to specifically know about this sort of inline command definition that I can pass to xargs.
This particular command isn't a great example, but you can use an "inline shell script" by giving sh -c 'here is the script' as the command. You can give it arguments, which will be "$@" inside the script, but there's a catch: the first argument after the script string goes to $0 inside the script, so you have to put an extra word there or you'll lose the first argument.
find . -name '*.mk' -exec sh -c 'for i; do mv "$i" "$i.aside"; done' fnord '{}' +
Another fun feature I took advantage of there is the fact that for loops iterate over the command-line arguments by default: for i; do ... is equivalent to for i in "$@"; do ...
I reiterate, the above command is convoluted and slow compared to the many other methods of doing the bulk mv. I'm posting it only to show some cool syntax.
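A quick check of that default-iteration behaviour, with hypothetical arguments:
$ sh -c 'for i; do echo "arg: $i"; done' sh one "two words" three
arg: one
arg: two words
arg: three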
There's no need for xargs here:
find -name '*.mk' -exec mv {} {}.aside \;
I'm not sure what the semantics of your for loop should be, but blindly coding it would give something like this:
find -name '*.mk' | while read file
do
    for i in $file; do mv "$i" "$i.aside"; done
done
If the body is used in multiple places, you can also use bash functions.
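A minimal sketch of that idea (assuming bash at both ends, since export -f is a bashism; rename_mk is a made-up name):
# define the body once, reuse it from find via an exported function
rename_mk() {
    mv "$1" "$1.aside"
}
export -f rename_mk
find . -name '*.mk' -exec bash -c 'for f; do rename_mk "$f"; done' bash {} +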
In some versions of find a path argument is needed: . for the current directory.
The star * must be escaped (or quoted).
You can try it with an echo first to be sure what the command will do:
find . -name '*.mk' -print0 | xargs -0i sh -c "echo mv '{}' '{}.aside'"
man xargs
/-i
man sh
/-c
I'm certain you could do this in a nice manner, but since you requested xargs:
find -name "*.tk" | xargs -I% mv % %.aside
Looping over filenames makes no sense, since you can only rename one at a time. Using inline ugliness is not necessary, but I could not make it work with the pipe and either eval or bash -c.