I have a script that I use to copy all of the files in one folder and move them to a new folder... the line I use to do it looks like this:
find "$SOURCEFOLDER" -type f | xargs -I {} ln {} "$ENDFOLDER/$TR_NEW_TORRENT_NAME/${basename}"
and it works perfectly
The thing is, I'd like to also use sed to remove any brackets from the basename of the new file, but I don't know how to incorporate the sed command:
sed -e 's/\[[^][]*\]//g' FILE
How would I go about doing this? Is there a better or simpler way to do all the things I want?
I believe the following will work for you:
find "$SOURCEFOLDER" -type f -exec bash -c "new=\$(basename \"\$1\" | sed -e 's/\[[^][]*\]//g'); ln \"\$1\" \"$ENDFOLDER/$TR_NEW_TORRENT_NAME/\$new\"" _ {} \;
The idea is to combine both commands in a single bash -c invocation: sed computes the cleaned-up name, and ln then uses it.
Another way is to use two -exec actions:
find "$SOURCEFOLDER" -type f -exec sed -e 's/\[[^][]*\]//g' {} \; -exec ln {} "$ENDFOLDER/$TR_NEW_TORRENT_NAME/${basename}" \;
I hope this will help.
You can use the -execdir option of find for this renaming and avoid xargs altogether:
find "$SOURCEFOLDER" -type f -execdir bash -c 'sed "s/\[[^][]*\]//g" <<< "$1";
ln "$1" "$ENDFOLDER/$TR_NEW_TORRENT_NAME/${basename}"' - '{}' \;
Related
I am using find to list files within multiple directories with a specific extension. I tried
find /path/to/encompassing/directory/ -d -name "*modified.tif" | xargs cp Destination_Directory/
but it didn't work. Using
find /path/ -d -name "*modified.tif" -type f -exec cp {} Destination_Directory \;
works but I don't understand why xargs isn't working.
If you write
find -name '*modified.tif' | xargs cp directory
then that's the same as writing
cp directory file1modified.tif file2modified.tif
(or whatever filenames matched), which is the wrong way around, because xargs by default appends arguments.
find -name '*modified.tif' -exec cp {} directory \;
is the same as
cp file1modified.tif directory
cp file2modified.tif directory
which is what you want.
You can achieve the same with xargs by using
xargs -I{} cp {} directory
to specify where in the command you want to use the argument, but that implies that only one file at a time will be copied (because -I implies -L1).
To avoid calling cp once per file, you can use the -t option for cp so the files to be copied can be appended to the end of the command (requires GNU cp):
find -name '*modified.tif' | xargs cp -t directory
which is equivalent to
cp -t directory file1modified.tif file2modified.tif
or better, taking care of blanks in filenames,
find -name '*modified.tif' -print0 | xargs -0 cp -t directory
Alternatively, without xargs:
find -name '*modified.tif' -exec cp -t directory {} +
where -exec {} + makes sure to invoke cp as few times as possible.
xargs passes each word from its standard input as the last argument to cp, not the first. As a result, you are trying to run the series of commands
cp Destination_Directory/ foo
cp Destination_Directory/ bar
# etc
If you are using GNU cp, you can fix this simply by using the -t option to specify that Destination_Directory is the target, rather than a source.
... | xargs cp -t Destination_Directory
# cp -t Destination_Directory foo
# cp -t Destination_Directory bar
# etc
You might be able to use the -I option in xargs to make it use the incoming file name as the first argument:
... | xargs -I '{}' cp '{}' Destination_Directory
However, this makes a lot of assumptions about the names find will produce: no leading or trailing whitespace, and no newlines in the file names. (For that matter, xargs without -I treats each whitespace-delimited word from its input as a separate argument for a call to cp.) In general, you should not try to use the output of find programmatically; stick with its -exec primary instead.
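If you do want to stay with xargs here, the NUL-delimited form sidesteps that word-splitting problem (a sketch of that variant):
# NUL-delimited handoff: names with spaces or newlines survive intact
find . -name '*modified.tif' -print0 | xargs -0 -I '{}' cp '{}' Destination_Directory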
Your code
find /path/ -d -name "*modified.tif" -type f -exec cp {} Destination_Directory \;
is the right way to go. No shell is involved, so each file name is passed as is as the first argument to cp.
I don't use xargs, but I think it should work like this:
cp `find /path/to/encompassing/directory/ -d -name "*modified.tif"` Destination_Directory/
No need for a pipe then.
I am trying to execute this command:
find ./ -type f -readable -writable -exec sed -i "s/A/B/g" {} \;
and I'm getting this error:
find: -readable: unknown primary or operator
I am trying to find and replace extensively: anything with the name 'A' becomes 'B'. This will apply to file names and to text inside of files.
What's the best way for me to execute this on Mac terminal?
The BSD version of find doesn't have the -readable or -writable primaries, but you can fake them using the test command:
find ./ -type f -exec test -r {} -a -w {} \; -exec sed -i "" "s/A/B/g" {} \;
There's a possible problem here in that test's syntax can be ambiguous with more than three arguments, so a compound test expression like this might be misparsed. I don't think this can be a problem with paths that begin with ./, but if you're worried about it you can use two separate tests:
find ./ -type f -exec test -r {} \; -exec test -w {} \; -exec sed -i "" "s/A/B/g" {} \;
Also, note that I added a null argument to sed after -i. This is another BSD-vs-GNU thing. The BSD version of sed requires an argument (the extension to use for a backup) to the -i option, so in order to avoid making a backup you have to explicitly supply a blank. The GNU version, on the other hand, would be confused by sed -i .bak because it requires the argument to -i be directly attached to it (sed -i.bak). Wheee.
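If you need a single command line that works under both the BSD and the GNU sed, one workaround is to always pass a backup suffix attached to -i and delete the backups afterwards (a sketch; note that it would also sweep up any pre-existing .bak files):
# Both seds accept an attached backup suffix such as -i.bak
find ./ -type f -exec test -r {} \; -exec test -w {} \; -exec sed -i.bak "s/A/B/g" {} \;
# remove the backup copies sed created
find ./ -type f -name '*.bak' -delete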
Here is my current code. My goal is to find every file in a given directory (recursively), replace "FIND" with "REPLACEWITH", and overwrite the files.
FIND='ALEX'
REPLACEWITH='<strong>ALEX</strong>'
DIRECTORY='/some/directory/'
find $DIRECTORY -type f -name "*.html" -print0 |
LANG=C xargs -0 sed -i "s|$FIND|$REPLACEWITH|g"
The error I am getting is:
sed: 1: "/some/directory ...": command a expects \ followed by text
As given in BashFAQ #21, you can use perl to perform search-and-replace operations with no potential for data being treated as code:
in="$FIND" out="$REPLACEWITH" find "$DIRECTORY" -type f -name '*.html' \
-exec perl -pi -e 's/\Q$ENV{"in"}/$ENV{"out"}/g' '{}' +
If you want to include only files matching the FIND string, find can be told to only pass files which grep flags on to perl:
in="$FIND" out="$REPLACEWITH" find "$DIRECTORY" -type f -name '*.html' \
-exec grep -F -q -e "$FIND" '{}' ';' \
-exec perl -pi -e 's/\Q$ENV{"in"}/$ENV{"out"}/g' '{}' +
Because grep is being used to evaluate individual files, it's necessary to use one grep call per file so its exit status can be evaluated on a per-file basis; thus, the use of the less efficient -exec ... {} ';' action. For perl, it's possible to put multiple files to process on one command, hence the use of -exec ... {} +.
Note that grep -F (fgrep) is line-oriented; if your FIND string contains multiple lines, then files containing any one of those lines will be passed to perl for replacements.
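For illustration (hypothetical values), this is the kind of input the environment-variable approach handles safely, because the strings never become part of the perl code:
# Hypothetical values full of regex/sed metacharacters -- still treated as literal text
FIND='1/2 [draft]'
REPLACEWITH='<em>1/2 [draft]</em>'
in="$FIND" out="$REPLACEWITH" find "$DIRECTORY" -type f -name '*.html' \
    -exec perl -pi -e 's/\Q$ENV{"in"}/$ENV{"out"}/g' '{}' +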
You can have find invoke sed directly, although I think all the modification times on your files will be affected (which might matter or not):
find "$DIRECTORY" -type f -name "*.html" -exec sed -i "s|$FIND|$REPLACEWITH|g" '{}' ';'
The following works fine when I type it exactly in the command line:
find /<some_path>/{epson,epson_laser,epson_inkjet} -iname "*.ppd"
-exec grep "\*ModelName\:" {} \; | sed 's/.*\"\(.*\)\"/\1/'
However, when I try to call the same command from a bash script, I get "find: missing argument to `-exec'".
I have also tried the following (in many variants):
eval find "$1" -iname "*.ppd" -exec 'bash -c grep "\*ModelName\:" "$1" | sed "s/.*\"\(.*\)\"/\1/" \;
as was mentioned in find-exec-echo-missing-argument-to-exec.
How can I get the first command to work not only in the terminal but also in a bash script?
P.S.: I've used eval only for expanding the string "/<some_path>/{epson,epson_laser,epson_inkjet}" into multiple paths. Does anyone know a better solution for this?
If you want to execute multiple commands over the output of find, just use the -exec option as many times as required:
find -exec command1 "{}" \; -exec command2 "{}" \;
You can also define the conditions under which each -exec runs:
find \( -exec command1 \; -false -o -exec command2 \; \)
In your case, you need something like this:
find /<some_path>/{epson,epson_laser,epson_inkjet} -iname "*.ppd" -exec grep "\*ModelName\:" "{}" \; -exec sed 's/.*\"\(.*\)\"/\1/' "{}" \;
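If you would rather keep the grep | sed pipeline intact, one option (a sketch of the same idea, running a child shell per file) is to hand each file to sh -c:
# run the whole pipeline in a child shell, one file at a time
find /<some_path>/{epson,epson_laser,epson_inkjet} -iname "*.ppd" \
    -exec sh -c 'grep "\*ModelName\:" "$1" | sed "s/.*\"\(.*\)\"/\1/"' _ {} \;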
I'm trying to construct a find command to process a bunch of files in a directory using two different executables. Unfortunately, -exec on find doesn't allow you to use a pipe, or even \|, because the shell interprets that character first.
Here is specifically what I'm trying to do (which doesn't work because pipe ends the find command):
find /path/to/jpgs -type f -exec jhead -v {} | grep 123 \; -print
Try this
find /path/to/jpgs -type f -exec sh -c 'jhead -v {} | grep 123' \; -print
Alternatively, you could embed your exec statement inside a shell script and then do:
find -exec some_script {} \;
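For example (illustrative only; the contents are made up for this thread's jhead/grep case), some_script could simply wrap the pipeline:
#!/bin/sh
# some_script: show jhead's verbose output for one file, filtered for "123"
jhead -v "$1" | grep 123
Make it executable (chmod +x some_script) and it can be used exactly as above.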
A slightly different approach would be to use xargs:
find /path/to/jpgs -type f -print0 | xargs -0 jhead -v | grep 123
which I have always found a bit easier to understand and to adapt (the -print0 and -0 arguments are necessary to cope with filenames containing blanks).
This might (not tested) be more efficient than using -exec, because it pipes the list of files to xargs, and xargs makes sure that the jhead command line does not get too long.
With -exec you can only run a single executable with some arguments, not arbitrary shell commands. To circumvent this, you can use sh -c '<shell command>'.
Do note that the use of -exec is quite inefficient. For each file that is found, the command has to be executed again. It would be more efficient if you can avoid this. (For example, by moving the grep outside the -exec or piping the results of find to xargs as suggested by Palmin.)
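To make the first suggestion concrete (a sketch): batching the files with -exec ... + keeps the number of jhead invocations low while grep filters the combined output:
find /path/to/jpgs -type f -exec jhead -v {} + | grep 123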
Using the find command for this type of task is maybe not the best alternative. I frequently use the following command to find files that contain the requested information:
for i in dist/*.jar; do echo ">> $i"; jar -tf "$i" | grep BeanException; done
As this outputs a list, would you not just do:
find /path/to/jpgs -type f -exec jhead -v {} \; | grep 123
or
find /path/to/jpgs -type f -print -exec jhead -v {} \; | grep 123
Put your grep on the results of the find -exec.
There is kind of another way you can do it but it is also pretty ghetto.
Using the shell option extquote you can do something similar to this in order to make find exec stuff and then pipe it to sh.
root#ifrit findtest # find -type f -exec echo ls $"|" cat \;|sh
filename
root#ifrit findtest # find -type f -exec echo ls $"|" cat $"|" xargs cat\;|sh
h
I just figured I'd add that because, at least the way I visualized it, it was closer to the OP's original question of using pipes within exec.