Piping find to find - bash

I want to pipe a find result to a new find. What I have is:
find . -iname "2010-06*" -maxdepth 1 -type d | xargs -0 find '{}' -iname "*.jpg"
Expected result: Second find receives a list of folders starting with 2010-06, second find returns a list of jpg's contained within those folders.
Actual result: "find: ./2010-06 New York\n: unknown option"
Oh darn. I have a feeling it concerns the format of the output that the second find receives as input, but my only idea was to suffix -print0 to first find, with no change whatsoever.
Any ideas?

You need two things: -print0, and more importantly -I{} on xargs; otherwise the {} doesn't do anything.
find . -iname "2010-06*" -maxdepth 1 -type d -print0 | xargs -0 -I{} find '{}' -iname '*.jpg'

Useless use of xargs.
find 2010-06* -iname "*.jpg"
At least GNU find accepts multiple paths to search in. -maxdepth 1 and -type d are implied here.
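To illustrate (the folder names below are hypothetical), the shell expands the glob before find runs, so find simply receives several starting points:
find "2010-06 New York" "2010-06 Paris" -iname "*.jpg"   # what find 2010-06* expands to here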

How about
find . -iwholename "./2010-06*/*.jpg"
etc?
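If this is GNU find, one note (my addition, not part of the answer above): -iwholename is documented as an alias for -ipath, so the same match can also be written as
find . -ipath "./2010-06*/*.jpg"   # GNU find treats -iwholename as an alias for -ipath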

Although you did say that you specifically want this find + pipe approach to work, it's inefficient to fork an extra find command. Since you are specifying -maxdepth 1, you are not traversing subdirectories, so just use a for loop with shell expansion.
for file in *2010-06*/*.jpg
do
echo "$file"
done
If you want to find all jpg files inside each 2010-06* folder recursively, there is still no need for multiple finds or xargs:
for directory in 2010-06*/
do
find "$directory" -iname "*.jpg" -type f
done
Or just
find 2010-06* -type f -iname "*.jpg"
Or even better, if you have bash 4 and above
shopt -s globstar
shopt -s nullglob
for file in 2010-06*/**/*.jpg
do
echo "$file"
done

Related

Find command output to echo without variable assignment, in one line

I'm trying to write one line of code that finds all .sh files in the current directory and its subdirectories, and prints them without the .sh extension (preferably without the path too).
I think I got the find command down. I tried using the output of
find . -type f -iname "*.sh" -print
as input for echo, and formatting it along these lines
echo "${find_output%.sh}"
However, I cannot get it to work in one line, without variable assignment.
I got inspiration from this answer on stackoverflow https://stackoverflow.com/a/18639136/15124805
to use this line:
echo "${$( find . -type f -iname "*.sh" -print)%.sh}"
But I get this error:
ash: ${$( find . -type f -iname "*.sh" -print)%.sh}: bad substitution
I also tried using xargs
find . -type f -iname "*.sh" -print |"${xargs%.sh}" echo
But I get a "command not found" error - probably I didn't use xargs correctly, but I'm not sure how I could improve this or whether it's the right way to go.
How can I make this work?
That's the classic useless use of echo. You simply want
find . -type f -iname "*.sh" -exec basename {} .sh \;
If you have GNU find, you can also do this with -printf.
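For illustration, a sketch of that -printf route; %f strips the leading path but keeps the .sh suffix, so on its own it only does half the job:
find . -type f -iname "*.sh" -printf '%f\n'   # GNU find only: prints just the basename of each match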
However, basename only matches .sh literally, so if you really expect extensions with different variants of capitalization, you need a different approach.
For the record, the syntax you tried to use for xargs would attempt to use the value of a variable named xargs. The correct syntax would be something like
find . -type f -iname "*.sh" -print |
xargs -n 1 sh -c 'echo "${1%.[Ss][Hh]}"' _
but that's obviously rather convoluted. In some more detail, you need sh because the parameter expansion you are trying to use is a feature of the shell, not of echo (or xargs, etc.).
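A tiny illustration of that point (the path is made up): the % trimming happens when the shell expands a variable or parameter it holds, which is why a shell has to be in the loop.
f=./scripts/deploy.sh   # hypothetical path
echo "${f%.sh}"         # prints ./scripts/deploy -- the shell did the trimming, not echo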
(You can slightly optimize by using a loop:
find . -type f -iname "*.sh" -print |
xargs sh -c 'for f; do
echo "${f%.[Ss][Hh]}"
done' _
but this is still not robust for all file names; see also https://mywiki.wooledge.org/BashFAQ/020 for probably more than you realized you needed to know about this topic. If you have GNU find and GNU xargs, you can use find ... -print0 | xargs -r0)
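Spelled out, that GNU variant of the loop above might look like this (the same caveats about unusual file names still apply):
find . -type f -iname "*.sh" -print0 |
xargs -r0 sh -c 'for f; do echo "${f%.[Ss][Hh]}"; done' _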

How to write wc's stdout into a file?

The command below shows how many characters are contained in every file in the current directory.
find -name '*.*' |xargs wc -c
I want to write the standard output into a file.
find -name '*.*' |xargs wc -c > /tmp/record.txt
It encounters an issue:
wc: .: Is a directory
How to write all the standard output into a file?
Why -name '*.*'? That will not match every file, and it will match directories too. You need -type f, and rather than piping the result to xargs, use -exec:
find . -type f -maxdepth 1 -exec wc -c {} + > /tmp/record.txt
-maxdepth 1 guarantees that the search won't dive into subdirectories.
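If you would rather keep the xargs pipeline from the question, a null-delimited sketch of the same idea (my spelling, not part of the answer above) would be:
find . -maxdepth 1 -type f -print0 | xargs -0 wc -c > /tmp/record.txt   # safe for names with spaces or other odd characters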
I think you maybe meant find |xargs wc -c?
find -name '*.*' also matches . (the current directory) itself, which is why wc complains.
Filter only files, if you want only files:
find -type f

How to randomly name file when using find exec in a bash script?

When it comes to quickly converting a bunch of files and randomly renaming them I use a pretty simple way to do so with a for loop:
for i in *; do convert [...] $i ../output/$RANDOM.jpg; done
Easy as that. The details of what imagemagick does aren't important here. It works as intended; it's just about how to handle the bash stuff.
Now in my current case the folder does not only contain photos, it also contains subfolders with other photos in them. The expected behavior is again that all photos are randomly renamed and that the output files are merged into a single folder.
Since I don't know a way to recursively loop with for, I use a find construct here.
find . \( -iname "*.jpg" -or -iname "*.png" \) -exec convert [...] {} ../output/$RANDOM.jpg \;
The problem is that $RANDOM only gets evaluated once, so it stays the same over the whole process and the images get overwritten again and again. So in fact the output folder contains only one image, the one that was processed last.
So the question is:
How do I get the $RANDOM variable to change with each new file?
Kind regards!
Throw it into a loop.
find . \( -iname "*.jpg" -or -iname "*.png" \) -type f -print0 |
while read -d '' -r f
do convert [...] "$f" ../output/$RANDOM.jpg # copied mostly from your find above
done
The -print0 and read -d '' are unnecessary if you never have embedded newlines in your filenames.
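For example, if the names are guaranteed newline-free, the plain-text form of the same loop is enough ([...] is still the placeholder for your convert options):
find . \( -iname "*.jpg" -o -iname "*.png" \) -type f |
while IFS= read -r f
do convert [...] "$f" ../output/$RANDOM.jpg
done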
Don't use find at all; just use the globstar option.
shopt -s globstar
for f in **/*.jpg **/*.png; do
convert [...] "$i" ../output/$RANDOM.jpg
done
I would go with a shell loop as detailed in the other answers, but it's still useful to know how to run arbitrary shell code like $RANDOM in a find -exec command. You do it by running a shell:
find . \( -iname "*.jpg" -or -iname "*.png" \) \
-exec bash -c 'convert [...] "$1" "../output/$RANDOM.jpg"' _ {} \;
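If spawning one bash per file bothers you, the same trick can batch files with {} + and loop inside the inner shell; $RANDOM is re-expanded on every loop iteration there, so each file still gets its own name (a sketch, with [...] again standing for your convert options):
find . \( -iname "*.jpg" -or -iname "*.png" \) -type f \
  -exec bash -c 'for f; do convert [...] "$f" "../output/$RANDOM.jpg"; done' _ {} +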

How do I use find command with pipe in bash?

The directory structure looks like
home
--dir1_foo
----subdirectory.....
--dir2_foo
--dir3_foo
--dir4_bar
--dir5_bar
I'm trying to use 'find' command to get directories containing specific strings first, (in this case 'foo'), then use 'find' command again to retrieve some directories matching conditions.
So, I first tried
#!/bin/bash
for dir in `find ./ -type d -name "*foo*" `;
do
for subdir in `find $dir -mindepth 2 -type d `;
do
[Do some jobs]
done
done
, and this script works fine.
Then I thought that using only one loop with pipe like below would also work, but this does not work
#!/bin/bash
for dir in `find ./ -type d -name "*foo*" | find -mindepth 2 -type d `;
do
[Do some jobs]
done
and actually this script works the same as
for dir in `find -mindepth 2 -type d`;
do
[Do some jobs]
done
, which means that the first find command is ignored.
What is the problem?
What your script is doing is not good practice and has a lot of potential pitfalls. See BashFAQ - Why you don't read lines with "for" to understand why.
You can use xargs with -0 to read null-delimited input and run the second find command without needing the for loop:
find ./ -type d -name "*foo*" -print0 | xargs -0 -I{.} find {.} -mindepth 2 -type d
The string following -I in xargs acts as a placeholder for the input received from the previous pipeline and passes it to the next command. The -print0 option is GNU specific; it is a safe way to handle file/directory names containing spaces or other shell meta-characters.
With the above command in place, if you want to act on the output of the second find, use process substitution with a while loop:
while IFS= read -r -d '' f; do
echo "$f"
# Your other actions can be done on "$f" here
done < <(find ./ -type d -name "*foo*" -print0 | xargs -0 -I{.} find {.} -mindepth 2 -type d -print0)
As for the reason why your pipeline of finds won't work: the second find never reads the previous find command's output. You need either xargs or -execdir, though the latter is not an option I would recommend.
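For completeness, a sketch of a plain -exec spelling (my variant, not from the answer above), which runs one inner find per matched directory and needs no xargs:
find ./ -type d -name "*foo*" -exec find {} -mindepth 2 -type d \;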

find option available to omit leading './' in result

I think this is probably a pretty n00ber question but I just gotsta ask it.
When I run:
$ find . -maxdepth 1 -type f \( -name "*.mp3" -o -name "*.ogg" \)
and get:
./01.Adagio - Allegro Vivace.mp3
./03.Allegro Vivace.mp3
./02.Adagio.mp3
./04.Allegro Ma Non Troppo.mp3
why does find prepend a ./ to the file name? I am using this in a script:
fList=()
while read -r -d $'\0'; do
fList+=("$REPLY")
done < <(find . -type f \( -name "*.mp3" -o -name "*.ogg" \) -print0)
fConv "$fList" "$dBaseN"
and I have to use a bit of a hacky sed fix at the beginning of a for loop in the function fConv, accessing the array elements, to remove the leading ./. Is there a find option that would simply omit the leading ./ in the first place?
The ./ at the beginning of the file name is the path; the "." means the current directory.
You can use "sed" to remove it.
find . -maxdepth 1 -type f \( -name "*.mp3" -o -name "*.ogg" \) | sed 's|./||'
I do not recommend doing this though: since find can search through multiple directories, how would you know whether a found file is located in the current directory?
If you ask it to search under /tmp, the results will be on the form /tmp/file:
$ find /tmp
/tmp
/tmp/.X0-lock
/tmp/.com.google.Chrome.cUkZfY
If you ask it to search under . (like you do), the results will be on the form ./file:
$ find .
.
./Documents
./.xmodmap
If you ask it to search through foo.mp3 and bar.ogg, the result will be on the form foo.mp3 and bar.ogg:
$ find *.mp3 *.ogg
click.ogg
slide.ogg
splat.ogg
However, this is just the default. With GNU and other modern finds, you can modify how to print the result. To always print just the last element:
find /foo -printf '%f\0'
If the result is /foo/bar/baz.mp3, this will result in baz.mp3.
To print the path relative to the argument under which it's found, you can use:
find /foo -printf '%P\0'
For /foo/bar/baz.mp3, this will show bar/baz.mp3.
However, you shouldn't be using find at all. This is a job for plain globs, as suggested by R Sahu.
shopt -s nullglob
files=(*.mp3 *.ogg)
echo "Converting ${files[*]}:"
fConv "${files[#]}"
find . -maxdepth 1 -type f \( -name "*.mp3" -o -name "*.ogg" \) -exec basename "{}" \;
Having said that, I think you can use a simpler approach:
for file in *.mp3 *.ogg
do
if [[ -f $file ]]; then
# Use the file
fi
done
If your -maxdepth is 1, you can simply use ls:
$ ls *.mp3 *.ogg
Of course, that will pick up any directory with a *.mp3 or *.ogg suffix, but you probably don't have such a directory anyway.
Another option is to munge your results:
$ find . -maxdepth 1 -type f \( -name "*.mp3" -o -name "*.ogg" \) | sed 's#^\./##'
This will remove all ./ prefixes, but not touch other file names. Note the ^ anchor in the substitution command.
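A quick throwaway check of what the anchor buys you (the file names are made up):
printf '%s\n' './01.Adagio.mp3' 'a./b.ogg' | sed 's#^\./##'
# 01.Adagio.mp3
# a./b.ogg        <- an interior ./ is left untouched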
