Find command output to echo without variable assignment, in one line - bash

I'm trying to write one line of code that finds all .sh files in the current directory and its subdirectories, and prints them without the .sh extension (preferably without the path, too).
I think I got the find command down. I tried using the output of
find . -type f -iname "*.sh" -print
as input for echo, and formatting it along these lines
echo "${find_output%.sh}"
However, I cannot get it to work in one line, without a variable assignment.
I got inspiration from this answer on stackoverflow https://stackoverflow.com/a/18639136/15124805
to use this line:
echo "${$( find . -type f -iname "*.sh" -print)%.sh}"
But I get this error:
ash: ${$( find . -type f -iname "*.sh" -print)%.sh}: bad substitution
I also tried using xargs
find . -type f -iname "*.sh" -print |"${xargs%.sh}" echo
But I get a "command not found error" -probably I didn't use xargs correctly, but I'm not sure how I could improve this or if it's the right way to go.
How can I make this work?

That's the classic useless use of echo. You simply want
find . -type f -iname "*.sh" -exec basename {} .sh \;
If you have GNU find, you can also do this with -printf.
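A minimal sketch of that, assuming GNU find (%f prints the name without the leading path, but keeps the .sh suffix, so it still has to be trimmed separately, e.g. with sed):
find . -type f -iname "*.sh" -printf '%f\n' | sed 's/\.[sS][hH]$//'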
However, basename only matches .sh literally, so if you really expect extensions with different variants of capitalization, you need a different approach.
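One such approach, as a sketch, is to let -exec hand the results to a shell and strip both the path and a case-insensitive suffix with parameter expansion:
find . -type f -iname "*.sh" -exec sh -c 'for f; do f=${f##*/}; echo "${f%.[Ss][Hh]}"; done' _ {} +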
For the record, the syntax you tried to use for xargs would attempt to use the value of a variable named xargs. The correct syntax would be something like
find . -type f -iname "*.sh" -print |
xargs -n 1 sh -c 'echo "${1%.[Ss][Hh]}"' _
but that's obviously rather convoluted. In some more detail, you need sh because the parameter expansion you are trying to use is a feature of the shell, not of echo (or xargs, etc.).
(You can slightly optimize by using a loop:
find . -type f -iname "*.sh" -print |
xargs sh -c 'for f; do
echo "${f%.[Ss][Hh]}"
done' _
but this is still not robust for all file names; see also https://mywiki.wooledge.org/BashFAQ/020 for probably more than you realized you needed to know about this topic. If you have GNU find and GNU xargs, you can use find ... -print0 | xargs -r0)
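Spelled out, that null-delimited variant might look like this (a sketch following the same loop as above):
find . -type f -iname "*.sh" -print0 |
xargs -r0 sh -c 'for f; do
echo "${f%.[Ss][Hh]}"
done' _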

Related

Bash script to return all elements given an extension, without using print flags

I want to create a shell script that searches inside all folders of the current directory and returns all files that satisfy some condition, but without using any print flag.
(Here the condition is to end with .py)
What I have done:
find . -name '*.py'| sed -n 's/\.py$//p'
The output:
./123
./test
./abc/dfe/test3
./testing
./test2
What I would like to achieve:
123
test
test3
testing
test2
Use -exec:
find . -name '*.py' -exec sh -c 'for f; do f=${f%.py}; echo "${f##*/}"; done' sh {} +
If GNU basename is an option, you can simplify this to
find . -name '*.py' -exec basename -s .py {} +
POSIX basename is a little more expensive, as you'll have to call it on every file individually:
find . -name '*.py' -exec basename {} .py \;
Using GNU grep instead of sed:
find . -name '*.py' | grep -oP '[^/]+(?=\.py$)'
If portability is not a concern, this is a very readable option:
find . -name '*.py' | xargs basename -a
This is also differentiated from chepner's answer in that it retains the .py file ending in the output.
I'm not familiar with the -exec flag, and I'm sure his one-liners can be customized to do the same, but I couldn't do so off the top of my head.
Chepner's version achieves the same with the small modification:
find . -name '*.py' -exec basename {} \;
if you want the literal output from find and didn't intend to drop the file endings when you used dummy names (123, test, etc.) in your question.
find shows entries relative to where you ask it to search, so you can simply replace the . with a *:
find * -name '*.py'| sed -n 's/\.py$//p'
(Be aware that this skips top level hidden directories)
This might work for you (GNU parallel):
find . -name '*.py*' 2>/dev/null | parallel echo "{/.}"

How do I use find command with pipe in bash?

The directory structure looks like
home
--dir1_foo
----subdirectory.....
--dir2_foo
--dir3_foo
--dir4_bar
--dir5_bar
I'm trying to use the 'find' command to first get directories whose names contain a specific string (in this case 'foo'), then use 'find' again to retrieve some directories matching further conditions.
So, I first tried
#!/bin/bash
for dir in `find ./ -type d -name "*foo*" `;
do
for subdir in `find $dir -mindepth 2 -type d `;
do
[Do some jobs]
done
done
This script works fine.
Then I thought that using only one loop with a pipe, like below, would also work, but it does not:
#!/bin/bash
for dir in `find ./ -type d -name "*foo*" | find -mindepth 2 -type d `;
do
[Do some jobs]
done
and actually this script works the same as
for dir in `find -mindepth 2 -type d`;
do
[Do some jobs]
done
which means that the first find command is ignored.
What is the problem?
What your script is doing is not a good practice and has a lot of potential pitfalls. See BashFAQ - Why you don't read lines with "for" to understand why.
You can use xargs with -0 to read null-delimited names and run another find command, without needing the for-loop:
find ./ -type d -name "*foo*" -print0 | xargs -0 -I{.} find {.} -mindepth 2 -type d
The string following -I in xargs acts as a placeholder for the input received from the previous pipeline and passes it to the next command. The -print0 option is GNU-specific and is a safe way to handle file or directory names containing spaces or other shell meta-characters.
With the above command in place, if you want to perform some action on the output of the second find, use a while loop with process substitution:
while IFS= read -r -d '' f; do
echo "$f"
# Your other actions can be done on "$f" here
done < <(find ./ -type d -name "*foo*" -print0 | xargs -0 -I{.} find {.} -mindepth 2 -type d -print0)
As for why your pipeline of find commands doesn't work: the second find does not read the first find's output at all. You need either xargs or -execdir, though the latter is not an option I would recommend.
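For illustration, a chained -exec variant (a sketch using plain -exec, rather than the -execdir mentioned above) would run the inner find once per matched directory:
find ./ -type d -name "*foo*" -exec find {} -mindepth 2 -type d \;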

Shell stop script if find command fails

Good day.
In a script of mine I have the following find command:
find -maxdepth 1 \! -type d -name "some_file_name_*" -name "*.txt" -name "*_${day_month}_*" -exec cp {} /FILES/directory1/directory2/directory3/ +
I want to know how to stop the script if the command doesn't find anything.
Use GNU xargs with the -r switch and a pipeline to ensure the output of find is passed to cp only if it is non-empty.
find -maxdepth 1 \! -type d -name "some_file_name_*" -name "*.txt" -name "*_${day_month}_*" \
| xargs -r -I{} cp "{}" /FILES/directory1/directory2/directory3/
-I{} is a placeholder for the output from the find command, which is passed on to cp.
The flags -r and -I represent the following, according to the xargs man page:
-r, --no-run-if-empty
If the standard input does not contain any nonblanks, do not run
the command. Normally, the command is run once even if there is
no input. This option is a GNU extension.
-I replace-str
Replace occurrences of replace-str in the initial-arguments with
names read from standard input.
You may add -exec false {} + so you get a false exit status when something is found (which makes the logic a bit upside-down, though):
if find . -name foo -exec echo ok ';' -exec false {} +
then
echo 'not found'
exit
fi
echo found
See the similar question on Stack Exchange, How to detect whether "find" found any matches?, in particular the answer which suggests the false trick.
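Adapting that trick to the command from the question might look roughly like this (a sketch; the copy is switched to -exec ... \; so that the destination directory can follow {}):
if find . -maxdepth 1 \! -type d -name "some_file_name_*" -name "*.txt" -name "*_${day_month}_*" \
-exec cp {} /FILES/directory1/directory2/directory3/ \; -exec false {} +
then
echo 'no matching files found' >&2
exit 1
fi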

Unix find -exec: Why do the following behaviors differ?

The following works as intended:
$ find . -name .git -exec dirname '{}' \;
./google/guava
./JetBrains/intellij-community
./zsh-users/zsh-syntax-highlighting
But the following only returns dots:
$ find . -name .git -exec echo "$(dirname '{}')" \;
.
.
.
Why is that, and how can I use $(dirname '{}') in a find -exec command?
Please note, I am asking about UNIX find (OS X and FreeBSD, specifically), not GNU.
Reason for behaviour difference
Your shell evaluates $(dirname '{}') before find runs, so the command actually being executed is:
find . -name .git -exec echo . ;
Other ways to do this
You can of course use shell expansion inside find, by calling another shell yourself (or better, by calling a separate script that uses the shell you want in its shebang).
In other words:
find . -name .git -exec sh -c 'dirname {}' \;
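The separate-script variant mentioned above could look something like this (print-parent.sh is a hypothetical helper name):
#!/bin/sh
# print-parent.sh: print the parent directory of each argument
for f; do
dirname "$f"
done
which you would then invoke from find:
find . -name .git -exec ./print-parent.sh {} +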
Solution without dirname (POSIX, faster, one less subprocess to call):
find . -name .git -exec sh -c 'path={}; echo "${path%/*}" ' \;
Combining /u/tripleee's answer (upvote him, not me) with the find optimization:
find . -name .git -exec sh -c 'for f; do echo "${f%/*}"; done' _ {} \+
If you're using GNU find then you can ditch dirname and use -printf:
find . -name .git -printf "%h\n"
The general answer is to run -exec sh (or -exec bash if you need Bash features).
find . -name .git -exec sh -c 'for f; do dirname "$f"; done' _ {} \+
dirname can easily be replaced with something simpler, but in the general case, this is a useful pattern for when you want a shell to process the results from find.
-exec ... \+ is running the shell on as many matches as possible, instead of executing a separate shell for each match. This optimization is not available in all versions of find.
If you have completely regular file names (no newlines in the results, etc) you might be better off with something like
find . -name .git | sed 's%/[^/]*$%%'
However, assuming that file names will be regular is a huge recurring source of bugs. Don't do that for a general-purpose tool.

Piping find to find

I want to pipe a find result to a new find. What I have is:
find . -iname "2010-06*" -maxdepth 1 -type d | xargs -0 find '{}' -iname "*.jpg"
Expected result: Second find receives a list of folders starting with 2010-06, second find returns a list of jpg's contained within those folders.
Actual result: "find: ./2010-06 New York\n: unknown option"
Oh darn. I have a feeling it concerns the format of the output that the second find receives as input, but my only idea was to suffix -print0 to first find, with no change whatsoever.
Any ideas?
You need two things: -print0, and more importantly -I{} on xargs; otherwise the {} doesn't do anything.
find . -iname "2010-06*" -maxdepth 1 -type d -print0 | xargs -0 -I{} find '{}' -iname '*.jpg'
Useless use of xargs.
find 2010-06* -iname "*.jpg"
At least GNU find accepts multiple paths to search in. -maxdepth and -type d are implicitly covered here.
How about
find . -iwholename "./2010-06*/*.jpg"
etc?
Although you did say that you specifically want this find + pipe approach to work, it's inefficient to fork an extra find command. Since you are specifying -maxdepth as 1, you are not traversing subdirectories. So just use a for loop with shell expansion:
for file in *2010-06*/*.jpg
do
echo "$file"
done
If you want to find all jpg files inside each of the 2010-06* folders recursively, there is also no need to use multiple finds or xargs:
for directory in 2010-06*/
do
find "$directory" -iname "*.jpg" -type f
done
Or just
find 2010-06* -type f -iname "*.jpg"
Or even better, if you have bash 4 and above
shopt -s globstar
shopt -s nullglob
for file in 2010-06*/**/*.jpg
do
echo "$file"
done
