jenkins bash script - remove directory path from find command results

I need to search my dist directory for minified .js and .css files within Jenkins.
I have a bash script with a find command, like this:
for path in $(/usr/bin/find dist/renew-online -maxdepth 1 -name "*.js" -or -name "*.css" -type f); do
# URL of the JavaScript file on the web server
url=$linkTarget/$path
echo "url=$linkTarget/$path"
Where linkTarget is: http://uat.xxxx.com/renew-online.
I want to attach the minified files from dist/renew-online to the linkTarget,
for example:
http://uat.xxxx.com/renew-online/main-es2015.cf7da54187dc97781fff.js
BUT I keep getting: http://uat.xxxx.com/renew-online/dist/renew-online/main-es2015.cf7da54187dc97781fff.js
I've also tried -maxdepth 0 but can't get the correct URL - newbie at scripts!
Hopefully one of you guys can help, thanks for your time.

This can be achieved using the 'find' command alone:
/usr/bin/find dist/renew-online -maxdepth 1 \( -name "*.js" -o -name "*.css" \) -type f -printf "$linkTarget/%f\n"
The %f directive prints just the file's basename, with the leading directories removed. It is also recommended to isolate 'or' expressions inside parentheses, so that -type f applies to both -name patterns.

This is more a bash question than a Jenkins one, and you have multiple ways to do it.
If all your files are in a single path (which you are in fact forcing with -maxdepth 1), you can use cut:
for path in $(/usr/bin/find dist/renew-online -maxdepth 1 \( -name "*.js" -o -name "*.css" \) -type f | cut -d'/' -f3); do
On the other hand, as shown in https://serverfault.com/questions/354403/remove-path-from-find-command-output, you can strip the path with -printf '%f\n'.
Please note as well that using find in a for loop is fragile; it is recommended to use a while loop instead (https://github.com/koalaman/shellcheck/wiki/SC2044) - a sketch follows below.
EDIT
The field used in cut depends on the folders in your find path (dist/renew-online/file.js splits on '/' into fields 1-3, so the filename is field 3). The most accurate way is the one in the serverfault link above.
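A minimal sketch of that while-read form, assuming $linkTarget is set as in the question; -print0 together with read -d '' keeps filenames containing spaces intact:
while IFS= read -r -d '' path; do
    url="$linkTarget/${path##*/}"    # ${path##*/} strips the leading directories
    echo "url=$url"
done < <(/usr/bin/find dist/renew-online -maxdepth 1 \
    \( -name "*.js" -o -name "*.css" \) -type f -print0)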

Matching multiple file extensions with find

I was trying to get a list of all python and html files in a directory with the command find Documents -name "*.{py,html}".
Then along came the man page:
Braces within the pattern (‘{}’) are not considered to be special (that is, find . -name 'foo{1,2}' matches a file named foo{1,2}, not the files foo1 and foo2).
As this is part of a pipe-chain, I'd like to be able to specify which extensions it matches at runtime (no hardcoding). If find just can't do it, a perl one-liner (or similar) would be fine.
Edit: The answer I eventually came up with includes all sorts of crap, and is a bit long as well, so I posted it as an answer to the original itch I was trying to scratch. Feel free to hack that up if you have better solutions.
Use -o, which means "or":
find Documents \( -name "*.py" -o -name "*.html" \)
You'd need to build that command line programmatically, which isn't that easy.
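A minimal sketch of doing it programmatically, assuming the extensions arrive at runtime (the exts array below is a stand-in for however you receive them); a bash array keeps each word intact:
#!/bin/bash
exts=(py html)    # stand-in for runtime input
args=()
for ext in "${exts[@]}"; do
    # separate the clauses with -o, except before the first one
    [ ${#args[@]} -gt 0 ] && args+=(-o)
    args+=(-name "*.$ext")
done
find Documents \( "${args[@]}" \)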
Are you using bash (or Cygwin on Windows)? If so, you should be able to do this:
ls **/*.py **/*.html
which might be easier to build programmatically (note that ** only recurses into subdirectories if you first enable it with shopt -s globstar, available in bash 4+).
Some implementations of find, mostly on Linux systems and possibly on others as well, support the -regex and -regextype options, which match file names against a regex.
For example
find . -regextype posix-egrep -regex ".*\.(py|html)$"
should do the trick in the above example.
However, this is not a standard POSIX find feature and is implementation dependent.
You could programmatically add more -name clauses, separated by -or:
find Documents \( -name "*.py" -or -name "*.html" \)
Or, go for a simple loop instead:
for F in Documents/*.{py,html}; do ...something with each '$F'... ; done
This will find all .c or .cpp files on Linux:
$ find . -name "*.c" -o -name "*.cpp"
You don't need the escaped parentheses unless you add further tests. The implicit 'and' between -name "*.cpp" and -print binds more tightly than -o, so the -print applies only to the second -name. As the man page puts it, if the pattern matches, print it; here -print acts as an AND'd conditional and prevents any .c files from being printed:
$ find . -name "*.c" -o -name "*.cpp" -print
But if you group the tests as in the original answer, you control what -print applies to. This finds all the .c files as well:
$ find . \( -name "*.c" -o -name "*.cpp" \) -print
One last example, for all C/C++ source files:
$ find . \( -name "*.c" -o -name "*.cpp" -o -name "*.h" -o -name "*.hpp" \) -print
I had a similar need. This worked for me:
find ../../ \( -iname 'tmp' -o -iname 'vendor' \) -prune -o \( -iname '*.*rb' -o -iname '*.rjs' \) -print
My default has been:
find . -type f | egrep -i '\.(java|css|cs|sql)$'
Like the less process-intensive find invocations by Brendan Long and Stephan202 et al.:
find Documents \( -name "*.py" -or -name "*.html" \)
Parentheses \( \) are required when -name patterns are combined with an 'or':
find Documents -type f \( -name "*.py" -or -name "*.html" \)
whereas for -name patterns combined with the 'and' operator they are not required:
find Documents -type f ! -name "*.py" -and ! -name "*.html"
#! /bin/bash
filetypes=("*.py" "*.xml")
for type in "${filetypes[@]}"
do
    find Documents -name "$type"
done
simple but works :)
I needed to remove all files in child dirs except for some files. The following worked for me (three patterns specified; the patterns are quoted so the shell does not expand them first):
find . -depth -type f -not -name '*.itp' -and -not -name '*ane.gro' -and -not -name '*.top' -exec rm '{}' +
This works in the AIX Korn shell.
find *.cbl *.dms -prune -type f -mtime -1
This looks for *.cbl or *.dms files that are 1 day old, in the current directory only, skipping the sub-directories.
find MyDir -iname "*.[j][p][g]"
+
find MyDir -iname "*.[b][m][p]"
=
find MyDir -iname "*.[jb][pm][gp]"
What about
ls {*.py,*.html}
It lists all files whose names end in .py or .html; the braces are expanded by the shell before ls runs, so it is equivalent to ls *.py *.html.

Excluding multiple filetypes with find

I have a folder with 20k+ images, and most GUI file managers (like Dolphin) can't handle that amount of data.
So I decided to use bash instead. My problem is the following:
most of the files are *.IMG or *.LBL files
I am not interested in those files; I am looking for the others
with find . -type f -not -name "*.LBL" I am able to see all files except the *.LBL ones
with find . -type f -not -name "*.IMG" I am able to see all files except the *.IMG ones
Neither alone is very helpful, since the output still floods my terminal,
and combining both does not seem to work either:
find . -type f -not -name "*.LBL" -o -not -name "*.IMG"
What is the correct way to list the files inside a folder while excluding multiple file suffixes?
Group the conditions; -o -not does not do what you expect here: -not A -o -not B is true for every file that does not match both patterns at once, which is every file. Try this:
find . -type f -not \( -name "*.LBL" -o -name "*.IMG" \)
You can use bash's extended pattern matching (it may have to be turned on in a script with shopt -s extglob; it is often already enabled in interactive shells):
printf "%s\n" !(*.LBL|*.IMG)

Why is my `find` command giving me errors relating to ignored directories?

I have this find command:
find . -type f -not -path '**/.git/**' -not -path '**/node_modules/**' | xargs sed -i '' s/typescript-library-skeleton/xxx/g;
for some reason it's giving me these warnings/errors:
find: ./.git/objects/3c: No such file or directory
find: ./.git/objects/3f: No such file or directory
find: ./.git/objects/41: No such file or directory
I even tried using:
-not -path '**/.git/objects/**'
and got the same thing. Anybody know why the find is searching in the .git directory? Seems weird.
why is the find searching in the .git directory?
GNU find is clever and supports several optimizations over a naive implementation:
It can flip the order of -size +512b -name '*.txt' and check the name first, because querying the size will require a second syscall.
It can count the hard links of a directory to determine the number of subdirectories, and once it has seen them all, it no longer needs to check the remaining entries for -type d or recurse into them.
It can even rewrite (-B -or -C) -and -A so that, if the checks are equally costly and free of side effects, the -A is evaluated first, hoping to reject the file after one test instead of two.
However, it is not yet clever enough to realize that -not -path '*/.git/*' means that when it finds a directory named .git, it does not even need to recurse into it, because every file inside will fail to match.
Instead, it dutifully recurses, finds each file, and matches it against the pattern as if it were a black box.
To explicitly tell it to skip a directory entirely, you can instead use -prune. See How to exclude a directory in find . command
Both more efficient and more correct would be to avoid the default -print action, change -not -path ... to -prune, and ensure that xargs is only used with NUL-delimited input:
find . -name .git -prune -o \
-name node_modules -prune -o \
-type f -print0 | xargs -0 sed -i '' 's/typescript-library-skeleton/xxx/g'
Note the following points:
We use -prune to tell find to not even recurse down the undesired directories, rather than -not -path ... to tell it to discard names in those directories after they were found.
We put the -prunes before the -type f, so we're able to match directories for pruning.
We have an explicit action, not depending on the default -print. This is important because the default -print effectively adds a set of parentheses: if no explicit action is given, find ... behaves like find '(' ... ')' -print, not like find ... -print.
We use xargs only with the -0 argument enabling NUL-delimited input, and the -print0 action on the find side to generate a NUL-delimited list of names. NUL is the only character that cannot be present in an arbitrary file path (yes, newlines can be present), and thus the only character that is safe to use to separate paths. (If the -0 extension to xargs and the -print0 extension to find are not guaranteed to be available, use -exec sed -i '' ... {} + instead; see the sketch below.)
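For reference, the portable -exec form mentioned in the last point would look like this (a sketch; -i '' is the BSD/macOS sed spelling, GNU sed takes a bare -i):
find . -name .git -prune -o \
    -name node_modules -prune -o \
    -type f -exec sed -i '' 's/typescript-library-skeleton/xxx/g' '{}' +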

find with nested command reading blacklist

I have a script that recursively searches all directories for specific files or specific file endings.
For these files, I want to save the paths in a descriptor file.
The paths look, for example, like this:
./org/apache/commons/.../file1.pom
./org/apache/commons/.../file1.jar
./org/apache/commons/.../file1.zip
and so on.
In a blacklist file, I describe which file endings I want to ignore:
! -path "./.cache/*" ! -path "./org/*" ! -name "*.sha1" ! -name "*.lastUpdated"
and so on.
Now I want to read this blacklist file during the search, to ignore the described files:
find . -type f $(cat blacklist) > artifact.descriptor
Unfortunately, the blacklist is not applied during the search.
When I echo the command:
echo "find . -type f $(cat blacklist) > artifact.descriptor"
the result is as expected:
find . -type f ! -path "./.cache/*" ! -path "./org/*" ! -name "*.sha1" ! -name "*.lastUpdated" > artifact.descriptor
But it does not exclude the described files.
I tried the following command and it works, but I want to know why it does not work with find alone.
find . -type f | grep -vf $blacklist > artifact.descriptor
Hopefully someone can explain it to me :)
Thanks a lot.
As tripleee suggests, it is generally considered bad practice to store a command in a variable, because shell parsing does not handle all the corner cases you might expect.
However, you can use eval as a workaround:
/tmp/test$ ls
blacklist test.a test.b test.c
/tmp/test$ cat blacklist
-not -name *.c -not -name *.b
/tmp/test$ eval "find . -type f "`cat blacklist`
./test.a
./blacklist
In your case I think it fails because the quotes in your blacklist file are passed to find as literal characters rather than being interpreted by the shell, so find looks for names that literally contain quote marks. I think it works if you remove them, but it is probably still not safe for other reasons:
! -path ./.cache/* ! -path ./org/* ! -name *.sha1 ! -name *.lastUpdated
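A safer alternative (a sketch, assuming you can reformat the blacklist to one token per line with no quotes) is to read the predicates into a bash array; quoted array expansion then hands every token to find unchanged, without eval:
# blacklist file, one token per line, unquoted:
#   !
#   -path
#   ./.cache/*
readarray -t blacklist < blacklist
find . -type f "${blacklist[@]}" > artifact.descriptor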

Piping find to find

I want to pipe a find result to a new find. What I have is:
find . -iname "2010-06*" -maxdepth 1 -type d | xargs -0 find '{}' -iname "*.jpg"
Expected result: Second find receives a list of folders starting with 2010-06, second find returns a list of jpg's contained within those folders.
Actual result: "find: ./2010-06 New York\n: unknown option"
Oh darn. I have a feeling it concerns the format of the output that the second find receives as input, but my only idea was to suffix -print0 to the first find, with no change whatsoever.
Any ideas?
You need two things: -print0, and more importantly -I{} on xargs; otherwise the {} does nothing.
find . -iname "2010-06*" -maxdepth 1 -type d -print0 | xargs -0 -I{} find '{}' -iname '*.jpg'
Useless use of xargs.
find 2010-06* -iname "*.jpg"
At least GNU find accepts multiple starting paths to search in. Since the shell glob 2010-06* already selects the matching top-level directories, -maxdepth 1 and -type d are implied and no longer needed.
How about
find . -iwholename "./2010-06*/*.jpg"
etc?
Although you did say that you specifically want this find + pipe problem to work, it's inefficient to fork an extra find command. Since you are specifying -maxdepth 1, you are not traversing subdirectories, so just use a for loop with shell expansion.
for file in 2010-06*/*.jpg
do
echo "$file"
done
If you want to find all jpg files inside each 2010-06* folder recursively, there is also no need to use multiple finds or xargs:
for directory in 2010-06*/
do
find "$directory" -iname "*.jpg" -type f
done
Or just
find 2010-06* -type f -iname "*.jpg"
Or even better, if you have bash 4 or above:
shopt -s globstar
shopt -s nullglob
for file in 2010-06*/**/*.jpg
do
echo "$file"
done
