Create an if statement on the Bash command line (not a script)

I am trying to create an if statement on the bash command line: not a script, but a statement that can be typed on a single line with no newlines, because I have to run it through a Groovy command-line call.
if [cat $(find ./ -name userId.txt) == "517980"]; then cat $(find ./ -name userId.txt); fi
The Jenkins groovy script looks like this
node("puppet-$ENVIRONMENT") {
sh "/opt/puppet/bin/puppet module uninstall ${module} || echo 'NOT INSTALLED!'"
sh "pwd"
sh "rm -rf *"
//unarchive the tar in the remote file system and install it
unarchive mapping: ['*.*': './']
sh 'if [cat $(find ./ -name userId.txt) == "517980"]; then echo "it works"; fi'
sh "ls -alrt"
sh '/opt/puppet/bin/puppet module install --force $(find ./ -name *.tar.gz)'
}

file=$(find ./ -name userId.txt) && [ -n "$file" ] && { contents=$(cat "$file"); [ "$contents" == "517980" ] && echo "$contents"; }
You don't need to run find or cat more than once.
[ is actually a command, not mere syntax: it needs a space to separate it from its arguments.
You're also missing $(...) around your cat calls.
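Putting those three fixes together, a corrected one-liner could look like this (a sketch using a throwaway /tmp path and file purely for illustration):

```shell
# Demo setup (hypothetical path): a userId.txt containing the expected id
mkdir -p /tmp/demo_userid && echo "517980" > /tmp/demo_userid/userId.txt

# [ needs spaces around it, and the file contents need $(cat ...)
file=$(find /tmp/demo_userid -name userId.txt)
if [ "$(cat "$file")" = "517980" ]; then echo "it works"; fi
```

Quoting both the command substitution and the literal keeps the test working even if the file contents contain whitespace.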

Bash: how to error out when a subshell errors

I have a script with the following command to upload a bunch of zip files to a site:
find . -name "*.zip" -exec echo {} \; -exec sh -c 'err=$(curl -s --data-binary "#{}" http://mystorage.com | jq -r ".error"); if [ -z $err ] || [ $err = "file already exists" ]; then exit 0; else exit 1; fi' \;
The intention is that if any file fails to upload for a reason other than "file already exists", then the script must fail. However, if I run this command alone, it never exits with 1. My guess is that the subshell opened in the 2nd -exec returns 1, but -exec ignores the return status and returns 0 for the whole find command. Is there a way to make my command fail when the subshell fails?
I wouldn't bother with find for this. Just use an ordinary loop (with the globstar option to search recursively, if necessary).
shopt -s globstar nullglob
for f in **/*.zip; do
err=$(curl -s --data-binary "#$f" http://mystorage.com | jq -r ".error")
if [ -n "$err" ] && [ "$err" != "file already exists" ]; then
exit 1
fi
done
Note that you don't want to exit 0 when the first job succeeds; just do nothing and let the next file be uploaded.
You need to look at the four exec forms find can use:
https://www.man7.org/linux/man-pages/man1/find.1.html
-exec command {} \;
-exec command {} \+
-execdir command {} \;
-execdir command {} \+
They all have different behaviors regarding their boolean value in the search boolean expression.
My brief interpretation (I have only read the manpages, I have not actually tried this):
Commands that end with a semicolon give the -exec term a true or false value in find's expression, and do not affect the exit value of find even when the wrapped command has a nonzero exit status.
Commands that end with a plus cause find to exit with a nonzero value when a wrapped command has a nonzero result value.
So I think you want to switch to the -exec form where the command ends with a '+'.
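A quick way to see the difference (a sketch assuming GNU find; false is a command that always exits 1, and the /tmp path is hypothetical):

```shell
mkdir -p /tmp/findexit && touch /tmp/findexit/a.zip

# Semicolon form: the failing command only makes the -exec term false,
# so find itself still exits 0
find /tmp/findexit -name '*.zip' -exec false {} \; ; echo "semicolon form: $?"

# Plus form: a failing command makes find itself exit nonzero
find /tmp/findexit -name '*.zip' -exec false {} + ; echo "plus form: $?"
```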
But really I think you need a better multithreaded script that can handle each failure separately, and remember which ones it has successfully uploaded.
Using GNU find:
find . -name '*.zip' -print -exec sh -c '
err=$(curl -s --data-binary "#$1" http://mystorage.com | jq -r ".error")
[ -z "$err" ] || [ "$err" = "file already exists" ] || exit 1
' _ {} \; \
-o -quit
Note that this treats a failed curl (which leaves $err empty) as success. That may be what you want, but maybe you want this (or similar):
find . -name '*.zip' -print -exec sh -c '
response=$(curl -s --data-binary "#$1" http://mystorage.com) || exit 1
[ "$response" ] || exit 1
[ "$(echo "$response" | jq -r .error)" = "file already exists" ] && exit 1
' _ {} \; \
-o -quit
Now it exits prematurely if curl fails, if the response is empty, or if .error matches.
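The chained || tests above read as "succeed unless err is non-empty and different from the tolerated message". A tiny sketch with hypothetical error strings, using a helper function in place of the curl pipeline:

```shell
# check returns 0 for no error or for the tolerated "file already exists"
check() {
  err=$1
  [ -z "$err" ] || [ "$err" = "file already exists" ]
}

check "" && echo "empty: ok"
check "file already exists" && echo "duplicate: ok"
check "quota exceeded" || echo "other error: fail"
```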

Bash conditional with shell variable

I want to check if a file exists with [ -f "$1" ] but it's not working. The command is working with plain text like [ -f "filename.xml" ].
I echoed $1, which is, for example, filename.xml. Any ideas?
sourcePath=/SPECIFICPATH/${1};
echo $sourcePath;
echo $1;
find /EXAMPLEPATH -name pages -type d -execdir bash -c 'cd pages && [ -f "$1" ] && pwd && cp $sourcePath .' \;
I'm working in automator using a shell script block.
You’re invoking an entirely new shell with bash -c …, so you need to pass $1 along. Same with $sourcePath, if it’s not exported.
find /EXAMPLEPATH -name pages -type d -execdir bash -c 'cd pages && [ -f "$1" ] && pwd && cp "$2" .' bash "$1" "$sourcePath" \;
(In bash -c … bash "$1" "$sourcePath", the second bash is $0.)
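The argument layout for bash -c can be seen with a hypothetical echo (the literal strings here are only placeholders):

```shell
# After the script string, the first argument becomes $0, the rest $1, $2, ...
bash -c 'echo "zero=$0 one=$1 two=$2"' bash "first" "second"
# prints: zero=bash one=first two=second
```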
There's no need for the subshell if you move some of the logic to the find command.
find /EXAMPLEPATH -wholename "*/pages/$1" -print -execdir cp "$sourcePath" . \;
-wholename matches a file named $1 in a pages directory.
-print replaces pwd.
Now you can call cp directly.
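For example, with a hypothetical pages directory and target file:

```shell
mkdir -p /tmp/wdemo/pages && touch /tmp/wdemo/pages/filename.xml

# -wholename matches against the full path, so this finds only files
# named filename.xml that sit inside a pages directory
find /tmp/wdemo -wholename "*/pages/filename.xml" -print
```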

Execute command in all immediate subdirectories

I'm trying to add a shell function (zsh) mexec to execute the same command in all immediate subdirectories e.g. with the following structure
~
-- folder1
-- folder2
mexec pwd would show for example
/home/me/folder1
/home/me/folder2
I'm using find to pull the immediate subdirectories. The problem is getting the passed-in command to execute. Here's my first function definition:
mexec() {
find . -mindepth 1 -maxdepth 1 -type d | xargs -I'{}' \
/bin/zsh -c "cd {} && $#;";
}
This only executes the command itself but doesn't pass in the arguments, i.e. mexec ls -al behaves exactly like ls.
Changing the second line to /bin/zsh -c "(cd {} && $#);", mexec works for just mexec ls but shows this error for mexec ls -al:
zsh:1: parse error near `ls'
Going the exec route with find
find . -mindepth 1 -maxdepth 1 -type d -exec /bin/zsh -c "(cd {} && $#)" \;
Gives me the same thing which leads me to believe there's a problem with how I'm passing the arguments to zsh. This also seems to be a problem if I use bash: the error shown is:
-a);: -c: line 1: syntax error: unexpected end of file
What would be a good way to achieve this?
Try this simple loop, which visits all sub-directories one level deep and executes the command in each:
for d in ./*/ ; do (cd "$d" && ls -al); done
(cmd1 && cmd2) opens a sub-shell to run the commands. Since it is a child shell, the parent shell (the shell from which you're running this command) retains its current folder and other environment variables.
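The effect is easy to verify with a hypothetical directory (the /tmp paths are placeholders):

```shell
mkdir -p /tmp/subdemo
cd /tmp
(cd /tmp/subdemo && pwd)   # the subshell changes directory and prints it
pwd                        # the parent shell is still in /tmp
```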
Wrap it around in a function in a proper zsh script as
#!/bin/zsh
function runCommand() {
for d in ./*/ ; do /bin/zsh -c "cd '$d' && $*"; done
}
runCommand "ls -al"
should work just fine for you.
#!/bin/zsh
# A simple script with a function...
mexec()
{
export THE_COMMAND="$*"
find . -type d -maxdepth 1 -mindepth 1 -print0 | xargs -0 -I{} zsh -c 'cd "{}" && echo "{}" && eval "$THE_COMMAND" && echo'
}
mexec ls -al
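A version that forwards the arguments safely with "$@" might look like the sketch below (shown in bash; zsh handles "$@" the same way, and the /tmp layout mirrors the question's folder1/folder2 example):

```shell
mexec() {
  local d
  for d in ./*/; do
    ( cd "$d" && "$@" )   # subshell keeps the caller's working directory
  done
}

# hypothetical layout mirroring the question
mkdir -p /tmp/mexdemo/folder1 /tmp/mexdemo/folder2
cd /tmp/mexdemo && mexec pwd
```

Because "$@" preserves each argument as a separate word, mexec ls -al passes both ls and -al through intact.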
Using https://github.com/sharkdp/fd (though you could just as well use plain old find instead of fdfind):
function inDirs() { fdfind --type d --max-depth 1 --exec bash -c "x={} && echo && echo \$x && echo \${x//?/=} && cd {} && echo '-> '$* && $*" ; }

gbash 'git rm' multiple files that are found by a 'find' command

I want to 'git rm' a bunch of files that are found by a 'find' command. The files should have a certain suffix. I got this:
TEST_PATH='/usr/src'
function main() {
for i in "$#"
do
echo "current i = ${i}"
COMMAND='find $TEST_PATH -maxdepth 20 -name '*_${i}.txt' -exec git rm {} \;'
# COMMAND="$(find $TEST_PATH -maxdepth 20 name '*_${i}.txt' -print0 | xargs -0 -I{} cp {} .)"
# COMMAND="find $TEST_PATH -maxdepth 20 -name '*_${i}.txt' -exec cp {} . \;"
# COMMAND="find . '*.BUILD' | while read file; do echo "$file"; done \;"
done
echo "Running Command: $COMMAND"
$COMMAND
}
gbash::main "$#"
Running it will throw an error like this:
$ sh abc.sh 123
current i = 123
Running Command: find ../../src/python/servers/innertube/tests/ -maxdepth 20 -name "*_9421870.txt" -exec rm {}\;
find: missing argument to `-exec'
I've read and tried all the solutions on Stack Overflow (see the commented-out code) but none works...
Update
The problem is that you should eval the contents of the variable containing the command:
eval $COMMAND
From man eval:
The eval utility shall construct a command by concatenating arguments together, separating each with a <space> character. The constructed command shall be read and executed by the shell.
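The difference matters because plain expansion only word-splits the variable, while eval re-parses it as shell input (hypothetical command string):

```shell
COMMAND='echo "a b"'

$COMMAND          # word-splits only: the quotes stay literal in the output
eval "$COMMAND"   # re-parsed by the shell: the quotes group "a b"
```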
Original answer
Replace {}\; with {} \; or {} +.
Read the man page for find. The action used in your command is documented as:
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ; is encountered. The string {} is replaced by the current file name being processed everywhere it occurs in the arguments to the command...
So the command failed because the {}\; sequence is interpreted as command.
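With the space in place, find parses {} and \; as separate arguments (a sketch with a hypothetical /tmp directory and file name):

```shell
mkdir -p /tmp/execdemo && touch /tmp/execdemo/sample_123.txt

# {} and \; must each be their own argument to find
find /tmp/execdemo -maxdepth 20 -name '*_123.txt' -exec echo found {} \;
```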

find statement in cygwin bash script

for i in `find . -type f -name "VF-Outlet*.edi" -exec basename \{} \;` ; do
if [ -n "${i}" ];
then
echo file "VF-Outlet found";
sed -e 's/\*UK\*00/\*UP\*/g;s/XQ.*$/XQ\*H\*20150104\*20150110/g' $i > ${i}_fix
else
echo file "VF-Outlet" not found;
fi
done
The above code works if the file is found; the echo statement prints "file found".
If the file is not found, however, nothing prints. I tried all the various tests for empty strings and unset variables; nothing works.
Also if I try:
i=`find . -type f -name "VF-Outlet*.edi" -exec basename \{} \;`;
Then do the test:
if [ -n "${i}" ];
then
echo file ${i} found;
else
echo file "VF-Outlet" not found;
fi
done
It works correctly if the file is found or not.
Need help in figuring this out. I need the for loop to test multiple files.
The reason it is not working is that for does not iterate at all when its list is empty.
For example:
for i in $(echo); do echo hi; done
The above command won't print anything, because $(echo) expands to nothing, so no value is ever assigned to $i and the loop body never runs.
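Comparing an empty list with a non-empty one makes the behavior concrete (placeholder values):

```shell
count=0
for i in $(echo); do count=$((count+1)); done   # empty list: body never runs
echo "iterations over empty list: $count"

count=0
for i in $(echo a b); do count=$((count+1)); done
echo "iterations over 'a b': $count"
```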
In the case mentioned here, if we check the script in debug mode, we can see that the find command produces no output for the loop to iterate over.
# sh -x script.sh
+ find . -type f -name VF-Outlet*.edi -exec basename {} ;
here, script.sh file contains the script you have provided.
If there is a file present in the directory, the script will successfully execute.
# sh -x tet
+ find . -type f -name VF-Outlet*.edi -exec basename {} ;
+ [ -n VF-Outlet1.edi ]
+ echo file VF-Outlet found
file VF-Outlet found
As @shellter mentioned, this is not how I would have done it. You can use -f instead of -n to check whether a file exists.
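A minimal sketch of the -f test, with hypothetical file names echoing the question's:

```shell
mkdir -p /tmp/edidemo && touch /tmp/edidemo/VF-Outlet1.edi

# -f is true only when the path exists and is a regular file
[ -f /tmp/edidemo/VF-Outlet1.edi ] && echo "file VF-Outlet found"
[ -f /tmp/edidemo/missing.edi ] || echo "file VF-Outlet not found"
```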
Hope this helps!
