I have a list of names in a file (say name.txt), with one name per line.
name.txt
babu
praveen
kamal
sneha
This name will be passed as the run-time argument $1 to my bash script.
Now I have to check whether the given name is in my list or not.
If it's not there, I will print a message saying the name is invalid and exit. Can you help me with this?
I have tried this with
if [[ "$1" != "babu" || "$1" != "praveen" || "$1" != "kamal" ... ]]; then
exit
fi
but this doesn't look good professionally.
Is there any other simple and decent way to achieve this?
I guess you could use grep:
if ! grep -qFxf name.txt <<<"$1"; then
exit
fi
-q is "quiet mode" - the exit status indicates a match
-F matches fixed strings, rather than regular expressions
-x is a full line match, so names that are substrings won't match
-f means that the list of patterns is contained within a file
The string contained within the first argument is passed to grep using a here string. This is a bash feature, so if you're using another shell, you could go with something like if ! echo "$1" | grep -qFxf name.txt instead (but the bash way is preferred as it saves invoking a subprocess).
If you want to ensure that any error output from grep is suppressed, you can add 2>/dev/null to the command, as suggested in the comments. This prevents any messages sent to standard error from being displayed.
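Putting it together, a minimal sketch of the whole check as it might sit at the top of the script (the exact messages and the assumption that name.txt is in the current directory are mine, not from the question):
#!/bin/bash
# Reject any $1 that is not an exact line of name.txt
if ! grep -qFxf name.txt <<<"$1" 2>/dev/null; then
    echo "invalid name: $1" >&2
    exit 1
fi
echo "$1 is in the list, carrying on..."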
I am trying to disable nodes from an Apache load balancer using a shell script. I found some code online, but I am not able to understand the piece of code written for disabling the nodes from the load balancer. Below is the code I am referring to:
disable() {
    balancer=$1
    worker=$2
    if [ -z "$balancer" ] || [ -z "$worker" ]; then
        echo "Usage: $0 [-s host] [-p port] [-m balancer-manager] disable balancer_name worker_route"
        echo " balancer_name : balancer/cluster name"
        echo " worker_route : worker route e.g.)"
        exit 1
    fi
    echo "Disabling $2 of $1..."
    nonce=`$CURL -s "http://${server}:${port}/${manager}" | grep nonce | grep "${balancer}" | sed "s/.*nonce=\(.*\)['\"].*/\1/" | tail -n 1`
    if [ -z "$nonce" ]; then
        echo "balancer_name ($balancer) not found"
        exit 1
    fi
Can you please help me understand the meaning of the above code, especially the nonce part?
Picking apart your command, you probably first need to understand how pipes work in Unix. Each command has standard input and standard output; the notation first | second is a pipeline where the standard output from first becomes the standard input for second; so instead of first printing anything, or second reading something from a file, we pass whatever first would print as the input for second.
nonce=... assigns ... to the variable nonce. The name suggests this is intended to be used as a cryptographic nonce.
`cmd` is an obsolescent synonym for $(cmd). This command substitution replaces (substitutes) the command cmd with its output. So the value of nonce will be whatever this subshell prints to standard output.
$CURL is probably going to run curl, but we can only guess. Usually you would make sure curl is on your PATH and simply use the literal command curl.
curl -s http://whatever fetches the contents of the URL and prints them to standard output. The -s option suppresses any status messages. The output gets piped to ...
grep nonce, which prints to standard output any line which contains a match of the regular expression nonce (which simply matches this text verbatim anywhere on a line) and suppresses all others; which then gets piped to ...
grep "${balancer}" which similarly prints only lines which match whatever regular expression the variable balancer contains (the braces are harmless but useless); which then gets piped to ...
sed "s/.*nonce=\(.*\)['\"].*/\1/" which picks out the stuff between the parentheses -- anything up to the last single or double quote after nonce=; which then gets piped to ...
tail -n 1 which discards all lines except the last one.
So in summary: pick out the last occurrence of nonce= from the content behind the remote URL and print what follows it, but only up to just before the last single or double quote.
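To make the sed step concrete, here is a tiny sketch you can run on its own; the sample line is invented for illustration and not taken from a real balancer-manager page:
echo 'href="/balancer-manager?b=mycluster&w=ajp://10.0.0.1:8009&nonce=31d6cfe0-16fc-4e75"' |
    sed "s/.*nonce=\(.*\)['\"].*/\1/"
# prints: 31d6cfe0-16fc-4e75
In the real pipeline, the two grep stages have already narrowed the input to lines mentioning both nonce and the balancer name before sed sees them.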
This is all rather clumsy and inefficient; any pipe involving multiple grep and sed commands should probably be refactored to a simple Awk script.
nonce=$(curl -s "http://${server}:${port}/${manager}" |
    awk -v b="$balancer" '/nonce/ && $0 ~ b {
        sub(/^.*nonce=/, ""); sub(/[\047\042][^\047\042]*$/, ""); n=$0 }
    END { print n }')
The sed command in particular looks slightly obscure; normally, we would expect the output we want to extract to be between quotes, but this extracts up to just before the last single or double quote. The command sed -n 's/.*\(stuff\).*/\1/p' would be the normal way to only print the lines from which we actually managed to extract stuff. But without seeing what the URL contains, we can only speculate about whether this is correct or not. Certainly the conventional syntax would have allowed the author to omit the first grep entirely.
Getting the last nonce is probably just a guardrail in case there happens to be more than one; I would assume we would normally expect only a single match.
grep, sed, and Awk all operate on regular expressions. If you are new to regex, perhaps visit the Stack Overflow regex tag info page and check out the list of learning resources near the end.
Going forward, probably try https://explainshell.com/ before asking for human assistance.
I have the following Unix shell script, which is used to list the files in a given directory. We only need to pass the file extension, and the script should list the matching file or files, or display a custom message.
My try:
Script:
#!/bin/sh
FileNameWithPath=`ls home\docs\customers\*.$1 | wc -w`
if [ $FileNameWithPath -gt 0 ]
then
    ls home\docs\customes\*.$1
else
    echo "Custom Message about failure(File not found)"
fi
Run:
$ ./Test.sh txt
Note: The above script works fine if I give a file extension that exists, but if I give a non-existent file extension it throws an error as well as the custom message. I just want to print the custom message, that's it.
You can do it with a single command:
ls home/docs/customers/*.$1 2> /dev/null || echo "Custom message about failure (File not found)"
The first command (the 'ls') tries to list the files. If it fails, it will print an error message (suppressed by '2> /dev/null') and return a non-zero exit code. Since the exit code is different from 0, the second part (the 'echo') will be executed.
If you want to keep your code, you can drop the ls error by redirecting stderr to /dev/null in this way:
FileNameWithPath=`ls home\docs\customers\*.$1 2>/dev/null | wc -w`
This doesn't require use of ls.
You can do this with globbing itself:
# turn on glob failure for no matches
shopt -s failglob
# list files or a custom error message
(echo home/docs/customers/*."$1") 2>/dev/null ||
echo "Custom Message about failure"
The error message you get comes from the line where you are assigning to FileNameWithPath. You can suppress it by redirecting stderr to /dev/null, i.e. with 2>/dev/null.
It is much better (and POSIX compliant) to use $() instead of the backtick operator, given that you started your script with #!/bin/sh rather than #!/bin/bash. You will then be portable across the modern Bourne shells.
Another big win for using $() is that they can be nested easily, whereas you have to escape the backtick when you nest it.
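For instance, a throwaway illustration of the difference (the commands are placeholders, unrelated to the script in the question):
# With $(), the inner substitution needs no extra escaping
script_dir=$(dirname "$(readlink -f "$0")")
# With backticks, the inner pair must be escaped, which quickly gets unreadable
script_dir=`dirname \`readlink -f "$0"\``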
As Andrea Carron points out in their answer, you can do the whole thing on one line using the || logical-or operator. This is a very common idiom.
On the off-chance that your MVCE refers to something more complex, I fixed it for you below.
#!/bin/sh
FileNameWithPath=$(ls home\docs\customers\*.$1 2>/dev/null | wc -w)
if [ $FileNameWithPath -gt 0 ]
then
    ls home\docs\customes\*.$1
else
    echo "Custom Message about failure(File not found)"
fi
Just add error redirection to the null device file in the second line of your script:
FileNameWithPath=`ls home\docs\customers\*.$1 2>/dev/null | wc -w`
I am trying to write a function in a bash script that gets lines from stdin and picks out the first line which is not contained in a file.
Here is my approach:
doubles=file.txt
firstnotdouble(){
    while read input_line; do
        found=0;
        cat $doubles |
        while read double_line; do
            if [ "$input_line" = "$double_line" ]
            then
                found=1;
                break
            fi
        done
        if [ $found -eq 0 ] # no double found, echo and break!
        then
            echo $input_line
            break
        fi
    done
}
After some debugging attempts I realized that when found is set to 1 in the first if block, it does not keep its value until the next if block. That's why it's not working. Why does the script act as if there were two found variables in different "scopes"?
The second question would be if the approach as a whole could be optimized.
As indicated in the comments, the issue is that the commands in a pipeline (that is, a series of commands separated by |) run in subshells, and each subshell gets its own copy of the shell's variables, so assignments made inside it are lost when it exits. You could have avoided the problem by avoiding the UUOC (useless use of cat), writing:
while read ...; do ... done < "$doubles"
instead of the pipeline.
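For reference, a sketch of the original function with only that change applied, so that the found set by the inner loop is still visible when it is tested afterwards:
firstnotdouble(){
    while read input_line; do
        found=0
        # Feeding the file to the loop via redirection keeps the loop in the
        # current shell, so the assignment to "found" survives past "done"
        while read double_line; do
            if [ "$input_line" = "$double_line" ]; then
                found=1
                break
            fi
        done < "$doubles"
        if [ $found -eq 0 ]; then  # no double found, echo and break!
            echo $input_line
            break
        fi
    done
}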
A (much) faster approach than re-reading the doubles file with a while read loop for every input line is to use grep:
# Specify the file to be scanned as the first argument
firstnotdouble() {
while IFS= read -r double_line; do
if ! grep -qxF "$double_line" "$1"; then
echo "$double_line"
return
fi
done
return 1
}
In the grep:
-q suppresses the printout and stops on the first match
-x pattern must match the entire line
-F pattern is a simple string instead of a regular expression.
In the read:
IFS= avoids spaces being trimmed
-r avoids backslashes being deleted
With GNU grep, you could use -xF -m1 (or even -xFm1 if you like being cryptic) instead of -qxF, and then leave out the echo. The grep extension -m N limits the number of matches found to N.
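Building on that GNU extension, here is a hedged sketch that pushes the whole job into a single grep call, so the shell loop disappears entirely; -v inverts the match, -f reads the doubles from the file passed as the first argument, and -m1 stops after the first stdin line with no exact match in it:
# Sketch only: prints the first stdin line not present verbatim in "$1",
# and returns 1 if every line was found, mirroring the loop version above
firstnotdouble() {
    grep -vxF -m1 -f "$1"
}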
So I am writing a script that curls a site I've written, which returns a string. I am using it to compare against a value that is saved in a file. I am having trouble getting my conditional statement to return true for matching values.
Here is a snippet of my bash code
var1=$(curl -s -w %{http_code}\\n www.somesite.com)
var2=$(cat somefile)
if [[ "$var1" = "$var2" ]]; then
echo "TRUE"
else
echo "FALSE"
fi
I have manually looked at both strings and they seem to be identical. I've run wc on them with all applicable options. I've copied and pasted both values into Notepad++ and did a find for the expected string, and it reported that both values matched.
Obviously, if I manually put the values in the condition it returns true, so I know it's not my conditional statement.
My guess is there is some type of hidden character at the end of the curl output ('\r' or '\n' or something I am unaware of), or maybe at the end of the file. Is this a known issue when trying to compare curl output and file content?
This may be a dumb question, but for the life of me I cannot seem to get these strings to match doing it dynamically instead of hardcoding the values in the condition.
Any suggestions would be great. Thanks
$ diff -q -b <(echo $'abc\ndef') <(echo $'abc\r\ndef') > /dev/null ; echo $?
0
$ diff -q -b <(echo $'abc\ndef') <(echo $'abc\r\nde') > /dev/null ; echo $?
1
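The diff -b runs above compare equal only when the difference is whitespace such as a trailing carriage return, which supports the hidden-character theory. Here is a hedged sketch of the usual fix on the script side, stripping any \r before comparing (the URL and file name are the placeholders from the question):
var1=$(curl -s -w '%{http_code}\n' www.somesite.com)
var1=${var1//$'\r'/}    # drop any carriage returns the server sent
var2=$(cat somefile)
var2=${var2//$'\r'/}    # and any that ended up in the file
if [[ "$var1" == "$var2" ]]; then
    echo "TRUE"
else
    echo "FALSE"
fi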
I'm currently writing a bash script which uploads video files to YouTube using GoogleCL.
As I'm doing this uploading stuff in a loop (because there can be multiple video files) I would like to check if each file had been uploaded successfully before I upload the next one.
The command google youtube post --access unlisted --category Tech $f (where $f represents the file) outputs a string which tells me whether the upload has been successful or not.
But I don't know how to get that "return string" into a variable so I can check for success.
That's what I have:
for f in ./*.ogv ./*.mov ./*.mp4
do
    if [[ '*' != ${f:2:1} ]]
    then
        echo "Uploading video file $f"
        # How to put the return value of the following command into a variable?
        google youtube post --access unlisted --category Tech $f > /dev/null
        # Now I assume that the output of the command above is available in the variable RETURNVALUE
        if [[ $RETURNVALUE == *uploaded* ]]
        then
            echo "Upload successful."
        else
            echo "Upload failed."
        fi
    fi
done
Can anybody help me?
My guess is that you could depend on the exit code from the google command as well (I'm assuming it returns a non-zero code if it fails to upload, but you should probably double-check this).
for f in ./*.ogv ./*.mov ./*.mp4; do
    if [[ '*' != ${f:2:1} ]]; then
        echo "Uploading video file $f"
        if google youtube post --access unlisted --category Tech "$f" > /dev/null
        then
            echo "Upload successful."
        else
            echo "Upload failed."
        fi
    fi
done
A common misconception is that if wants a bracketed expression to evaluate. That is not true: if always takes a command and checks its exit status; usually this command is [, which is another name for test, and test is what evaluates the expression. (And yes, I'd be surprised if bash didn't have an optimized shortcut to make this faster internally, but conceptually it's still true.)
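A trivial sketch of that point, with grep standing in for any command whose exit status we might care about:
# No brackets anywhere: "if" simply runs grep and tests its exit status
if grep -q root /etc/passwd
then
    echo "found a root entry"
else
    echo "no root entry"
fi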
Capturing output is done via backticks, like so
result=`command argument a b c`
or using $()
result=$(command argument a b c)
but it's probably better to use the error code in this case.
EDIT:
You have a funny if thing in your loop... I didn't notice it at first, but it can be avoided if you enable the nullglob shell option (this makes ./*.mov expand to nothing if there are no matching files). Also, quote that $f or it'll break if your file names contain spaces.
shopt -s nullglob
for f in ./*.ogv ./*.mov ./*.mp4; do
    echo "Uploading video file $f"
    if google youtube post --access unlisted --category Tech "$f" > /dev/null
    then
        echo "Upload successful."
    else
        echo "Upload failed."
    fi
done
HTH.
I would put it this way: the command ... outputs a string. 'Return' is a keyword, and a return value is a number, where 0 by convention means success (0 errors) and any other value indicates an error code.
You grab the output by:
result=$(google youtube post --access unlisted --category Tech $f)
but will often see the inferior solution:
result=`cmd param1 param2`
inferior, because backticks are easily confused with apostrophes (depending on the font) and hard to nest, so don't use them.
From 'man bash':
The return value of a simple command is its exit status, or 128+n if the command is terminated by signal n.
and:
return [n]
Causes a function to exit with the return value specified by n. If n is omitted, the return status is that of the last command executed in the function body. If used outside a function, but during execution of a script by the . (source) command, it causes the shell to stop executing that script and return either n or the exit status of the last command executed within the script as the exit status of the script. If used outside a function and not during execution of a script by ., the return status is false. Any command associated with the RETURN trap is executed before execution resumes after the function or script.
The return value/exit code of the last command is available in $?.
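For instance, a minimal sketch using the command from the question; only zero-versus-non-zero is checked, since the exact codes GoogleCL uses are not documented here:
google youtube post --access unlisted --category Tech "$f" > /dev/null
status=$?    # save it immediately; the next command overwrites $?
if [ "$status" -eq 0 ]; then
    echo "Upload command exited successfully."
else
    echo "Upload command exited with status $status."
fi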
The term for the mechanism you meant is command substitution. Again from 'man bash':
Command Substitution
Command substitution allows the output of a command to replace the command name. There are two forms:
$(command)
or
`command`
Bash performs the expansion by executing command and replacing the command substitution with the standard output of the command, with any trailing newlines deleted. Embedded newlines are not deleted, but they may be removed during word splitting. The command substitution $(cat file) can be replaced by the equivalent but faster $(< file).
When the old-style backquote form of substitution is used, backslash retains its literal meaning except when followed by $, `, or \. The first backquote not preceded by a backslash terminates the command substitution. When using the $(command) form, all characters between the parentheses make up the command; none are treated specially.
Command substitutions may be nested. To nest when using the backquoted form, escape the inner backquotes with backslashes.
If the substitution appears within double quotes, word splitting and pathname expansion are not performed on the results.
If you are still getting output after > /dev/null then it's coming out on stderr, so standard backticks or $() won't work.
First, see if the return code indicates a problem: Examine $? after success and failure.
If it turns out the return code isn't sufficent, you can redirect the output:
RETURNVALUE=$(google youtube post --access unlisted --category Tech $f 2>&1 >/dev/null)
2>&1 puts stderr on the stdout fd, before you redirect stdout to nothing.
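A tiny sketch of why that order matters, using ls on a nonexistent path just to produce some stderr output:
# stderr is duplicated onto the capture pipe first, then stdout is discarded
out=$(ls /nonexistent 2>&1 >/dev/null)
echo "$out"    # prints the ls error message

# reversed order: stdout goes to /dev/null first, and stderr then follows it there
out=$(ls /nonexistent >/dev/null 2>&1)
echo "$out"    # prints nothing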
Use $()
variable=$(google youtube post --access unlisted --category Tech $f )
You can assign it to a variable by using backticks:
CONTENT=`google youtube post --access unlisted --category Tech $f`
Example for access youtube
rm ~/temp/youtube_top_title1.txt
rm ~/temp/youtube_top1.txt
curl "http://www.youtube.com/"|\
grep -o 'watch[^"]*'|\
sort -n|\
uniq -c|\
awk '{if ($1!="1") print $2}'|\
while read i ; do\
page_youtube=`curl -s "http://www.youtube.com/${i}"`;\
echo `echo -e "$page_youtube" |awk '{if ($1 ~ "") start=1; if ($1 ~ "") start=0; if (start == 1) print $0 }'|grep -v ''` >> ~/temp/youtube_top_title1.txt
url_current=$(echo -e "$page_youtube"|\
grep -o "url=[^,]*,"|\
sed 's/url=//g;s/,//g'|\
php -r '$f=fopen("php://stdin","r");while (FALSE!==($line=fgets($f))) echo urldecode($line);'|\
awk '{print $1}'|\
sed 's/;$//g'|\
sed 's/\\u.*//g'|\
tail -2|\
head -1)
echo "$url_current" >> ~/temp/youtube_top1.txt
done
The answer to the question is to use the read command.
Sure all these other answers show ways to not do what the OP asked, but that really screws up the rest of us who searched for the OP's question.
Disclaimer: This is a duplicate answer
Here is how you do it...
Command:
# I would usually do this on one line, but for readability...
series | of | commands \
| \
(
    read string;
    mystic_command --opt "$string" /path/to/file
) \
| \
handle_mystified_file
Here is what it is doing and why it is important:
Let's pretend that the series | of | commands is a very complicated series of piped commands.
mystic_command could accept stdin as the file, but not as the option argument, therefore that value must come in as a variable**
read takes stdin and places it into the variable $string
Putting the read and the mystic_command into a "sub shell" via parentheses is not necessary, but makes it flow like a continuous pipe, as if the 2 commands were in a separate script file.
** There is always an alternative, and in this case the alternative is ugly and unreadable:
mystic_command --opt "$(series | of | commands)" /path/to/file | handle_mystified_file
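As a runnable toy version of the same shape (every command here is a stand-in, not from the original question):
echo "hello world" \
| \
(
    # capture the upstream output in a variable, as above
    read string
    printf 'captured: %s\n' "$string"   # stand-in for mystic_command --opt "$string" ...
) \
| \
tr 'a-z' 'A-Z'                          # stand-in for handle_mystified_file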